Oct 08 20:44:38 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 08 20:44:38 crc restorecon[4668]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 20:44:38 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 08 20:44:39 crc restorecon[4668]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc 
restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 20:44:39 crc 
restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 08 
20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to
system_u:object_r:container_file_t:s0:c14,c22 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 
20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 08 20:44:39 crc 
restorecon[4668]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc 
restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 20:44:39 crc restorecon[4668]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc 
restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 
crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc 
restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 20:44:39 crc restorecon[4668]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc 
restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc 
restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc 
restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 20:44:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 20:44:39 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 20:44:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 20:44:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 20:44:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 20:44:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 20:44:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 20:44:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 20:44:40 crc 
restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 20:44:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 20:44:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 20:44:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 20:44:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 20:44:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 20:44:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 20:44:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 20:44:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 20:44:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 08 20:44:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 20:44:40 crc restorecon[4668]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 08 20:44:40 crc restorecon[4668]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 08 20:44:40 crc restorecon[4668]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 08 20:44:40 crc restorecon[4668]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 08 20:44:40 crc restorecon[4668]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 08 20:44:40 crc restorecon[4668]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 08 20:44:40 crc restorecon[4668]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Oct 08 20:44:40 crc restorecon[4668]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 08 20:44:40 crc restorecon[4668]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 08 20:44:40 crc restorecon[4668]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 08 20:44:40 crc restorecon[4668]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 08 20:44:41 crc kubenswrapper[4669]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 08 20:44:41 crc kubenswrapper[4669]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 08 20:44:41 crc kubenswrapper[4669]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 08 20:44:41 crc kubenswrapper[4669]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 08 20:44:41 crc kubenswrapper[4669]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 08 20:44:41 crc kubenswrapper[4669]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.055625 4669 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.060744 4669 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.060766 4669 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.060774 4669 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.060780 4669 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.060786 4669 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.060792 4669 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.060797 4669 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.060803 4669 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.060809 4669 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.060814 
4669 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.060819 4669 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.060824 4669 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.060830 4669 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.060835 4669 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.060841 4669 feature_gate.go:330] unrecognized feature gate: Example Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.060848 4669 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.060854 4669 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.060869 4669 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.060875 4669 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.060880 4669 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.060886 4669 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.060891 4669 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.060897 4669 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.060903 4669 feature_gate.go:330] unrecognized feature gate: 
ClusterAPIInstallIBMCloud Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.060908 4669 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.060913 4669 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.060918 4669 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.060924 4669 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.060929 4669 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.060934 4669 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.060940 4669 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.060946 4669 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.060952 4669 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.060958 4669 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.060965 4669 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.060971 4669 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.060977 4669 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.060983 4669 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 08 20:44:41 crc kubenswrapper[4669]: 
W1008 20:44:41.060988 4669 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.060994 4669 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.060999 4669 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.061004 4669 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.061009 4669 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.061015 4669 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.061020 4669 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.061027 4669 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.061036 4669 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.061044 4669 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.061051 4669 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.061057 4669 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.061063 4669 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.061068 4669 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.061074 4669 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.061079 4669 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.061085 4669 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.061090 4669 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.061095 4669 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.061104 4669 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.061111 4669 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.061117 4669 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.061123 4669 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.061129 4669 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.061134 4669 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.061139 4669 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.061144 4669 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.061150 4669 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.061155 4669 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.061160 4669 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.061165 4669 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.061170 4669 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.061175 4669 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.062273 4669 flags.go:64] FLAG: --address="0.0.0.0" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.062289 4669 flags.go:64] FLAG: 
--allowed-unsafe-sysctls="[]" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.062301 4669 flags.go:64] FLAG: --anonymous-auth="true" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.062321 4669 flags.go:64] FLAG: --application-metrics-count-limit="100" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.062331 4669 flags.go:64] FLAG: --authentication-token-webhook="false" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.062338 4669 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.062346 4669 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.062354 4669 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.062360 4669 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.062367 4669 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.062374 4669 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.062381 4669 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.062387 4669 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.062393 4669 flags.go:64] FLAG: --cgroup-root="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.062399 4669 flags.go:64] FLAG: --cgroups-per-qos="true" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.062405 4669 flags.go:64] FLAG: --client-ca-file="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.062411 4669 flags.go:64] FLAG: --cloud-config="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.062416 4669 flags.go:64] FLAG: --cloud-provider="" Oct 08 20:44:41 crc 
kubenswrapper[4669]: I1008 20:44:41.062423 4669 flags.go:64] FLAG: --cluster-dns="[]" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.062794 4669 flags.go:64] FLAG: --cluster-domain="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.062804 4669 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.062812 4669 flags.go:64] FLAG: --config-dir="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.062819 4669 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.062827 4669 flags.go:64] FLAG: --container-log-max-files="5" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.062837 4669 flags.go:64] FLAG: --container-log-max-size="10Mi" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.062845 4669 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.062853 4669 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.062861 4669 flags.go:64] FLAG: --containerd-namespace="k8s.io" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.062868 4669 flags.go:64] FLAG: --contention-profiling="false" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.062875 4669 flags.go:64] FLAG: --cpu-cfs-quota="true" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.062882 4669 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.062890 4669 flags.go:64] FLAG: --cpu-manager-policy="none" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.062899 4669 flags.go:64] FLAG: --cpu-manager-policy-options="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.062907 4669 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.062915 4669 flags.go:64] FLAG: 
--enable-controller-attach-detach="true" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.062922 4669 flags.go:64] FLAG: --enable-debugging-handlers="true" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.062929 4669 flags.go:64] FLAG: --enable-load-reader="false" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.062937 4669 flags.go:64] FLAG: --enable-server="true" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.062945 4669 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.062957 4669 flags.go:64] FLAG: --event-burst="100" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.062966 4669 flags.go:64] FLAG: --event-qps="50" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.062974 4669 flags.go:64] FLAG: --event-storage-age-limit="default=0" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.062981 4669 flags.go:64] FLAG: --event-storage-event-limit="default=0" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.062988 4669 flags.go:64] FLAG: --eviction-hard="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.062997 4669 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063004 4669 flags.go:64] FLAG: --eviction-minimum-reclaim="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063012 4669 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063020 4669 flags.go:64] FLAG: --eviction-soft="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063027 4669 flags.go:64] FLAG: --eviction-soft-grace-period="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063037 4669 flags.go:64] FLAG: --exit-on-lock-contention="false" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063044 4669 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063051 
4669 flags.go:64] FLAG: --experimental-mounter-path="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063058 4669 flags.go:64] FLAG: --fail-cgroupv1="false" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063065 4669 flags.go:64] FLAG: --fail-swap-on="true" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063072 4669 flags.go:64] FLAG: --feature-gates="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063081 4669 flags.go:64] FLAG: --file-check-frequency="20s" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063088 4669 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063095 4669 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063103 4669 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063110 4669 flags.go:64] FLAG: --healthz-port="10248" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063118 4669 flags.go:64] FLAG: --help="false" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063126 4669 flags.go:64] FLAG: --hostname-override="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063133 4669 flags.go:64] FLAG: --housekeeping-interval="10s" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063141 4669 flags.go:64] FLAG: --http-check-frequency="20s" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063148 4669 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063155 4669 flags.go:64] FLAG: --image-credential-provider-config="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063162 4669 flags.go:64] FLAG: --image-gc-high-threshold="85" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063169 4669 flags.go:64] FLAG: --image-gc-low-threshold="80" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063176 4669 flags.go:64] FLAG: 
--image-service-endpoint="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063183 4669 flags.go:64] FLAG: --kernel-memcg-notification="false" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063190 4669 flags.go:64] FLAG: --kube-api-burst="100" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063197 4669 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063205 4669 flags.go:64] FLAG: --kube-api-qps="50" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063212 4669 flags.go:64] FLAG: --kube-reserved="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063219 4669 flags.go:64] FLAG: --kube-reserved-cgroup="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063226 4669 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063235 4669 flags.go:64] FLAG: --kubelet-cgroups="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063242 4669 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063250 4669 flags.go:64] FLAG: --lock-file="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063257 4669 flags.go:64] FLAG: --log-cadvisor-usage="false" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063264 4669 flags.go:64] FLAG: --log-flush-frequency="5s" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063272 4669 flags.go:64] FLAG: --log-json-info-buffer-size="0" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063283 4669 flags.go:64] FLAG: --log-json-split-stream="false" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063291 4669 flags.go:64] FLAG: --log-text-info-buffer-size="0" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063298 4669 flags.go:64] FLAG: --log-text-split-stream="false" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063305 4669 flags.go:64] FLAG: 
--logging-format="text" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063312 4669 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063319 4669 flags.go:64] FLAG: --make-iptables-util-chains="true" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063327 4669 flags.go:64] FLAG: --manifest-url="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063333 4669 flags.go:64] FLAG: --manifest-url-header="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063342 4669 flags.go:64] FLAG: --max-housekeeping-interval="15s" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063350 4669 flags.go:64] FLAG: --max-open-files="1000000" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063359 4669 flags.go:64] FLAG: --max-pods="110" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063366 4669 flags.go:64] FLAG: --maximum-dead-containers="-1" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063375 4669 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063382 4669 flags.go:64] FLAG: --memory-manager-policy="None" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063390 4669 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063397 4669 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063404 4669 flags.go:64] FLAG: --node-ip="192.168.126.11" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063412 4669 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063429 4669 flags.go:64] FLAG: --node-status-max-images="50" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063436 4669 flags.go:64] FLAG: 
--node-status-update-frequency="10s" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063443 4669 flags.go:64] FLAG: --oom-score-adj="-999" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063450 4669 flags.go:64] FLAG: --pod-cidr="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063458 4669 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063469 4669 flags.go:64] FLAG: --pod-manifest-path="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063476 4669 flags.go:64] FLAG: --pod-max-pids="-1" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063484 4669 flags.go:64] FLAG: --pods-per-core="0" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063491 4669 flags.go:64] FLAG: --port="10250" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063499 4669 flags.go:64] FLAG: --protect-kernel-defaults="false" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063507 4669 flags.go:64] FLAG: --provider-id="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063514 4669 flags.go:64] FLAG: --qos-reserved="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063522 4669 flags.go:64] FLAG: --read-only-port="10255" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063555 4669 flags.go:64] FLAG: --register-node="true" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063564 4669 flags.go:64] FLAG: --register-schedulable="true" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063571 4669 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063584 4669 flags.go:64] FLAG: --registry-burst="10" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063591 4669 flags.go:64] FLAG: --registry-qps="5" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 
20:44:41.063598 4669 flags.go:64] FLAG: --reserved-cpus="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063606 4669 flags.go:64] FLAG: --reserved-memory="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063615 4669 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063622 4669 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063630 4669 flags.go:64] FLAG: --rotate-certificates="false" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063637 4669 flags.go:64] FLAG: --rotate-server-certificates="false" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063645 4669 flags.go:64] FLAG: --runonce="false" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063652 4669 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063659 4669 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063667 4669 flags.go:64] FLAG: --seccomp-default="false" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063674 4669 flags.go:64] FLAG: --serialize-image-pulls="true" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063681 4669 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063689 4669 flags.go:64] FLAG: --storage-driver-db="cadvisor" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063696 4669 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063704 4669 flags.go:64] FLAG: --storage-driver-password="root" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063711 4669 flags.go:64] FLAG: --storage-driver-secure="false" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063718 4669 flags.go:64] FLAG: --storage-driver-table="stats" Oct 08 20:44:41 crc 
kubenswrapper[4669]: I1008 20:44:41.063725 4669 flags.go:64] FLAG: --storage-driver-user="root" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063732 4669 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063739 4669 flags.go:64] FLAG: --sync-frequency="1m0s" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063747 4669 flags.go:64] FLAG: --system-cgroups="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063754 4669 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063766 4669 flags.go:64] FLAG: --system-reserved-cgroup="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063773 4669 flags.go:64] FLAG: --tls-cert-file="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063780 4669 flags.go:64] FLAG: --tls-cipher-suites="[]" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063789 4669 flags.go:64] FLAG: --tls-min-version="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063796 4669 flags.go:64] FLAG: --tls-private-key-file="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063804 4669 flags.go:64] FLAG: --topology-manager-policy="none" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063813 4669 flags.go:64] FLAG: --topology-manager-policy-options="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063820 4669 flags.go:64] FLAG: --topology-manager-scope="container" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063829 4669 flags.go:64] FLAG: --v="2" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063839 4669 flags.go:64] FLAG: --version="false" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063850 4669 flags.go:64] FLAG: --vmodule="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.063858 4669 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Oct 08 20:44:41 crc 
kubenswrapper[4669]: I1008 20:44:41.063866 4669 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064064 4669 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064075 4669 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064083 4669 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064090 4669 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064096 4669 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064103 4669 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064110 4669 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064116 4669 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064123 4669 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064130 4669 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064136 4669 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064143 4669 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064149 4669 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064156 4669 feature_gate.go:330] unrecognized feature gate: 
InsightsConfig Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064163 4669 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064171 4669 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064177 4669 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064184 4669 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064190 4669 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064197 4669 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064206 4669 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064215 4669 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064222 4669 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064232 4669 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064240 4669 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064248 4669 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064257 4669 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064266 4669 feature_gate.go:330] unrecognized feature gate: Example Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064275 4669 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064282 4669 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064289 4669 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064298 4669 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064305 4669 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064312 4669 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064319 4669 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064326 4669 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064332 4669 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064339 4669 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064345 4669 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064352 4669 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064358 4669 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064365 4669 
feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064372 4669 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064378 4669 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064385 4669 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064391 4669 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064398 4669 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064405 4669 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064411 4669 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064419 4669 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064425 4669 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064432 4669 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064438 4669 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064445 4669 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064451 4669 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064458 4669 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota 
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064465 4669 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064473 4669 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064479 4669 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064485 4669 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064492 4669 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064497 4669 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064504 4669 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064510 4669 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064516 4669 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064553 4669 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064561 4669 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064570 4669 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064576 4669 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064583 4669 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.064590 4669 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.064609 4669 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.076938 4669 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.076971 4669 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077052 4669 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077063 4669 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077069 4669 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077074 4669 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077080 4669 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077086 4669 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077091 4669 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077097 4669 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077102 4669 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077107 4669 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077112 4669 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077118 4669 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077123 4669 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077128 4669 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077133 4669 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077139 4669 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077144 4669 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077149 4669 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077154 4669 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077161 4669 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077170 4669 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077177 4669 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077183 4669 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077189 4669 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077195 4669 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077201 4669 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077208 4669 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077215 4669 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077221 4669 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077227 4669 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077233 4669 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077239 4669 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077245 4669 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077251 4669 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077257 4669 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077263 4669 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077269 4669 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077274 4669 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077279 4669 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077284 4669 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077290 4669 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077295 4669 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077301 4669 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077306 4669 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077311 4669 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077316 4669 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077321 4669 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077327 4669 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077333 4669 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077340 4669 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077345 4669 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077351 4669 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077357 4669 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077362 4669 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077367 4669 feature_gate.go:330] unrecognized feature gate: Example
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077372 4669 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077378 4669 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077383 4669 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077388 4669 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077393 4669 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077398 4669 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077403 4669 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077409 4669 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077414 4669 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077419 4669 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077424 4669 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077430 4669 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077435 4669 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077440 4669 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077446 4669 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077452 4669 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.077460 4669 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077634 4669 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077647 4669 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077654 4669 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077661 4669 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077668 4669 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077675 4669 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077681 4669 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077686 4669 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077698 4669 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077704 4669 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077709 4669 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077715 4669 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077721 4669 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077749 4669 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077756 4669 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077762 4669 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077769 4669 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077778 4669 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077784 4669 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077789 4669 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077795 4669 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077800 4669 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077805 4669 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077812 4669 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077819 4669 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077826 4669 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077832 4669 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077837 4669 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077843 4669 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077848 4669 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077854 4669 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077860 4669 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077865 4669 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077870 4669 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077876 4669 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077882 4669 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077887 4669 feature_gate.go:330] unrecognized feature gate: Example
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077892 4669 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077898 4669 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077903 4669 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077908 4669 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077913 4669 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077920 4669 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077926 4669 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077932 4669 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077937 4669 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077942 4669 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077947 4669 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077952 4669 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077958 4669 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077963 4669 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077969 4669 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077976 4669 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077981 4669 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077987 4669 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077992 4669 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.077997 4669 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.078002 4669 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.078007 4669 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.078012 4669 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.078017 4669 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.078023 4669 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.078028 4669 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.078034 4669 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.078040 4669 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.078047 4669 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.078053 4669 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.078058 4669 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.078064 4669 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.078070 4669 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.078081 4669 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.078090 4669 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.078955 4669 server.go:940] "Client rotation is on, will bootstrap in background"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.083376 4669 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.083470 4669 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.084859 4669 server.go:997] "Starting client certificate rotation"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.084885 4669 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.085104 4669 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-15 23:22:28.674328365 +0000 UTC
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.085251 4669 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 2378h37m47.589081583s for next certificate rotation
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.117492 4669 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.122361 4669 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.161929 4669 log.go:25] "Validated CRI v1 runtime API"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.203833 4669 log.go:25] "Validated CRI v1 image API"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.206742 4669 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.214664 4669 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-08-20-40-20-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.214720 4669 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.241049 4669 manager.go:217] Machine: {Timestamp:2025-10-08 20:44:41.238394684 +0000 UTC m=+0.931205427 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:527fa759-e25f-4fb3-8304-f30dbff0c847 BootID:cf950064-edbb-4bec-8a75-ab8d963fcdb3 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:59:c2:c2 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:59:c2:c2 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:de:02:01 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:a7:d7:1f Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:3e:1d:19 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:59:ad:9e Speed:-1 Mtu:1496} {Name:eth10 MacAddress:ce:1e:8f:06:0c:e0 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:be:f5:d6:12:46:5f Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.241501 4669 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.241733 4669 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.243573 4669 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.243893 4669 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.243951 4669 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.244955 4669 topology_manager.go:138] "Creating topology manager with none policy"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.244987 4669 container_manager_linux.go:303] "Creating device plugin manager"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.245554 4669 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.245585 4669 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.245941 4669 state_mem.go:36] "Initialized new in-memory state store"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.246082 4669 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.252370 4669 kubelet.go:418] "Attempting to sync node with API server"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.252467 4669 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.252507 4669 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.252570 4669 kubelet.go:324] "Adding apiserver pod source"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.252597 4669 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.258194 4669 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.259508 4669 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused
Oct 08 20:44:41 crc kubenswrapper[4669]: E1008 20:44:41.259677 4669 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.230:6443: connect: connection refused" logger="UnhandledError"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.259700 4669 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.259653 4669 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused
Oct 08 20:44:41 crc kubenswrapper[4669]: E1008 20:44:41.259778 4669 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.230:6443: connect: connection refused" logger="UnhandledError"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.261500 4669 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.263415 4669 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.263460 4669 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.263488 4669 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.263505 4669 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.263560 4669 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.263580 4669 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.263597 4669 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.263620 4669 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.263635 4669 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.263649 4669 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.263691 4669 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.263706 4669 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.263749 4669 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.264572 4669 server.go:1280] "Started kubelet"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.264572 4669 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.268918 4669 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.268968 4669 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.270512 4669 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Oct 08 20:44:41 crc systemd[1]: Started Kubernetes Kubelet.
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.276495 4669 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.276586 4669 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.276781 4669 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 23:23:07.419453145 +0000 UTC
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.276854 4669 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1490h38m26.142603759s for next certificate rotation
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.276921 4669 volume_manager.go:287] "The desired_state_of_world populator starts"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.276959 4669 volume_manager.go:289] "Starting Kubelet Volume Manager"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.278649 4669 factory.go:55] Registering systemd factory
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.278685 4669 factory.go:221] Registration of the systemd container factory successfully
Oct 08 20:44:41 crc kubenswrapper[4669]: E1008 20:44:41.280961 4669 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.281240 4669 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.281424 4669 server.go:460] "Adding debug handlers to kubelet server"
Oct 08 20:44:41 crc kubenswrapper[4669]: E1008 20:44:41.281598 4669 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.230:6443: connect: connection refused" interval="200ms"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.282109 4669 factory.go:153] Registering CRI-O factory
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.282144 4669 factory.go:221] Registration of the crio container factory successfully
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.282245 4669 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.282295 4669 factory.go:103] Registering Raw factory
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.282318 4669 manager.go:1196] Started watching for new ooms in manager
Oct 08 20:44:41 crc kubenswrapper[4669]: E1008 20:44:41.281078 4669 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.230:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186c9ee9c4fca78b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-08 20:44:41.264482187 +0000 UTC m=+0.957292890,LastTimestamp:2025-10-08 20:44:41.264482187 +0000 UTC m=+0.957292890,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.282933 4669 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused
Oct 08 20:44:41 crc kubenswrapper[4669]: E1008 20:44:41.283018 4669 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.230:6443: connect: connection refused" logger="UnhandledError"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.283680 4669 manager.go:319] Starting recovery of all containers
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.290856 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.290902 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.290914 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.290925 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.290936 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.290946 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.290957 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.290967 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.290979 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.290991 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291002 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291011 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291022 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291033 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291042 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291051 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291060 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291068 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291077 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291087 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291097 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291112 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291128 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291136 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291146 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291156 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291167 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291177 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291187 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291196 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291206 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291216 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291224 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291233 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291241 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291251 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291262 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291273 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291283 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291293 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291304 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291317 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291327 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291338 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291349 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291359 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291369 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291380 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291390 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291401 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291412 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291423 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291439 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291451 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291463 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291475 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291486 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291497 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291509 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291520 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291545 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291555 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291564 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291575 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291586 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291595 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291605 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291615 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291625 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291635 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291644 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291654 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291664 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291674 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291682 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291691 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291700 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291709 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]:
I1008 20:44:41.291719 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291731 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291740 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291750 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291760 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291769 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291780 4669 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291791 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291800 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291810 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291821 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291830 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291840 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291849 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291860 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291870 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291881 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291893 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291904 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291916 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291927 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291939 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291950 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291961 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291973 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Oct 08 
20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.291984 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292003 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292014 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292027 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292039 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292052 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292064 4669 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292075 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292088 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292103 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292114 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292125 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292136 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292146 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292158 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292169 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292181 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292212 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292226 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292239 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292251 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292269 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292282 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292295 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292311 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" 
seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292324 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292336 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292348 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292359 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292370 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292382 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 
20:44:41.292392 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292403 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292414 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292425 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292436 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292446 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292457 4669 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292467 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292478 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292489 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292500 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292511 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292522 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292632 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292644 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292655 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292666 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292677 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292688 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292698 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292709 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292721 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292734 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292744 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292754 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292766 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292779 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292790 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292801 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292812 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292823 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292834 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292847 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292857 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292869 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292878 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292911 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292923 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292935 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292947 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292959 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292972 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.292984 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.297094 4669 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.297185 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.297221 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.297239 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.297255 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.297276 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.297293 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.297315 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.297331 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.297347 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.297370 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.297386 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.297405 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.297442 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.297593 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.297696 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.297729 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.297777 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.297807 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.297849 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.297880 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.297913 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.297950 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.297980 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.298015 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.298045 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.298122 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.298208 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.298243 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.298270 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.298289 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.298305 4669 reconstruct.go:97] "Volume reconstruction finished"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.298316 4669 reconciler.go:26] "Reconciler: start to sync state"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.306812 4669 manager.go:324] Recovery completed
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.323675 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.326966 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.327015 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.326898 4669 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.327028 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.328462 4669 cpu_manager.go:225] "Starting CPU manager" policy="none"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.328502 4669 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.328547 4669 state_mem.go:36] "Initialized new in-memory state store"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.329397 4669 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.329457 4669 status_manager.go:217] "Starting to sync pod status with apiserver"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.329492 4669 kubelet.go:2335] "Starting kubelet main sync loop"
Oct 08 20:44:41 crc kubenswrapper[4669]: E1008 20:44:41.329680 4669 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.331429 4669 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused
Oct 08 20:44:41 crc kubenswrapper[4669]: E1008 20:44:41.331516 4669 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.230:6443: connect: connection refused" logger="UnhandledError"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.344938 4669 policy_none.go:49] "None policy: Start"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.346947 4669 memory_manager.go:170] "Starting memorymanager" policy="None"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.346996 4669 state_mem.go:35] "Initializing new in-memory state store"
Oct 08 20:44:41 crc kubenswrapper[4669]: E1008 20:44:41.381294 4669 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.398646 4669 manager.go:334] "Starting Device Plugin manager"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.399054 4669 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.399132 4669 server.go:79] "Starting device plugin registration server"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.399693 4669 eviction_manager.go:189] "Eviction manager: starting control loop"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.399788 4669 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.399964 4669 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.400161 4669 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.400183 4669 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Oct 08 20:44:41 crc kubenswrapper[4669]: E1008 20:44:41.410251 4669 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.430019 4669 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"]
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.430149 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.431511 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.431652 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.431685 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.432000 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.432380 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.432458 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.433560 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.433606 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.433623 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.433614 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.433667 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.433681 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.433910 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.434108 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.434168 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.435085 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.435118 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.435135 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.435151 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.435183 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.435192 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.435329 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.435634 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.435674 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.437365 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.437412 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.437435 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.437553 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.437582 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.437597 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.437833 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.438008 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.438099 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.438502 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.438585 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.438596 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.438966 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.439079 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.440290 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.440317 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.440328 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.440508 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.440543 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.440555 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 20:44:41 crc kubenswrapper[4669]: E1008 20:44:41.482344 4669 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.230:6443: connect: connection refused" interval="400ms"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.500992 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.501161 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.501234 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.501353 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.501446 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.501498 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.501591 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.501773 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.501852 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.501900 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.502000 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.502087 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.502125 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.502153 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.502184 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.502484 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.503330 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.503382 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.503397 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.503425 4669 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 08 20:44:41 crc kubenswrapper[4669]: E1008 20:44:41.503777 4669 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.230:6443: connect: connection refused" node="crc"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.604222 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.604401 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.604428 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.604459 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.604484 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.604492 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.604562 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.604512 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.604594 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.604654 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.604610 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.604502 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.604642 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName:
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.604628 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.604758 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.604803 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.604821 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.604838 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.604853 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.604882 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.604896 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.604911 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.604928 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.604974 4669 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.605009 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.605050 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.605087 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.605124 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.605162 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.605197 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.704493 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.705856 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.705907 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.705921 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.705986 4669 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 08 20:44:41 crc kubenswrapper[4669]: E1008 20:44:41.706405 4669 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.230:6443: connect: connection refused" node="crc" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.753850 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.759784 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.782246 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.798283 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 20:44:41 crc kubenswrapper[4669]: I1008 20:44:41.806818 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.820202 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-e268e5011f073d1101c0b4c459173ed69251f6aa440a96adb6e090e8490f547b WatchSource:0}: Error finding container e268e5011f073d1101c0b4c459173ed69251f6aa440a96adb6e090e8490f547b: Status 404 returned error can't find the container with id e268e5011f073d1101c0b4c459173ed69251f6aa440a96adb6e090e8490f547b Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.820592 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-8c865a67aaa5643287a60f0a7f85d8a089b47a98bb46e78e77e80b79f588b499 WatchSource:0}: Error finding container 8c865a67aaa5643287a60f0a7f85d8a089b47a98bb46e78e77e80b79f588b499: Status 404 returned error can't find the container with id 8c865a67aaa5643287a60f0a7f85d8a089b47a98bb46e78e77e80b79f588b499 Oct 08 20:44:41 crc kubenswrapper[4669]: W1008 20:44:41.825429 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-6dade29059d94d7bb4a0e1bd185b2453800f056175655dafa40f771e8942b3f4 WatchSource:0}: Error finding container 6dade29059d94d7bb4a0e1bd185b2453800f056175655dafa40f771e8942b3f4: Status 404 returned error can't find 
the container with id 6dade29059d94d7bb4a0e1bd185b2453800f056175655dafa40f771e8942b3f4 Oct 08 20:44:41 crc kubenswrapper[4669]: E1008 20:44:41.883209 4669 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.230:6443: connect: connection refused" interval="800ms" Oct 08 20:44:42 crc kubenswrapper[4669]: I1008 20:44:42.106852 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 20:44:42 crc kubenswrapper[4669]: I1008 20:44:42.108312 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:44:42 crc kubenswrapper[4669]: I1008 20:44:42.108342 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:44:42 crc kubenswrapper[4669]: I1008 20:44:42.108353 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:44:42 crc kubenswrapper[4669]: I1008 20:44:42.108387 4669 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 08 20:44:42 crc kubenswrapper[4669]: E1008 20:44:42.108800 4669 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.230:6443: connect: connection refused" node="crc" Oct 08 20:44:42 crc kubenswrapper[4669]: I1008 20:44:42.266348 4669 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Oct 08 20:44:42 crc kubenswrapper[4669]: W1008 20:44:42.275838 4669 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get 
"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Oct 08 20:44:42 crc kubenswrapper[4669]: E1008 20:44:42.275933 4669 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.230:6443: connect: connection refused" logger="UnhandledError" Oct 08 20:44:42 crc kubenswrapper[4669]: W1008 20:44:42.305265 4669 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Oct 08 20:44:42 crc kubenswrapper[4669]: E1008 20:44:42.305357 4669 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.230:6443: connect: connection refused" logger="UnhandledError" Oct 08 20:44:42 crc kubenswrapper[4669]: I1008 20:44:42.336241 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"3661974ea5083a10942af4dde5480fc4fdc670e38cb63069bb65213f7673ed5f"} Oct 08 20:44:42 crc kubenswrapper[4669]: I1008 20:44:42.337901 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6dade29059d94d7bb4a0e1bd185b2453800f056175655dafa40f771e8942b3f4"} Oct 08 20:44:42 
crc kubenswrapper[4669]: I1008 20:44:42.339483 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"df0a8ad6402fad5c93efe93c1a70734d296834de02710ae68b25ede2c280d1eb"} Oct 08 20:44:42 crc kubenswrapper[4669]: I1008 20:44:42.340722 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8c865a67aaa5643287a60f0a7f85d8a089b47a98bb46e78e77e80b79f588b499"} Oct 08 20:44:42 crc kubenswrapper[4669]: I1008 20:44:42.341622 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"e268e5011f073d1101c0b4c459173ed69251f6aa440a96adb6e090e8490f547b"} Oct 08 20:44:42 crc kubenswrapper[4669]: E1008 20:44:42.684447 4669 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.230:6443: connect: connection refused" interval="1.6s" Oct 08 20:44:42 crc kubenswrapper[4669]: W1008 20:44:42.756051 4669 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Oct 08 20:44:42 crc kubenswrapper[4669]: E1008 20:44:42.756146 4669 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.230:6443: connect: connection refused" logger="UnhandledError" Oct 08 20:44:42 crc 
kubenswrapper[4669]: W1008 20:44:42.804269 4669 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Oct 08 20:44:42 crc kubenswrapper[4669]: E1008 20:44:42.804382 4669 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.230:6443: connect: connection refused" logger="UnhandledError" Oct 08 20:44:42 crc kubenswrapper[4669]: I1008 20:44:42.909643 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 20:44:42 crc kubenswrapper[4669]: I1008 20:44:42.911130 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:44:42 crc kubenswrapper[4669]: I1008 20:44:42.911170 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:44:42 crc kubenswrapper[4669]: I1008 20:44:42.911184 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:44:42 crc kubenswrapper[4669]: I1008 20:44:42.911212 4669 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 08 20:44:42 crc kubenswrapper[4669]: E1008 20:44:42.911849 4669 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.230:6443: connect: connection refused" node="crc" Oct 08 20:44:43 crc kubenswrapper[4669]: I1008 20:44:43.265241 4669 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Oct 08 20:44:43 crc kubenswrapper[4669]: I1008 20:44:43.346806 4669 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14" exitCode=0 Oct 08 20:44:43 crc kubenswrapper[4669]: I1008 20:44:43.346919 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14"} Oct 08 20:44:43 crc kubenswrapper[4669]: I1008 20:44:43.347014 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 20:44:43 crc kubenswrapper[4669]: I1008 20:44:43.348215 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:44:43 crc kubenswrapper[4669]: I1008 20:44:43.348260 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:44:43 crc kubenswrapper[4669]: I1008 20:44:43.348272 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:44:43 crc kubenswrapper[4669]: I1008 20:44:43.349242 4669 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645" exitCode=0 Oct 08 20:44:43 crc kubenswrapper[4669]: I1008 20:44:43.349329 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 20:44:43 crc kubenswrapper[4669]: I1008 20:44:43.349365 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645"} Oct 08 20:44:43 crc kubenswrapper[4669]: I1008 20:44:43.350374 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 20:44:43 crc kubenswrapper[4669]: I1008 20:44:43.351164 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:44:43 crc kubenswrapper[4669]: I1008 20:44:43.351194 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:44:43 crc kubenswrapper[4669]: I1008 20:44:43.351174 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:44:43 crc kubenswrapper[4669]: I1008 20:44:43.351229 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:44:43 crc kubenswrapper[4669]: I1008 20:44:43.351246 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:44:43 crc kubenswrapper[4669]: I1008 20:44:43.351205 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:44:43 crc kubenswrapper[4669]: I1008 20:44:43.351845 4669 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="c68ec709284c53e463217c42296375acbe49c5cc9f3d9248f85bf59e1fe55f5d" exitCode=0 Oct 08 20:44:43 crc kubenswrapper[4669]: I1008 20:44:43.351922 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"c68ec709284c53e463217c42296375acbe49c5cc9f3d9248f85bf59e1fe55f5d"} Oct 08 20:44:43 crc kubenswrapper[4669]: I1008 20:44:43.351957 4669 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 20:44:43 crc kubenswrapper[4669]: I1008 20:44:43.353266 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:44:43 crc kubenswrapper[4669]: I1008 20:44:43.353290 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:44:43 crc kubenswrapper[4669]: I1008 20:44:43.353299 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:44:43 crc kubenswrapper[4669]: I1008 20:44:43.354902 4669 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="8abb50fc1491bed6db7f23e79900b0223d3741a87a9a5545c144252a077353b6" exitCode=0 Oct 08 20:44:43 crc kubenswrapper[4669]: I1008 20:44:43.354986 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"8abb50fc1491bed6db7f23e79900b0223d3741a87a9a5545c144252a077353b6"} Oct 08 20:44:43 crc kubenswrapper[4669]: I1008 20:44:43.355119 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 20:44:43 crc kubenswrapper[4669]: I1008 20:44:43.356371 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:44:43 crc kubenswrapper[4669]: I1008 20:44:43.356408 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:44:43 crc kubenswrapper[4669]: I1008 20:44:43.356430 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:44:43 crc kubenswrapper[4669]: I1008 20:44:43.358460 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9569ba2e70b947eea1e531ab7e8f1ac2e3441ade593dd48910407df766217d87"} Oct 08 20:44:43 crc kubenswrapper[4669]: I1008 20:44:43.358893 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0d615b49ade5de43393d40344c1b71733acedb541841b3ec34d6dd293e62f96c"} Oct 08 20:44:43 crc kubenswrapper[4669]: I1008 20:44:43.358978 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"07c33fe9c40fb9b53e940940c3fe2b8b63a94b0f867aa804d215cb3ba90d01c9"} Oct 08 20:44:44 crc kubenswrapper[4669]: I1008 20:44:44.266166 4669 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Oct 08 20:44:44 crc kubenswrapper[4669]: E1008 20:44:44.285179 4669 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.230:6443: connect: connection refused" interval="3.2s" Oct 08 20:44:44 crc kubenswrapper[4669]: I1008 20:44:44.364942 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8f2d8af11793121a84b4559833f410bd59a8bb122d88da0d3b55d7dcbbf57a9e"} Oct 08 20:44:44 crc kubenswrapper[4669]: I1008 20:44:44.365011 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Oct 08 20:44:44 crc kubenswrapper[4669]: I1008 20:44:44.366176 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:44:44 crc kubenswrapper[4669]: I1008 20:44:44.366211 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:44:44 crc kubenswrapper[4669]: I1008 20:44:44.366220 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:44:44 crc kubenswrapper[4669]: I1008 20:44:44.368125 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e76fd3bc937fc2e56c3d332e4d3822a2749d040c57cd94f6e3bcdcfd83c126bb"} Oct 08 20:44:44 crc kubenswrapper[4669]: I1008 20:44:44.368156 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8127834da98ef46a594a74cbfcc6ef779b8429046327546560b7b37085572c5e"} Oct 08 20:44:44 crc kubenswrapper[4669]: I1008 20:44:44.369848 4669 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488" exitCode=0 Oct 08 20:44:44 crc kubenswrapper[4669]: I1008 20:44:44.369928 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488"} Oct 08 20:44:44 crc kubenswrapper[4669]: I1008 20:44:44.370043 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 20:44:44 crc kubenswrapper[4669]: I1008 20:44:44.371065 4669 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:44:44 crc kubenswrapper[4669]: I1008 20:44:44.371101 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:44:44 crc kubenswrapper[4669]: I1008 20:44:44.371111 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:44:44 crc kubenswrapper[4669]: I1008 20:44:44.372725 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"53dea36f2d24de5cd6ea4ecea981514af7ab8f6b33ec55678b30d000ee19e113"} Oct 08 20:44:44 crc kubenswrapper[4669]: I1008 20:44:44.372750 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 20:44:44 crc kubenswrapper[4669]: I1008 20:44:44.373560 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:44:44 crc kubenswrapper[4669]: I1008 20:44:44.373590 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:44:44 crc kubenswrapper[4669]: I1008 20:44:44.373601 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:44:44 crc kubenswrapper[4669]: I1008 20:44:44.377354 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"3133419a0643dc2be6a13a30c87b23a13965c59841d991db7ec80d5e53ca2840"} Oct 08 20:44:44 crc kubenswrapper[4669]: I1008 20:44:44.377386 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"67e4a88a6084b96798f461c3427f491577d74e6da859263a8c59545395cf029a"} Oct 08 20:44:44 crc kubenswrapper[4669]: I1008 20:44:44.377397 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"524972a79ac73180ccc655f37054721fb478bf263c711e814c9b49cc4f1a76ef"} Oct 08 20:44:44 crc kubenswrapper[4669]: I1008 20:44:44.512230 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 20:44:44 crc kubenswrapper[4669]: I1008 20:44:44.513701 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:44:44 crc kubenswrapper[4669]: I1008 20:44:44.513758 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:44:44 crc kubenswrapper[4669]: I1008 20:44:44.513772 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:44:44 crc kubenswrapper[4669]: I1008 20:44:44.513803 4669 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 08 20:44:44 crc kubenswrapper[4669]: E1008 20:44:44.514394 4669 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.230:6443: connect: connection refused" node="crc" Oct 08 20:44:44 crc kubenswrapper[4669]: W1008 20:44:44.801181 4669 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Oct 08 20:44:44 crc kubenswrapper[4669]: E1008 20:44:44.801306 4669 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.230:6443: connect: connection refused" logger="UnhandledError" Oct 08 20:44:45 crc kubenswrapper[4669]: W1008 20:44:45.070253 4669 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Oct 08 20:44:45 crc kubenswrapper[4669]: E1008 20:44:45.070367 4669 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.230:6443: connect: connection refused" logger="UnhandledError" Oct 08 20:44:45 crc kubenswrapper[4669]: W1008 20:44:45.246192 4669 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Oct 08 20:44:45 crc kubenswrapper[4669]: E1008 20:44:45.246289 4669 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.230:6443: connect: connection refused" logger="UnhandledError" Oct 08 20:44:45 crc kubenswrapper[4669]: I1008 20:44:45.265435 4669 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Oct 08 20:44:45 crc kubenswrapper[4669]: W1008 20:44:45.304103 4669 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Oct 08 20:44:45 crc kubenswrapper[4669]: E1008 20:44:45.304215 4669 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.230:6443: connect: connection refused" logger="UnhandledError" Oct 08 20:44:45 crc kubenswrapper[4669]: I1008 20:44:45.381861 4669 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992" exitCode=0 Oct 08 20:44:45 crc kubenswrapper[4669]: I1008 20:44:45.381938 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992"} Oct 08 20:44:45 crc kubenswrapper[4669]: I1008 20:44:45.382061 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 20:44:45 crc kubenswrapper[4669]: I1008 20:44:45.382960 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:44:45 crc kubenswrapper[4669]: I1008 20:44:45.382992 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:44:45 crc kubenswrapper[4669]: I1008 20:44:45.383004 4669 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:44:45 crc kubenswrapper[4669]: I1008 20:44:45.386260 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"fef8d760c2a5682066865c2449fc9e7672fb796f76a1d8be961753e6d380b92a"} Oct 08 20:44:45 crc kubenswrapper[4669]: I1008 20:44:45.386303 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7d213380e32b3db218facfef313963d26689d2f0871d2a004a63380454fac8a0"} Oct 08 20:44:45 crc kubenswrapper[4669]: I1008 20:44:45.386314 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8f61c1793e6c95085b6964298f29b5f896451784046a6aee1c73bbda234a3bf1"} Oct 08 20:44:45 crc kubenswrapper[4669]: I1008 20:44:45.386352 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 20:44:45 crc kubenswrapper[4669]: I1008 20:44:45.386400 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 20:44:45 crc kubenswrapper[4669]: I1008 20:44:45.386446 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 20:44:45 crc kubenswrapper[4669]: I1008 20:44:45.386365 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 20:44:45 crc kubenswrapper[4669]: I1008 20:44:45.389502 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:44:45 crc kubenswrapper[4669]: I1008 20:44:45.389559 4669 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:44:45 crc kubenswrapper[4669]: I1008 20:44:45.389576 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:44:45 crc kubenswrapper[4669]: I1008 20:44:45.389590 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:44:45 crc kubenswrapper[4669]: I1008 20:44:45.389613 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:44:45 crc kubenswrapper[4669]: I1008 20:44:45.389627 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:44:45 crc kubenswrapper[4669]: I1008 20:44:45.389509 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:44:45 crc kubenswrapper[4669]: I1008 20:44:45.390294 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:44:45 crc kubenswrapper[4669]: I1008 20:44:45.390312 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:44:45 crc kubenswrapper[4669]: I1008 20:44:45.390354 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:44:45 crc kubenswrapper[4669]: I1008 20:44:45.390377 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:44:45 crc kubenswrapper[4669]: I1008 20:44:45.390390 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:44:45 crc kubenswrapper[4669]: I1008 20:44:45.899933 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 20:44:45 
crc kubenswrapper[4669]: I1008 20:44:45.947339 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 20:44:46 crc kubenswrapper[4669]: I1008 20:44:46.299637 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 20:44:46 crc kubenswrapper[4669]: I1008 20:44:46.395188 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3064f5dde5317ed6c1dba4ecdcf4da81c2451262d83e3e2826c6ebbfe1487ece"} Oct 08 20:44:46 crc kubenswrapper[4669]: I1008 20:44:46.395258 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 20:44:46 crc kubenswrapper[4669]: I1008 20:44:46.395369 4669 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 08 20:44:46 crc kubenswrapper[4669]: I1008 20:44:46.395455 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 20:44:46 crc kubenswrapper[4669]: I1008 20:44:46.395259 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a7f52f3d22574d0a01cdfd7b7a40caf1a6cf201dc719e35f40eae85a071286f0"} Oct 08 20:44:46 crc kubenswrapper[4669]: I1008 20:44:46.395599 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a6efb0bccc51deff2303655e7a8d3a6261a8b3c9425f6d94120cd1acf27fd7e5"} Oct 08 20:44:46 crc kubenswrapper[4669]: I1008 20:44:46.396796 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:44:46 crc kubenswrapper[4669]: I1008 20:44:46.396837 4669 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:44:46 crc kubenswrapper[4669]: I1008 20:44:46.396850 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:44:46 crc kubenswrapper[4669]: I1008 20:44:46.396825 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:44:46 crc kubenswrapper[4669]: I1008 20:44:46.396899 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:44:46 crc kubenswrapper[4669]: I1008 20:44:46.396989 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:44:47 crc kubenswrapper[4669]: I1008 20:44:47.404259 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c88256a72c667695563af6b37d01d958621c1ca6cbdaf474364bd6c8128c4409"} Oct 08 20:44:47 crc kubenswrapper[4669]: I1008 20:44:47.404323 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b13697e6470d481451982948653db44d08baa70466d010442534eaa249e58bae"} Oct 08 20:44:47 crc kubenswrapper[4669]: I1008 20:44:47.404328 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 20:44:47 crc kubenswrapper[4669]: I1008 20:44:47.404502 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 20:44:47 crc kubenswrapper[4669]: I1008 20:44:47.404502 4669 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 08 20:44:47 crc kubenswrapper[4669]: I1008 20:44:47.404778 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Oct 08 20:44:47 crc kubenswrapper[4669]: I1008 20:44:47.407511 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:44:47 crc kubenswrapper[4669]: I1008 20:44:47.407609 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:44:47 crc kubenswrapper[4669]: I1008 20:44:47.407632 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:44:47 crc kubenswrapper[4669]: I1008 20:44:47.408316 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:44:47 crc kubenswrapper[4669]: I1008 20:44:47.408381 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:44:47 crc kubenswrapper[4669]: I1008 20:44:47.408420 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:44:47 crc kubenswrapper[4669]: I1008 20:44:47.411198 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:44:47 crc kubenswrapper[4669]: I1008 20:44:47.411233 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:44:47 crc kubenswrapper[4669]: I1008 20:44:47.411245 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:44:47 crc kubenswrapper[4669]: I1008 20:44:47.715252 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 20:44:47 crc kubenswrapper[4669]: I1008 20:44:47.717147 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:44:47 crc kubenswrapper[4669]: I1008 20:44:47.717376 4669 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:44:47 crc kubenswrapper[4669]: I1008 20:44:47.717630 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:44:47 crc kubenswrapper[4669]: I1008 20:44:47.717841 4669 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 08 20:44:47 crc kubenswrapper[4669]: I1008 20:44:47.857644 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 20:44:48 crc kubenswrapper[4669]: I1008 20:44:48.004817 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 20:44:48 crc kubenswrapper[4669]: I1008 20:44:48.407096 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 20:44:48 crc kubenswrapper[4669]: I1008 20:44:48.407147 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 20:44:48 crc kubenswrapper[4669]: I1008 20:44:48.408607 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:44:48 crc kubenswrapper[4669]: I1008 20:44:48.408651 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:44:48 crc kubenswrapper[4669]: I1008 20:44:48.408609 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:44:48 crc kubenswrapper[4669]: I1008 20:44:48.408739 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:44:48 crc kubenswrapper[4669]: I1008 20:44:48.408668 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:44:48 crc 
kubenswrapper[4669]: I1008 20:44:48.408765 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:44:49 crc kubenswrapper[4669]: I1008 20:44:49.409482 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 20:44:49 crc kubenswrapper[4669]: I1008 20:44:49.410401 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:44:49 crc kubenswrapper[4669]: I1008 20:44:49.410425 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:44:49 crc kubenswrapper[4669]: I1008 20:44:49.410433 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:44:50 crc kubenswrapper[4669]: I1008 20:44:50.790886 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 08 20:44:50 crc kubenswrapper[4669]: I1008 20:44:50.791107 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 20:44:50 crc kubenswrapper[4669]: I1008 20:44:50.792416 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:44:50 crc kubenswrapper[4669]: I1008 20:44:50.792452 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:44:50 crc kubenswrapper[4669]: I1008 20:44:50.792466 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:44:50 crc kubenswrapper[4669]: I1008 20:44:50.815939 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 20:44:50 crc kubenswrapper[4669]: I1008 20:44:50.816248 4669 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Oct 08 20:44:50 crc kubenswrapper[4669]: I1008 20:44:50.817975 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:44:50 crc kubenswrapper[4669]: I1008 20:44:50.818028 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:44:50 crc kubenswrapper[4669]: I1008 20:44:50.818040 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:44:50 crc kubenswrapper[4669]: I1008 20:44:50.825036 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 20:44:51 crc kubenswrapper[4669]: E1008 20:44:51.410356 4669 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 08 20:44:51 crc kubenswrapper[4669]: I1008 20:44:51.414367 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 20:44:51 crc kubenswrapper[4669]: I1008 20:44:51.415238 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:44:51 crc kubenswrapper[4669]: I1008 20:44:51.415273 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:44:51 crc kubenswrapper[4669]: I1008 20:44:51.415289 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:44:52 crc kubenswrapper[4669]: I1008 20:44:52.290876 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 08 20:44:52 crc kubenswrapper[4669]: I1008 20:44:52.291143 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 
20:44:52 crc kubenswrapper[4669]: I1008 20:44:52.292555 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:44:52 crc kubenswrapper[4669]: I1008 20:44:52.292625 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:44:52 crc kubenswrapper[4669]: I1008 20:44:52.292645 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:44:52 crc kubenswrapper[4669]: I1008 20:44:52.877690 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 20:44:52 crc kubenswrapper[4669]: I1008 20:44:52.877966 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 20:44:52 crc kubenswrapper[4669]: I1008 20:44:52.879765 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:44:52 crc kubenswrapper[4669]: I1008 20:44:52.879817 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:44:52 crc kubenswrapper[4669]: I1008 20:44:52.879831 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:44:52 crc kubenswrapper[4669]: I1008 20:44:52.882660 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 20:44:53 crc kubenswrapper[4669]: I1008 20:44:53.420966 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 20:44:53 crc kubenswrapper[4669]: I1008 20:44:53.423398 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:44:53 crc kubenswrapper[4669]: I1008 
20:44:53.423459 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:44:53 crc kubenswrapper[4669]: I1008 20:44:53.423478 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:44:54 crc kubenswrapper[4669]: I1008 20:44:54.802051 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 08 20:44:54 crc kubenswrapper[4669]: I1008 20:44:54.802333 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 20:44:54 crc kubenswrapper[4669]: I1008 20:44:54.804312 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:44:54 crc kubenswrapper[4669]: I1008 20:44:54.804377 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:44:54 crc kubenswrapper[4669]: I1008 20:44:54.804404 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:44:55 crc kubenswrapper[4669]: I1008 20:44:55.877738 4669 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 08 20:44:55 crc kubenswrapper[4669]: I1008 20:44:55.877825 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 08 20:44:55 crc 
kubenswrapper[4669]: I1008 20:44:55.947879 4669 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="Get \"https://192.168.126.11:6443/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 08 20:44:55 crc kubenswrapper[4669]: I1008 20:44:55.947995 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 08 20:44:56 crc kubenswrapper[4669]: I1008 20:44:56.019709 4669 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:52396->192.168.126.11:17697: read: connection reset by peer" start-of-body= Oct 08 20:44:56 crc kubenswrapper[4669]: I1008 20:44:56.019791 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:52396->192.168.126.11:17697: read: connection reset by peer" Oct 08 20:44:56 crc kubenswrapper[4669]: I1008 20:44:56.266511 4669 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Oct 08 20:44:56 crc kubenswrapper[4669]: I1008 20:44:56.429661 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 08 20:44:56 crc kubenswrapper[4669]: I1008 20:44:56.431842 4669 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="fef8d760c2a5682066865c2449fc9e7672fb796f76a1d8be961753e6d380b92a" exitCode=255 Oct 08 20:44:56 crc kubenswrapper[4669]: I1008 20:44:56.431887 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"fef8d760c2a5682066865c2449fc9e7672fb796f76a1d8be961753e6d380b92a"} Oct 08 20:44:56 crc kubenswrapper[4669]: I1008 20:44:56.432058 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 20:44:56 crc kubenswrapper[4669]: I1008 20:44:56.432949 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:44:56 crc kubenswrapper[4669]: I1008 20:44:56.432986 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:44:56 crc kubenswrapper[4669]: I1008 20:44:56.432996 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:44:56 crc kubenswrapper[4669]: I1008 20:44:56.433663 4669 scope.go:117] "RemoveContainer" containerID="fef8d760c2a5682066865c2449fc9e7672fb796f76a1d8be961753e6d380b92a" Oct 08 20:44:57 crc kubenswrapper[4669]: E1008 20:44:57.054089 4669 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": net/http: TLS handshake timeout" event="&Event{ObjectMeta:{crc.186c9ee9c4fca78b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-08 20:44:41.264482187 +0000 UTC m=+0.957292890,LastTimestamp:2025-10-08 20:44:41.264482187 +0000 UTC m=+0.957292890,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 08 20:44:57 crc kubenswrapper[4669]: I1008 20:44:57.229280 4669 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 08 20:44:57 crc kubenswrapper[4669]: I1008 20:44:57.229344 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 08 20:44:57 crc kubenswrapper[4669]: I1008 20:44:57.438983 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 08 20:44:57 crc kubenswrapper[4669]: I1008 20:44:57.440958 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec"} Oct 08 20:44:57 crc kubenswrapper[4669]: I1008 20:44:57.441281 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 
20:44:57 crc kubenswrapper[4669]: I1008 20:44:57.442380 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:44:57 crc kubenswrapper[4669]: I1008 20:44:57.442430 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:44:57 crc kubenswrapper[4669]: I1008 20:44:57.442445 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:44:57 crc kubenswrapper[4669]: I1008 20:44:57.858132 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 20:44:58 crc kubenswrapper[4669]: I1008 20:44:58.443196 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 20:44:58 crc kubenswrapper[4669]: I1008 20:44:58.444209 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:44:58 crc kubenswrapper[4669]: I1008 20:44:58.444268 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:44:58 crc kubenswrapper[4669]: I1008 20:44:58.444287 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:00 crc kubenswrapper[4669]: I1008 20:45:00.952754 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 20:45:00 crc kubenswrapper[4669]: I1008 20:45:00.952931 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 20:45:00 crc kubenswrapper[4669]: I1008 20:45:00.954039 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:00 crc kubenswrapper[4669]: I1008 20:45:00.954102 4669 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:00 crc kubenswrapper[4669]: I1008 20:45:00.954117 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:00 crc kubenswrapper[4669]: I1008 20:45:00.959629 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 20:45:01 crc kubenswrapper[4669]: E1008 20:45:01.410487 4669 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 08 20:45:01 crc kubenswrapper[4669]: I1008 20:45:01.449579 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 20:45:01 crc kubenswrapper[4669]: I1008 20:45:01.450383 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:01 crc kubenswrapper[4669]: I1008 20:45:01.450444 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:01 crc kubenswrapper[4669]: I1008 20:45:01.450463 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:02 crc kubenswrapper[4669]: E1008 20:45:02.217807 4669 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.220511 4669 trace.go:236] Trace[1136753190]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (08-Oct-2025 20:44:49.404) (total time: 12815ms): Oct 08 20:45:02 crc kubenswrapper[4669]: Trace[1136753190]: ---"Objects listed" error: 12815ms (20:45:02.220) Oct 08 20:45:02 crc kubenswrapper[4669]: 
Trace[1136753190]: [12.815963914s] [12.815963914s] END Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.220564 4669 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Oct 08 20:45:02 crc kubenswrapper[4669]: E1008 20:45:02.221214 4669 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.221644 4669 trace.go:236] Trace[858456421]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (08-Oct-2025 20:44:50.654) (total time: 11566ms): Oct 08 20:45:02 crc kubenswrapper[4669]: Trace[858456421]: ---"Objects listed" error: 11566ms (20:45:02.221) Oct 08 20:45:02 crc kubenswrapper[4669]: Trace[858456421]: [11.566862494s] [11.566862494s] END Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.221682 4669 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.222887 4669 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.222968 4669 trace.go:236] Trace[1322344252]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (08-Oct-2025 20:44:49.181) (total time: 13041ms): Oct 08 20:45:02 crc kubenswrapper[4669]: Trace[1322344252]: ---"Objects listed" error: 13041ms (20:45:02.222) Oct 08 20:45:02 crc kubenswrapper[4669]: Trace[1322344252]: [13.041261729s] [13.041261729s] END Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.223001 4669 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.223203 4669 trace.go:236] Trace[1308925734]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (08-Oct-2025 
20:44:48.728) (total time: 13494ms): Oct 08 20:45:02 crc kubenswrapper[4669]: Trace[1308925734]: ---"Objects listed" error: 13494ms (20:45:02.223) Oct 08 20:45:02 crc kubenswrapper[4669]: Trace[1308925734]: [13.494549258s] [13.494549258s] END Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.223224 4669 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.266195 4669 apiserver.go:52] "Watching apiserver" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.271223 4669 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.271567 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c"] Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.272151 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.272429 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.272511 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.272588 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.272545 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:45:02 crc kubenswrapper[4669]: E1008 20:45:02.272778 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:45:02 crc kubenswrapper[4669]: E1008 20:45:02.272778 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.272811 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 08 20:45:02 crc kubenswrapper[4669]: E1008 20:45:02.272957 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.274023 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.275348 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.275497 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.275806 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.275854 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.275972 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.276329 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.276518 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.276582 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.281950 4669 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 08 20:45:02 crc kubenswrapper[4669]: 
I1008 20:45:02.323838 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.323879 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.323897 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.323921 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.323945 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.323962 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.323979 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.323994 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.324010 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.324120 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.324140 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: 
\"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.324157 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.324175 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.324201 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.324216 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.324230 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.324244 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.324262 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.324277 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.324296 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.324315 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.324330 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 08 20:45:02 crc 
kubenswrapper[4669]: I1008 20:45:02.324348 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.324363 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.324378 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.324394 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.324410 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.324397 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.324427 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.324499 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.324521 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.324552 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.324572 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: 
\"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.324589 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.324621 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.324643 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.324669 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.324695 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: 
\"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.324717 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.324740 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.324817 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.324848 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.324888 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.324912 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.324937 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.324960 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.324981 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.325001 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.325024 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 
08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.325046 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.325066 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.325094 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.325268 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.325611 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). 
InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.325658 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.325642 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.325741 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.325771 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.325809 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") 
" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.325835 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.325862 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.325887 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.325905 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.325911 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.325981 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.325990 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.326020 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.326054 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.326088 4669 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.326124 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.326131 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.326159 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.326167 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.326192 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.326219 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.326243 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.326272 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.326284 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.326299 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.326330 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.326362 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.326386 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.326410 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.326435 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.326458 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.326494 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.326499 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.326595 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.326624 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.326652 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.326683 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.326688 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.326711 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.326782 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.326813 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.326870 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.326905 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.326936 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.326988 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.327014 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.327037 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.327043 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.327060 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.327088 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.327111 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.327136 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.327161 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod 
\"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.327192 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.327216 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.327219 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.327241 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.327270 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.327297 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.327322 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.327344 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.327371 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.327415 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.327437 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.327441 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.327460 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.327483 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.327504 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.327545 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.327571 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.327595 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: 
\"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.327617 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.327640 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.327663 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.327685 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.327696 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.327709 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.327769 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.327798 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.327828 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.327857 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.327892 4669 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.327925 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.327948 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.327979 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.328009 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.328035 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: 
\"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.328061 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.328062 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.328092 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.328122 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.328147 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 
20:45:02.328173 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.328198 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.328224 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.328249 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.328276 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.328301 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod 
\"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.328322 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.328328 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.328382 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.328403 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.328425 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") 
" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.328444 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.328463 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.328480 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.328500 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.328517 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.328549 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.328587 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.328605 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.328622 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.328659 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.328683 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 
20:45:02.328755 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.328773 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.328790 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.328821 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.328837 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.328855 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.328872 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.328892 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.328909 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.328926 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.328945 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: 
\"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.328961 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.328991 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.329013 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.329035 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.329057 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.329076 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.329094 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.329109 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.329127 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.329144 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.329161 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 08 20:45:02 crc 
kubenswrapper[4669]: I1008 20:45:02.329202 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.329223 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.329240 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.329260 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.329281 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.329297 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.329315 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.329331 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.329348 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.329368 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.329382 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.329388 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.329444 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.329479 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.329512 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.329559 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.329586 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.329610 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.329642 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.329669 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.329693 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.329825 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: 
"5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.330209 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.332586 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.333311 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.333506 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.333784 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.333992 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.334263 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.334469 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.334924 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.335122 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.335289 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.335316 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.335571 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.335638 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.335674 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.335754 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.335769 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). 
InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.335835 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.335880 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.335955 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.335995 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.336024 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.336058 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.336091 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.336116 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.336143 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.336207 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.336240 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.336267 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.336291 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.336319 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:45:02 crc 
kubenswrapper[4669]: I1008 20:45:02.336347 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.336346 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.336373 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.336477 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.336498 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.336516 4669 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" 
(UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.336550 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.336566 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.336580 4669 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.336596 4669 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.336613 4669 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.336630 4669 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.336643 4669 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.336657 4669 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.336673 4669 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.336687 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.336701 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.336716 4669 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.336733 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.336748 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node 
\"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.336764 4669 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.336780 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.336782 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.336799 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.337025 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.337038 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.337280 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.337570 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.337955 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.337972 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.337985 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.338021 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.338155 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.338267 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.338319 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.338638 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.338650 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.344931 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.345371 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.345658 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.346270 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.346772 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.347092 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.348208 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.348834 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.352864 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: E1008 20:45:02.353025 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:45:02.852924389 +0000 UTC m=+22.545735062 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.353933 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.354162 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.354587 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.354644 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.354785 4669 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.355459 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.355508 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.356011 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.356081 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.356137 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.356646 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.356709 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.357030 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.357293 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.357437 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.358897 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.358959 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.359206 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.359359 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.359500 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.359511 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.359543 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.359611 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.360009 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.360666 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.360697 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.360901 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.361380 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.336795 4669 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.361633 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.361698 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.361935 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.362225 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). 
InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.362259 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.362323 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.362341 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.362670 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.362749 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.362775 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.362676 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.362801 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.363055 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.362848 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.363236 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.363351 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.363471 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.363482 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.364255 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.364377 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.364612 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.364853 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.364855 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.365109 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.365195 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.365337 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.365703 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.365769 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.365860 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.366089 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.366580 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.366930 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.367137 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.367292 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.367723 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.367754 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.368344 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.368600 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: E1008 20:45:02.372225 4669 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 20:45:02 crc kubenswrapper[4669]: E1008 20:45:02.374051 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 20:45:02.874031161 +0000 UTC m=+22.566841824 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.373416 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.372881 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.372808 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.372839 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.373073 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.374206 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.374244 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.374410 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.373096 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.373093 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.373954 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.375334 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.375562 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.375926 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.376896 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.377765 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: E1008 20:45:02.377886 4669 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.377951 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: E1008 20:45:02.378044 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-08 20:45:02.878006509 +0000 UTC m=+22.570817402 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.378658 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.379326 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.379663 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.379807 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.379926 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.380022 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.380357 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.380453 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.380851 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.381102 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.381318 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.381685 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.382016 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.382468 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.382513 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.382652 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.382684 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.382817 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.382869 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.382880 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.383366 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.383395 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.383519 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.383622 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.383846 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.383868 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.384730 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.385033 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.385115 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.386522 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.386891 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: E1008 20:45:02.387313 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 20:45:02 crc kubenswrapper[4669]: E1008 20:45:02.387978 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 20:45:02 crc kubenswrapper[4669]: E1008 20:45:02.387999 4669 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 20:45:02 crc kubenswrapper[4669]: E1008 20:45:02.388107 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-08 20:45:02.888073125 +0000 UTC m=+22.580884008 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.387662 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.387759 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.389446 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.390575 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.392043 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.393986 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: E1008 20:45:02.394323 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 20:45:02 crc kubenswrapper[4669]: E1008 20:45:02.394365 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 20:45:02 crc kubenswrapper[4669]: E1008 20:45:02.394383 4669 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 20:45:02 crc kubenswrapper[4669]: E1008 20:45:02.394446 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-08 20:45:02.894424202 +0000 UTC m=+22.587235085 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.396136 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.396191 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.396514 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.396829 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.396858 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.396883 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.396914 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.397051 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.397117 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.397168 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.397192 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.397394 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.398430 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.398865 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.399028 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.399151 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.399078 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.400691 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.402228 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.403389 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.404137 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.406924 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.413922 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.414063 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.415145 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.418755 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.423820 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.424545 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.441104 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.441969 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.455249 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 20:45:02 crc kubenswrapper[4669]: E1008 20:45:02.459763 4669 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.462336 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.462457 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.462566 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.463320 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.463680 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.463731 4669 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.463775 4669 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.463820 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.463839 
4669 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.463855 4669 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.463865 4669 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.463874 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.463884 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.463895 4669 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.463904 4669 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.463913 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.463922 4669 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.463932 4669 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.463942 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.463953 4669 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.463973 4669 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.463981 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.463989 4669 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath 
\"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.463998 4669 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464010 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464019 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464027 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464036 4669 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464047 4669 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464055 4669 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 
20:45:02.464064 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464075 4669 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464083 4669 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464091 4669 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464100 4669 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464110 4669 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464119 4669 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464134 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" 
(UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464142 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464153 4669 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464163 4669 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464172 4669 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464181 4669 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464193 4669 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464202 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: 
\"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464211 4669 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464223 4669 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464234 4669 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464371 4669 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464423 4669 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464436 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464446 4669 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464461 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464471 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464480 4669 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464489 4669 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464502 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464511 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464520 4669 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464545 4669 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464614 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464666 4669 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464681 4669 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464702 4669 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464716 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464729 4669 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" 
Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464743 4669 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464760 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464773 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464789 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464803 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464820 4669 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464833 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc 
kubenswrapper[4669]: I1008 20:45:02.464846 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464863 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464875 4669 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464896 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464909 4669 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464926 4669 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464939 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464953 4669 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464967 4669 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.464984 4669 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465001 4669 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465016 4669 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465029 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465045 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465057 4669 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465070 4669 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465086 4669 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465099 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465111 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465124 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465142 4669 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465157 4669 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath 
\"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465169 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465182 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465198 4669 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465211 4669 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465224 4669 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465238 4669 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465254 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465268 4669 
reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465280 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465296 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465307 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465318 4669 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465330 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465344 4669 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465355 4669 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465366 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465378 4669 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465392 4669 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465405 4669 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465419 4669 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465436 4669 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465448 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" 
DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465461 4669 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465472 4669 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465488 4669 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465501 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465512 4669 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465524 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465559 4669 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465570 4669 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" 
(UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465583 4669 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465596 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465611 4669 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465623 4669 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465635 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465648 4669 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465658 4669 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465668 4669 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465680 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465695 4669 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465707 4669 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465721 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465735 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465791 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath 
\"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465820 4669 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465841 4669 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465855 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465871 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465886 4669 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465899 4669 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465912 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465928 4669 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465961 4669 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465974 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.465990 4669 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.466003 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.466014 4669 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.466027 4669 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.466041 4669 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.466054 4669 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.466067 4669 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.466080 4669 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.466096 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.466108 4669 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.466120 4669 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.466133 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: 
\"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.466150 4669 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.466162 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.466175 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.466190 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.466203 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.466216 4669 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.466228 4669 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.466246 4669 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.466260 4669 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.466275 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.466289 4669 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.466305 4669 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.466317 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.466329 4669 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 
crc kubenswrapper[4669]: I1008 20:45:02.466345 4669 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.466359 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.466371 4669 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.467966 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.478193 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.488845 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.587517 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.596618 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.601184 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 08 20:45:02 crc kubenswrapper[4669]: W1008 20:45:02.604406 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-14d2d5ef46fab23544e35465f778de301a0e9f3c35b2351e35faf54ef276c3bc WatchSource:0}: Error finding container 14d2d5ef46fab23544e35465f778de301a0e9f3c35b2351e35faf54ef276c3bc: Status 404 returned error can't find the container with id 14d2d5ef46fab23544e35465f778de301a0e9f3c35b2351e35faf54ef276c3bc Oct 08 20:45:02 crc kubenswrapper[4669]: W1008 20:45:02.611467 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-85a18f69eeca0ac88bdb285fe0cb2f106ee25f55ae2725870e48f11d70e017ae WatchSource:0}: Error finding container 85a18f69eeca0ac88bdb285fe0cb2f106ee25f55ae2725870e48f11d70e017ae: Status 404 returned error can't find the container with id 85a18f69eeca0ac88bdb285fe0cb2f106ee25f55ae2725870e48f11d70e017ae Oct 08 20:45:02 crc kubenswrapper[4669]: W1008 20:45:02.627591 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-f8b4c918f9bfc48dd640e616930d0bf74244459993cca343da847a1d09fca781 WatchSource:0}: Error finding container f8b4c918f9bfc48dd640e616930d0bf74244459993cca343da847a1d09fca781: Status 404 returned error can't find the container with id f8b4c918f9bfc48dd640e616930d0bf74244459993cca343da847a1d09fca781 Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.869613 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:45:02 crc kubenswrapper[4669]: E1008 20:45:02.869844 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:45:03.869807493 +0000 UTC m=+23.562618166 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.883034 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.887006 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.893947 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.904245 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.917621 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.929439 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.932740 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.943718 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.955777 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.970515 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.970651 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.970683 4669 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:45:02 crc kubenswrapper[4669]: E1008 20:45:02.970687 4669 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 20:45:02 crc kubenswrapper[4669]: E1008 20:45:02.970787 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 20:45:03.970763911 +0000 UTC m=+23.663574584 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 20:45:02 crc kubenswrapper[4669]: E1008 20:45:02.970801 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 20:45:02 crc kubenswrapper[4669]: E1008 20:45:02.970821 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 20:45:02 crc kubenswrapper[4669]: E1008 20:45:02.970833 4669 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 20:45:02 crc kubenswrapper[4669]: E1008 20:45:02.970899 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-08 20:45:03.970878564 +0000 UTC m=+23.663689267 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 20:45:02 crc kubenswrapper[4669]: E1008 20:45:02.970907 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 20:45:02 crc kubenswrapper[4669]: E1008 20:45:02.970928 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 20:45:02 crc kubenswrapper[4669]: E1008 20:45:02.970942 4669 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 20:45:02 crc kubenswrapper[4669]: E1008 20:45:02.970965 4669 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 20:45:02 crc kubenswrapper[4669]: E1008 20:45:02.970976 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-08 20:45:03.970967797 +0000 UTC m=+23.663778590 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.970702 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:45:02 crc kubenswrapper[4669]: E1008 20:45:02.970994 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 20:45:03.970985707 +0000 UTC m=+23.663796380 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.973562 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0319f7-8ee3-4392-a36a-419161391db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7f52f3d22574d0a01cdfd7b7a40caf1a6cf201dc719e35f40eae85a071286f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3064f5dde5317ed6c1dba4ecdcf4da81c2451262d83e3e2826c6ebbfe1487ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13697e6470d481451982948653db44d08baa70466d010442534eaa249e58bae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"na
me\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c88256a72c667695563af6b37d01d958621c1ca6cbdaf474364bd6c8128c4409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6efb0bccc51deff2303655e7a8d3a6261a8b3c9425f6d94120cd1acf27fd7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 20:45:02 crc kubenswrapper[4669]: I1008 20:45:02.983665 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.001517 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0319f7-8ee3-4392-a36a-419161391db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7f52f3d22574d0a01cdfd7b7a40caf1a6cf201dc719e35f40eae85a071286f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3064f5dde5317ed6c1dba4ecdcf4da81c2451262d83e3e2826c6ebbfe1487ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13697e6470d481451982948653db44d08baa70466d010442534eaa249e58bae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c88256a72c667695563af6b37d01d958621c1ca6cbdaf474364bd6c8128c4409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6efb0bccc51deff2303655e7a8d3a6261a8b3c9425f6d94120cd1acf27fd7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.011865 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.021221 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.032604 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b822af4b-b157-4b05-9af4-7798315f365f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d615b49ade5de43393d40344c1b71733acedb541841b3ec34d6dd293e62f96c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07c33fe9c40fb9b53e940940c3fe2b8b63a94b0f867aa804d215cb3ba90d01c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569ba2e70b947eea1e531ab7e8f1ac2e3441ade593dd48910407df766217d87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f2d8af11793121a84b4559833f410bd59a8bb122d88da0d3b55d7dcbbf57a9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.044377 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.055148 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.065803 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.334506 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.335182 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 08 20:45:03 crc 
kubenswrapper[4669]: I1008 20:45:03.336855 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.337630 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.338855 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.339581 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.340376 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.341629 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.342339 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.343612 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 08 20:45:03 crc 
kubenswrapper[4669]: I1008 20:45:03.344334 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.345764 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.346421 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.347095 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.349153 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.349838 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.351506 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.352081 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.352674 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.353439 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.353926 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.354451 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.354914 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.355517 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.355983 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.356793 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.357371 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.357840 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.358417 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.358914 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.359358 4669 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.359455 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.364125 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.364760 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.365128 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.366570 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.367542 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.368045 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.369037 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.369673 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.370506 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.371114 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.372262 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.372832 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.373657 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.374153 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.374980 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.375747 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.376662 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.377148 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.378003 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.378584 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.379172 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.380177 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.455185 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.455722 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.457647 4669 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec" exitCode=255
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.457704 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec"}
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.457778 4669 scope.go:117] "RemoveContainer" containerID="fef8d760c2a5682066865c2449fc9e7672fb796f76a1d8be961753e6d380b92a"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.459810 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"0c2b44fd8fb3c01bbc8a1b2f5a3507af28b2aa79a3d6ab8e7de3945bbfd01e2d"}
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.459856 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"a472679f03ab86aa0a31a2ff3affe48d8e289a76db949bcc6ea10446fd08fdd7"}
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.459878 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"f8b4c918f9bfc48dd640e616930d0bf74244459993cca343da847a1d09fca781"}
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.460701 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"85a18f69eeca0ac88bdb285fe0cb2f106ee25f55ae2725870e48f11d70e017ae"}
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.462098 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"48c0e064d976a7c307fd13ec11ae76672cc1225b71a616f171626ee1f9a24531"}
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.462118 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"14d2d5ef46fab23544e35465f778de301a0e9f3c35b2351e35faf54ef276c3bc"}
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.477450 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:03Z is after 2025-08-24T17:21:41Z"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.490488 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:03Z is after 2025-08-24T17:21:41Z"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.504430 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:03Z is after 2025-08-24T17:21:41Z"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.518384 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.518426 4669 scope.go:117] "RemoveContainer" containerID="8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec"
Oct 08 20:45:03 crc kubenswrapper[4669]: E1008 20:45:03.518645 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.519733 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b822af4b-b157-4b05-9af4-7798315f365f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d615b49ade5de43393d40344c1b71733acedb541841b3ec34d6dd293e62f96c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07c33fe9c40fb9b53e940940c3fe2b8b63a94b0f867aa804d215cb3ba90d01c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569ba2e70b947eea1e531ab7e8f1ac2e3441ade593dd48910407df766217d87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f2d8af11793121a84b4559833f410bd59a8bb122d88da0d3b55d7dcbbf57a9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:03Z is after 2025-08-24T17:21:41Z"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.535667 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:03Z is after 2025-08-24T17:21:41Z"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.553407 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:03Z is after 2025-08-24T17:21:41Z"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.566482 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:03Z is after 2025-08-24T17:21:41Z"
Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.587280 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0319f7-8ee3-4392-a36a-419161391db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7f52f3d22574d0a01cdfd7b7a40caf1a6cf201dc719e35f40eae85a071286f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3064f5dde5317ed6c1dba4ecdcf4da81c2451262d83e3e2826c6ebbfe1487ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13697e6470d481451982948653db44d08baa70466d010442534eaa249e58bae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c88256a72c667695563af6b37d01d958621c1ca6cbdaf474364bd6c8128c4409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6efb0bccc51deff2303655e7a8d3a6261a8b3c9425f6d94120cd1acf27fd7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:03Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.599568 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b822af4b-b157-4b05-9af4-7798315f365f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d615b49ade5de43393d40344c1b71733acedb541841b3ec34d6dd293e62f96c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07c33fe9c40fb9b53e940940c3fe2b8b63a94b0f867aa804d215cb3ba90d01c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569ba2e70b947eea1e531ab7e8f1ac2e3441ade593dd48910407df766217d87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:
43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f2d8af11793121a84b4559833f410bd59a8bb122d88da0d3b55d7dcbbf57a9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:03Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.612366 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:03Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.613395 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-zcf2d"] Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.613787 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-zcf2d" Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.616162 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.616517 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.616920 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.625706 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:03Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.638946 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:03Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.657759 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0319f7-8ee3-4392-a36a-419161391db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7f52f3d22574d0a01cdfd7b7a40caf1a6cf201dc719e35f40eae85a071286f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3064f5dde5317ed6c1dba4ecdcf4da81c2451262d83e3e2826c6ebbfe1487ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13697e6470d481451982948653db44d08baa70466d010442534eaa249e58bae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c88256a72c667695563af6b37d01d958621c1ca6cbdaf474364bd6c8128c4409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6efb0bccc51deff2303655e7a8d3a6261a8b3c9425f6d94120cd1acf27fd7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:03Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.672202 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d080d327-7e4d-41af-aa15-0ce849523815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8127834da98ef46a594a74cbfcc6ef779b8429046327546560b7b37085572c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f61c1793e6c95085b6964298f29b5f896451784046a6aee1c73bbda234a3bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e76fd3bc937fc2e56c3d332e4d3822a2749d040c57cd94f6e3bcdcfd83c126bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fef8d760c2a5682066865c2449fc9e7672fb796f76a1d8be961753e6d380b92a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T20:44:56Z\\\",\\\"message\\\":\\\"W1008 20:44:45.245390 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 20:44:45.246561 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759956285 cert, and key in /tmp/serving-cert-3157989699/serving-signer.crt, /tmp/serving-cert-3157989699/serving-signer.key\\\\nI1008 20:44:45.581654 1 observer_polling.go:159] Starting file observer\\\\nW1008 20:44:45.584661 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1008 20:44:45.585560 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 20:44:45.588793 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3157989699/tls.crt::/tmp/serving-cert-3157989699/tls.key\\\\\\\"\\\\nF1008 20:44:56.015244 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"W1008 20:44:56.871605 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 20:44:56.872089 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759956296 cert, and key in /tmp/serving-cert-3424828285/serving-signer.crt, /tmp/serving-cert-3424828285/serving-signer.key\\\\nI1008 20:44:57.365674 1 observer_polling.go:159] Starting file observer\\\\nW1008 20:45:02.381062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1008 20:45:02.381192 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 20:45:02.381876 1 
dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3424828285/tls.crt::/tmp/serving-cert-3424828285/tls.key\\\\\\\"\\\\nI1008 20:45:02.718633 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 20:45:02.726325 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 20:45:02.726358 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 20:45:02.726380 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 20:45:02.726384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 20:45:02.731456 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 20:45:02.731985 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 20:45:02.731867 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 20:45:02.733228 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d213380e32b3db218facfef313963d26689d2f0871d2a004a63380454fac8a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:03Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.684997 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c0e064d976a7c307fd13ec11ae76672cc1225b71a616f171626ee1f9a24531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:03Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.698036 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:03Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.711488 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c2b44fd8fb3c01bbc8a1b2f5a3507af28b2aa79a3d6ab8e7de3945bbfd01e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a472679f03ab86aa0a31a2ff3affe48d8e289a76db949bcc6ea10446fd08fdd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:03Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.747745 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0319f7-8ee3-4392-a36a-419161391db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7f52f3d22574d0a01cdfd7b7a40caf1a6cf201dc719e35f40eae85a071286f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3064f5dde5317ed6c1dba4ecdcf4da81c2451262d83e3e2826c6ebbfe1487ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13697e6470d481451982948653db44d08baa70466d010442534eaa249e58bae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c88256a72c667695563af6b37d01d958621c1ca6cbdaf474364bd6c8128c4409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6efb0bccc51deff2303655e7a8d3a6261a8b3c9425f6d94120cd1acf27fd7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:03Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.779412 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flsl6\" (UniqueName: \"kubernetes.io/projected/a016bee1-2c29-46bb-b3b8-841c4a65e162-kube-api-access-flsl6\") pod \"node-resolver-zcf2d\" (UID: \"a016bee1-2c29-46bb-b3b8-841c4a65e162\") " pod="openshift-dns/node-resolver-zcf2d" Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.779499 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a016bee1-2c29-46bb-b3b8-841c4a65e162-hosts-file\") pod \"node-resolver-zcf2d\" (UID: \"a016bee1-2c29-46bb-b3b8-841c4a65e162\") " pod="openshift-dns/node-resolver-zcf2d" Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.787951 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d080d327-7e4d-41af-aa15-0ce849523815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8127834da98ef46a594a74cbfcc6ef779b8429046327546560b7b37085572c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f61c1793e6c95085b6964298f29b5f896451784046a6aee1c73bbda234a3bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd3bc937fc2e56c3d332e4d3822a2749d040c57cd94f6e3bcdcfd83c126bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fef8d760c2a5682066865c2449fc9e7672fb796f76a1d8be961753e6d380b92a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T20:44:56Z\\\",\\\"message\\\":\\\"W1008 20:44:45.245390 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 20:44:45.246561 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759956285 cert, and key in /tmp/serving-cert-3157989699/serving-signer.crt, /tmp/serving-cert-3157989699/serving-signer.key\\\\nI1008 20:44:45.581654 1 observer_polling.go:159] Starting file observer\\\\nW1008 20:44:45.584661 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1008 20:44:45.585560 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 20:44:45.588793 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3157989699/tls.crt::/tmp/serving-cert-3157989699/tls.key\\\\\\\"\\\\nF1008 20:44:56.015244 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"W1008 20:44:56.871605 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 20:44:56.872089 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759956296 cert, and key in /tmp/serving-cert-3424828285/serving-signer.crt, /tmp/serving-cert-3424828285/serving-signer.key\\\\nI1008 20:44:57.365674 1 observer_polling.go:159] Starting file observer\\\\nW1008 20:45:02.381062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1008 20:45:02.381192 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 20:45:02.381876 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3424828285/tls.crt::/tmp/serving-cert-3424828285/tls.key\\\\\\\"\\\\nI1008 20:45:02.718633 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 20:45:02.726325 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 20:45:02.726358 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 20:45:02.726380 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 20:45:02.726384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 20:45:02.731456 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 20:45:02.731985 
1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 20:45:02.731867 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 20:45:02.733228 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d213380e32b3db218facfef313963d26689d2f0871d2a004a63380454fac8a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181b
d6e9a3cc6aee8144d14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:03Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.811468 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c0e064d976a7c307fd13ec11ae76672cc1225b71a616f171626ee1f9a24531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:03Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.823516 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:03Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.838674 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c2b44fd8fb3c01bbc8a1b2f5a3507af28b2aa79a3d6ab8e7de3945bbfd01e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a472679f03ab86aa0a31a2ff3affe48d8e289a76db949bcc6ea10446fd08fdd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:03Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.853077 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:03Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.867278 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:03Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.878318 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:03Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.880129 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.880196 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flsl6\" (UniqueName: \"kubernetes.io/projected/a016bee1-2c29-46bb-b3b8-841c4a65e162-kube-api-access-flsl6\") pod \"node-resolver-zcf2d\" (UID: \"a016bee1-2c29-46bb-b3b8-841c4a65e162\") " pod="openshift-dns/node-resolver-zcf2d" Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.880273 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a016bee1-2c29-46bb-b3b8-841c4a65e162-hosts-file\") pod \"node-resolver-zcf2d\" (UID: \"a016bee1-2c29-46bb-b3b8-841c4a65e162\") " pod="openshift-dns/node-resolver-zcf2d" Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.880345 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a016bee1-2c29-46bb-b3b8-841c4a65e162-hosts-file\") pod \"node-resolver-zcf2d\" (UID: \"a016bee1-2c29-46bb-b3b8-841c4a65e162\") " pod="openshift-dns/node-resolver-zcf2d" Oct 08 20:45:03 crc kubenswrapper[4669]: E1008 20:45:03.880360 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:45:05.880333198 +0000 UTC m=+25.573143871 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.889340 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zcf2d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a016bee1-2c29-46bb-b3b8-841c4a65e162\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flsl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zcf2d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:03Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.895238 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flsl6\" (UniqueName: \"kubernetes.io/projected/a016bee1-2c29-46bb-b3b8-841c4a65e162-kube-api-access-flsl6\") pod \"node-resolver-zcf2d\" (UID: \"a016bee1-2c29-46bb-b3b8-841c4a65e162\") " pod="openshift-dns/node-resolver-zcf2d" Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.902656 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b822af4b-b157-4b05-9af4-7798315f365f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d615b49ade5de43393d40344c1b71733acedb541841b3ec34d6dd293e62f96c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07c33fe9c40fb9b53e940940c3fe2b8b63a94b0f867aa804d215cb3ba90d01c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569ba2e70b947eea1e531ab7e8f1ac2e3441ade593dd48910407df766217d87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f2d8af11793121a84b4559833f410bd59a8bb122d88da0d3b55d7dcbbf57a9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\"
:{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:03Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.925284 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-zcf2d" Oct 08 20:45:03 crc kubenswrapper[4669]: W1008 20:45:03.937107 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda016bee1_2c29_46bb_b3b8_841c4a65e162.slice/crio-12697f65b06abf476aa3b146ade1cb7e0555cc6956091fdbbe5350a3fd557ac8 WatchSource:0}: Error finding container 12697f65b06abf476aa3b146ade1cb7e0555cc6956091fdbbe5350a3fd557ac8: Status 404 returned error can't find the container with id 12697f65b06abf476aa3b146ade1cb7e0555cc6956091fdbbe5350a3fd557ac8 Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.981235 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.981282 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:45:03 crc kubenswrapper[4669]: E1008 20:45:03.981406 4669 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 20:45:03 crc kubenswrapper[4669]: E1008 20:45:03.981424 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 20:45:03 
crc kubenswrapper[4669]: E1008 20:45:03.981450 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 20:45:03 crc kubenswrapper[4669]: E1008 20:45:03.981458 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 20:45:05.98144543 +0000 UTC m=+25.674256103 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 20:45:03 crc kubenswrapper[4669]: E1008 20:45:03.981464 4669 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 20:45:03 crc kubenswrapper[4669]: E1008 20:45:03.981506 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-08 20:45:05.981489971 +0000 UTC m=+25.674300664 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 20:45:03 crc kubenswrapper[4669]: E1008 20:45:03.981676 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 20:45:03 crc kubenswrapper[4669]: E1008 20:45:03.981717 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 20:45:03 crc kubenswrapper[4669]: E1008 20:45:03.981737 4669 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 20:45:03 crc kubenswrapper[4669]: E1008 20:45:03.981815 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-08 20:45:05.98179171 +0000 UTC m=+25.674602433 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.982714 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:45:03 crc kubenswrapper[4669]: I1008 20:45:03.982766 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:45:03 crc kubenswrapper[4669]: E1008 20:45:03.982924 4669 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 20:45:03 crc kubenswrapper[4669]: E1008 20:45:03.982987 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 20:45:05.982972535 +0000 UTC m=+25.675783278 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.006502 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-bfcvh"] Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.007183 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-hw2kf"] Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.007410 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-bfcvh" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.007927 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.011820 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.011823 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.011907 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.012002 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-gpzdw"] Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.012032 4669 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.012053 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.012140 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.012194 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.012229 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.012250 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.012568 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.012866 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-klx9r"] Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.013086 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-klx9r" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.013288 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.016521 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.016539 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.016685 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.016700 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.016800 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.016859 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.016908 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.017098 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.017218 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.027264 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.040226 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c2b44fd8fb3c01bbc8a1b2f5a3507af28b2aa79a3d6ab8e7de3945bbfd01e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a472679f03ab86aa0a31a2ff3affe48d8e289a76db949bcc6ea10446fd08fdd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.062442 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0319f7-8ee3-4392-a36a-419161391db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7f52f3d22574d0a01cdfd7b7a40caf1a6cf201dc719e35f40eae85a071286f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3064f5dde5317ed6c1dba4ecdcf4da81c2451262d83e3e2826c6ebbfe1487ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13697e6470d481451982948653db44d08baa70466d010442534eaa249e58bae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c88256a72c667695563af6b37d01d958621c1ca6cbdaf474364bd6c8128c4409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6efb0bccc51deff2303655e7a8d3a6261a8b3c9425f6d94120cd1acf27fd7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.078081 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d080d327-7e4d-41af-aa15-0ce849523815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8127834da98ef46a594a74cbfcc6ef779b8429046327546560b7b37085572c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f61c1793e6c95085b6964298f29b5f896451784046a6aee1c73bbda234a3bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e76fd3bc937fc2e56c3d332e4d3822a2749d040c57cd94f6e3bcdcfd83c126bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fef8d760c2a5682066865c2449fc9e7672fb796f76a1d8be961753e6d380b92a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T20:44:56Z\\\",\\\"message\\\":\\\"W1008 20:44:45.245390 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 20:44:45.246561 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759956285 cert, and key in /tmp/serving-cert-3157989699/serving-signer.crt, /tmp/serving-cert-3157989699/serving-signer.key\\\\nI1008 20:44:45.581654 1 observer_polling.go:159] Starting file observer\\\\nW1008 20:44:45.584661 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1008 20:44:45.585560 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 20:44:45.588793 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3157989699/tls.crt::/tmp/serving-cert-3157989699/tls.key\\\\\\\"\\\\nF1008 20:44:56.015244 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"W1008 20:44:56.871605 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 20:44:56.872089 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759956296 cert, and key in /tmp/serving-cert-3424828285/serving-signer.crt, /tmp/serving-cert-3424828285/serving-signer.key\\\\nI1008 20:44:57.365674 1 observer_polling.go:159] Starting file observer\\\\nW1008 20:45:02.381062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1008 20:45:02.381192 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 20:45:02.381876 1 
dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3424828285/tls.crt::/tmp/serving-cert-3424828285/tls.key\\\\\\\"\\\\nI1008 20:45:02.718633 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 20:45:02.726325 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 20:45:02.726358 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 20:45:02.726380 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 20:45:02.726384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 20:45:02.731456 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 20:45:02.731985 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 20:45:02.731867 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 20:45:02.733228 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d213380e32b3db218facfef313963d26689d2f0871d2a004a63380454fac8a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.093336 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c0e064d976a7c307fd13ec11ae76672cc1225b71a616f171626ee1f9a24531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.105966 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.116784 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zcf2d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a016bee1-2c29-46bb-b3b8-841c4a65e162\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flsl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zcf2d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.133690 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bfcvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bfcvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.148013 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b822af4b-b157-4b05-9af4-7798315f365f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d615b49ade5de43393d40344c1b71733acedb541841b3ec34d6dd293e62f96c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed
21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07c33fe9c40fb9b53e940940c3fe2b8b63a94b0f867aa804d215cb3ba90d01c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569ba2e70b947eea1e531ab7e8f1ac2e3441ade593dd48910407df766217d87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f2d8af11793121a84b4559833f410bd59a8bb122d88da0d3b55d7dcbbf57a9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.162685 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.176440 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.184769 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2433400c-98f8-490f-a566-00a330a738fe-multus-daemon-config\") pod \"multus-klx9r\" (UID: \"2433400c-98f8-490f-a566-00a330a738fe\") " pod="openshift-multus/multus-klx9r" Oct 08 
20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.184812 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/39c9bcf2-9580-4534-8c7e-886bd4aff469-mcd-auth-proxy-config\") pod \"machine-config-daemon-hw2kf\" (UID: \"39c9bcf2-9580-4534-8c7e-886bd4aff469\") " pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.184842 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jwbv\" (UniqueName: \"kubernetes.io/projected/b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c-kube-api-access-4jwbv\") pod \"multus-additional-cni-plugins-bfcvh\" (UID: \"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\") " pod="openshift-multus/multus-additional-cni-plugins-bfcvh" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.184863 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-log-socket\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.184883 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-host-cni-netd\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.184914 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2433400c-98f8-490f-a566-00a330a738fe-os-release\") pod \"multus-klx9r\" (UID: 
\"2433400c-98f8-490f-a566-00a330a738fe\") " pod="openshift-multus/multus-klx9r" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.184940 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-node-log\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.184963 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/39c9bcf2-9580-4534-8c7e-886bd4aff469-rootfs\") pod \"machine-config-daemon-hw2kf\" (UID: \"39c9bcf2-9580-4534-8c7e-886bd4aff469\") " pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.184982 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-host-run-netns\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.185002 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-var-lib-openvswitch\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.185018 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2433400c-98f8-490f-a566-00a330a738fe-cni-binary-copy\") pod \"multus-klx9r\" 
(UID: \"2433400c-98f8-490f-a566-00a330a738fe\") " pod="openshift-multus/multus-klx9r" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.185035 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2433400c-98f8-490f-a566-00a330a738fe-host-run-k8s-cni-cncf-io\") pod \"multus-klx9r\" (UID: \"2433400c-98f8-490f-a566-00a330a738fe\") " pod="openshift-multus/multus-klx9r" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.185057 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2433400c-98f8-490f-a566-00a330a738fe-host-var-lib-kubelet\") pod \"multus-klx9r\" (UID: \"2433400c-98f8-490f-a566-00a330a738fe\") " pod="openshift-multus/multus-klx9r" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.185151 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c-os-release\") pod \"multus-additional-cni-plugins-bfcvh\" (UID: \"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\") " pod="openshift-multus/multus-additional-cni-plugins-bfcvh" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.185210 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-run-systemd\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.185229 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-host-run-ovn-kubernetes\") pod 
\"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.185257 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.185282 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2433400c-98f8-490f-a566-00a330a738fe-host-var-lib-cni-bin\") pod \"multus-klx9r\" (UID: \"2433400c-98f8-490f-a566-00a330a738fe\") " pod="openshift-multus/multus-klx9r" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.185339 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2433400c-98f8-490f-a566-00a330a738fe-host-var-lib-cni-multus\") pod \"multus-klx9r\" (UID: \"2433400c-98f8-490f-a566-00a330a738fe\") " pod="openshift-multus/multus-klx9r" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.185387 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2433400c-98f8-490f-a566-00a330a738fe-multus-conf-dir\") pod \"multus-klx9r\" (UID: \"2433400c-98f8-490f-a566-00a330a738fe\") " pod="openshift-multus/multus-klx9r" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.185434 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-run-openvswitch\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.185454 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2433400c-98f8-490f-a566-00a330a738fe-system-cni-dir\") pod \"multus-klx9r\" (UID: \"2433400c-98f8-490f-a566-00a330a738fe\") " pod="openshift-multus/multus-klx9r" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.185470 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2433400c-98f8-490f-a566-00a330a738fe-multus-cni-dir\") pod \"multus-klx9r\" (UID: \"2433400c-98f8-490f-a566-00a330a738fe\") " pod="openshift-multus/multus-klx9r" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.185488 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-host-kubelet\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.185509 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c-cni-binary-copy\") pod \"multus-additional-cni-plugins-bfcvh\" (UID: \"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\") " pod="openshift-multus/multus-additional-cni-plugins-bfcvh" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.185547 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-host-cni-bin\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.185573 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2433400c-98f8-490f-a566-00a330a738fe-multus-socket-dir-parent\") pod \"multus-klx9r\" (UID: \"2433400c-98f8-490f-a566-00a330a738fe\") " pod="openshift-multus/multus-klx9r" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.185610 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/39c9bcf2-9580-4534-8c7e-886bd4aff469-proxy-tls\") pod \"machine-config-daemon-hw2kf\" (UID: \"39c9bcf2-9580-4534-8c7e-886bd4aff469\") " pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.185629 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c-system-cni-dir\") pod \"multus-additional-cni-plugins-bfcvh\" (UID: \"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\") " pod="openshift-multus/multus-additional-cni-plugins-bfcvh" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.185645 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2433400c-98f8-490f-a566-00a330a738fe-etc-kubernetes\") pod \"multus-klx9r\" (UID: \"2433400c-98f8-490f-a566-00a330a738fe\") " pod="openshift-multus/multus-klx9r" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.185664 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bfcvh\" (UID: \"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\") " pod="openshift-multus/multus-additional-cni-plugins-bfcvh" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.185685 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bfcvh\" (UID: \"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\") " pod="openshift-multus/multus-additional-cni-plugins-bfcvh" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.185708 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-ovn-node-metrics-cert\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.185729 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2433400c-98f8-490f-a566-00a330a738fe-host-run-netns\") pod \"multus-klx9r\" (UID: \"2433400c-98f8-490f-a566-00a330a738fe\") " pod="openshift-multus/multus-klx9r" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.185752 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vwq5\" (UniqueName: \"kubernetes.io/projected/39c9bcf2-9580-4534-8c7e-886bd4aff469-kube-api-access-9vwq5\") pod \"machine-config-daemon-hw2kf\" (UID: \"39c9bcf2-9580-4534-8c7e-886bd4aff469\") " pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" Oct 08 20:45:04 crc kubenswrapper[4669]: 
I1008 20:45:04.185798 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-ovnkube-config\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.185834 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2433400c-98f8-490f-a566-00a330a738fe-hostroot\") pod \"multus-klx9r\" (UID: \"2433400c-98f8-490f-a566-00a330a738fe\") " pod="openshift-multus/multus-klx9r" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.185883 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c-cnibin\") pod \"multus-additional-cni-plugins-bfcvh\" (UID: \"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\") " pod="openshift-multus/multus-additional-cni-plugins-bfcvh" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.185903 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-host-slash\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.185921 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-etc-openvswitch\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 
20:45:04.185939 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2433400c-98f8-490f-a566-00a330a738fe-cnibin\") pod \"multus-klx9r\" (UID: \"2433400c-98f8-490f-a566-00a330a738fe\") " pod="openshift-multus/multus-klx9r" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.185965 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2433400c-98f8-490f-a566-00a330a738fe-host-run-multus-certs\") pod \"multus-klx9r\" (UID: \"2433400c-98f8-490f-a566-00a330a738fe\") " pod="openshift-multus/multus-klx9r" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.185985 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-run-ovn\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.186003 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-env-overrides\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.186026 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-systemd-units\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.186048 4669 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-ovnkube-script-lib\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.186070 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zqxk\" (UniqueName: \"kubernetes.io/projected/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-kube-api-access-4zqxk\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.186093 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkdcd\" (UniqueName: \"kubernetes.io/projected/2433400c-98f8-490f-a566-00a330a738fe-kube-api-access-gkdcd\") pod \"multus-klx9r\" (UID: \"2433400c-98f8-490f-a566-00a330a738fe\") " pod="openshift-multus/multus-klx9r" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.192830 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d080d327-7e4d-41af-aa15-0ce849523815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8127834da98ef46a594a74cbfcc6ef779b8429046327546560b7b37085572c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f61c1793e6c95085b6964298f29b5f896451784046a6aee1c73bbda234a3bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd3bc937fc2e56c3d332e4d3822a2749d040c57cd94f6e3bcdcfd83c126bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fef8d760c2a5682066865c2449fc9e7672fb796f76a1d8be961753e6d380b92a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T20:44:56Z\\\",\\\"message\\\":\\\"W1008 20:44:45.245390 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 20:44:45.246561 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759956285 cert, and key in /tmp/serving-cert-3157989699/serving-signer.crt, 
/tmp/serving-cert-3157989699/serving-signer.key\\\\nI1008 20:44:45.581654 1 observer_polling.go:159] Starting file observer\\\\nW1008 20:44:45.584661 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1008 20:44:45.585560 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 20:44:45.588793 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3157989699/tls.crt::/tmp/serving-cert-3157989699/tls.key\\\\\\\"\\\\nF1008 20:44:56.015244 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"W1008 20:44:56.871605 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 20:44:56.872089 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759956296 cert, and key in /tmp/serving-cert-3424828285/serving-signer.crt, /tmp/serving-cert-3424828285/serving-signer.key\\\\nI1008 20:44:57.365674 1 observer_polling.go:159] Starting file observer\\\\nW1008 20:45:02.381062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1008 20:45:02.381192 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 20:45:02.381876 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3424828285/tls.crt::/tmp/serving-cert-3424828285/tls.key\\\\\\\"\\\\nI1008 20:45:02.718633 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 20:45:02.726325 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 20:45:02.726358 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 20:45:02.726380 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 20:45:02.726384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 20:45:02.731456 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 20:45:02.731985 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 20:45:02.731867 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 20:45:02.733228 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d213380e32b3db218facfef313963d26689d2f0871d2a004a63380454fac8a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.207140 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c0e064d976a7c307fd13ec11ae76672cc1225b71a616f171626ee1f9a24531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.226305 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpzdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.238439 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b822af4b-b157-4b05-9af4-7798315f365f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d615b49ade5de43393d40344c1b71733acedb541841b3ec34d6dd293e62f96c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b08
4652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07c33fe9c40fb9b53e940940c3fe2b8b63a94b0f867aa804d215cb3ba90d01c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569ba2e70b947eea1e531ab7e8f1ac2e3441ade593dd48910407df766217d87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":
{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f2d8af11793121a84b4559833f410bd59a8bb122d88da0d3b55d7dcbbf57a9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.253067 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.271909 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bfcvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bfcvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.285589 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-klx9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2433400c-98f8-490f-a566-00a330a738fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-klx9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.286922 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2433400c-98f8-490f-a566-00a330a738fe-multus-conf-dir\") pod \"multus-klx9r\" (UID: \"2433400c-98f8-490f-a566-00a330a738fe\") " pod="openshift-multus/multus-klx9r" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.286983 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.287022 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2433400c-98f8-490f-a566-00a330a738fe-host-var-lib-cni-bin\") pod \"multus-klx9r\" (UID: \"2433400c-98f8-490f-a566-00a330a738fe\") " pod="openshift-multus/multus-klx9r" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.287058 4669 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2433400c-98f8-490f-a566-00a330a738fe-host-var-lib-cni-multus\") pod \"multus-klx9r\" (UID: \"2433400c-98f8-490f-a566-00a330a738fe\") " pod="openshift-multus/multus-klx9r" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.287085 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.287108 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2433400c-98f8-490f-a566-00a330a738fe-multus-conf-dir\") pod \"multus-klx9r\" (UID: \"2433400c-98f8-490f-a566-00a330a738fe\") " pod="openshift-multus/multus-klx9r" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.287113 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-run-openvswitch\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.287160 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2433400c-98f8-490f-a566-00a330a738fe-host-var-lib-cni-multus\") pod \"multus-klx9r\" (UID: \"2433400c-98f8-490f-a566-00a330a738fe\") " pod="openshift-multus/multus-klx9r" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.287178 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-run-openvswitch\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.287192 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2433400c-98f8-490f-a566-00a330a738fe-system-cni-dir\") pod \"multus-klx9r\" (UID: \"2433400c-98f8-490f-a566-00a330a738fe\") " pod="openshift-multus/multus-klx9r" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.287202 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2433400c-98f8-490f-a566-00a330a738fe-host-var-lib-cni-bin\") pod \"multus-klx9r\" (UID: \"2433400c-98f8-490f-a566-00a330a738fe\") " pod="openshift-multus/multus-klx9r" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.287213 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2433400c-98f8-490f-a566-00a330a738fe-multus-cni-dir\") pod \"multus-klx9r\" (UID: \"2433400c-98f8-490f-a566-00a330a738fe\") " pod="openshift-multus/multus-klx9r" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.287304 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-host-kubelet\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.287334 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c-cni-binary-copy\") pod 
\"multus-additional-cni-plugins-bfcvh\" (UID: \"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\") " pod="openshift-multus/multus-additional-cni-plugins-bfcvh" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.287359 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-host-cni-bin\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.287365 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2433400c-98f8-490f-a566-00a330a738fe-system-cni-dir\") pod \"multus-klx9r\" (UID: \"2433400c-98f8-490f-a566-00a330a738fe\") " pod="openshift-multus/multus-klx9r" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.287396 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2433400c-98f8-490f-a566-00a330a738fe-multus-socket-dir-parent\") pod \"multus-klx9r\" (UID: \"2433400c-98f8-490f-a566-00a330a738fe\") " pod="openshift-multus/multus-klx9r" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.287423 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-host-kubelet\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.287426 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/39c9bcf2-9580-4534-8c7e-886bd4aff469-proxy-tls\") pod \"machine-config-daemon-hw2kf\" (UID: \"39c9bcf2-9580-4534-8c7e-886bd4aff469\") " 
pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.287468 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-host-cni-bin\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.287512 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c-system-cni-dir\") pod \"multus-additional-cni-plugins-bfcvh\" (UID: \"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\") " pod="openshift-multus/multus-additional-cni-plugins-bfcvh" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.287568 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c-system-cni-dir\") pod \"multus-additional-cni-plugins-bfcvh\" (UID: \"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\") " pod="openshift-multus/multus-additional-cni-plugins-bfcvh" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.287539 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2433400c-98f8-490f-a566-00a330a738fe-multus-socket-dir-parent\") pod \"multus-klx9r\" (UID: \"2433400c-98f8-490f-a566-00a330a738fe\") " pod="openshift-multus/multus-klx9r" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.287589 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2433400c-98f8-490f-a566-00a330a738fe-etc-kubernetes\") pod \"multus-klx9r\" (UID: \"2433400c-98f8-490f-a566-00a330a738fe\") " pod="openshift-multus/multus-klx9r" Oct 08 
20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.287623 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2433400c-98f8-490f-a566-00a330a738fe-etc-kubernetes\") pod \"multus-klx9r\" (UID: \"2433400c-98f8-490f-a566-00a330a738fe\") " pod="openshift-multus/multus-klx9r" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.287519 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2433400c-98f8-490f-a566-00a330a738fe-multus-cni-dir\") pod \"multus-klx9r\" (UID: \"2433400c-98f8-490f-a566-00a330a738fe\") " pod="openshift-multus/multus-klx9r" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.287629 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vwq5\" (UniqueName: \"kubernetes.io/projected/39c9bcf2-9580-4534-8c7e-886bd4aff469-kube-api-access-9vwq5\") pod \"machine-config-daemon-hw2kf\" (UID: \"39c9bcf2-9580-4534-8c7e-886bd4aff469\") " pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.287773 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bfcvh\" (UID: \"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\") " pod="openshift-multus/multus-additional-cni-plugins-bfcvh" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.287803 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bfcvh\" (UID: \"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\") " pod="openshift-multus/multus-additional-cni-plugins-bfcvh" Oct 08 20:45:04 crc 
kubenswrapper[4669]: I1008 20:45:04.287830 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-ovn-node-metrics-cert\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.287852 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2433400c-98f8-490f-a566-00a330a738fe-host-run-netns\") pod \"multus-klx9r\" (UID: \"2433400c-98f8-490f-a566-00a330a738fe\") " pod="openshift-multus/multus-klx9r" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.287880 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-ovnkube-config\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.287903 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2433400c-98f8-490f-a566-00a330a738fe-hostroot\") pod \"multus-klx9r\" (UID: \"2433400c-98f8-490f-a566-00a330a738fe\") " pod="openshift-multus/multus-klx9r" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.287926 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2433400c-98f8-490f-a566-00a330a738fe-host-run-multus-certs\") pod \"multus-klx9r\" (UID: \"2433400c-98f8-490f-a566-00a330a738fe\") " pod="openshift-multus/multus-klx9r" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.287965 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cnibin\" (UniqueName: \"kubernetes.io/host-path/b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c-cnibin\") pod \"multus-additional-cni-plugins-bfcvh\" (UID: \"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\") " pod="openshift-multus/multus-additional-cni-plugins-bfcvh" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.287988 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-host-slash\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.288008 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-etc-openvswitch\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.288028 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2433400c-98f8-490f-a566-00a330a738fe-cnibin\") pod \"multus-klx9r\" (UID: \"2433400c-98f8-490f-a566-00a330a738fe\") " pod="openshift-multus/multus-klx9r" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.288052 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-run-ovn\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.288071 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-env-overrides\") pod 
\"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.288093 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkdcd\" (UniqueName: \"kubernetes.io/projected/2433400c-98f8-490f-a566-00a330a738fe-kube-api-access-gkdcd\") pod \"multus-klx9r\" (UID: \"2433400c-98f8-490f-a566-00a330a738fe\") " pod="openshift-multus/multus-klx9r" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.288120 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-systemd-units\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.288137 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c-cni-binary-copy\") pod \"multus-additional-cni-plugins-bfcvh\" (UID: \"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\") " pod="openshift-multus/multus-additional-cni-plugins-bfcvh" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.288141 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-ovnkube-script-lib\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.288191 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zqxk\" (UniqueName: \"kubernetes.io/projected/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-kube-api-access-4zqxk\") pod \"ovnkube-node-gpzdw\" (UID: 
\"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.288215 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/39c9bcf2-9580-4534-8c7e-886bd4aff469-mcd-auth-proxy-config\") pod \"machine-config-daemon-hw2kf\" (UID: \"39c9bcf2-9580-4534-8c7e-886bd4aff469\") " pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.288238 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2433400c-98f8-490f-a566-00a330a738fe-multus-daemon-config\") pod \"multus-klx9r\" (UID: \"2433400c-98f8-490f-a566-00a330a738fe\") " pod="openshift-multus/multus-klx9r" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.288275 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jwbv\" (UniqueName: \"kubernetes.io/projected/b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c-kube-api-access-4jwbv\") pod \"multus-additional-cni-plugins-bfcvh\" (UID: \"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\") " pod="openshift-multus/multus-additional-cni-plugins-bfcvh" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.288299 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-log-socket\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.288321 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-host-cni-netd\") pod \"ovnkube-node-gpzdw\" (UID: 
\"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.288341 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2433400c-98f8-490f-a566-00a330a738fe-os-release\") pod \"multus-klx9r\" (UID: \"2433400c-98f8-490f-a566-00a330a738fe\") " pod="openshift-multus/multus-klx9r" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.288369 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-node-log\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.288392 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2433400c-98f8-490f-a566-00a330a738fe-host-var-lib-kubelet\") pod \"multus-klx9r\" (UID: \"2433400c-98f8-490f-a566-00a330a738fe\") " pod="openshift-multus/multus-klx9r" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.288415 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/39c9bcf2-9580-4534-8c7e-886bd4aff469-rootfs\") pod \"machine-config-daemon-hw2kf\" (UID: \"39c9bcf2-9580-4534-8c7e-886bd4aff469\") " pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.288460 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-host-run-netns\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc 
kubenswrapper[4669]: I1008 20:45:04.288496 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-var-lib-openvswitch\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.288520 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2433400c-98f8-490f-a566-00a330a738fe-cni-binary-copy\") pod \"multus-klx9r\" (UID: \"2433400c-98f8-490f-a566-00a330a738fe\") " pod="openshift-multus/multus-klx9r" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.288572 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2433400c-98f8-490f-a566-00a330a738fe-host-run-k8s-cni-cncf-io\") pod \"multus-klx9r\" (UID: \"2433400c-98f8-490f-a566-00a330a738fe\") " pod="openshift-multus/multus-klx9r" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.288608 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c-os-release\") pod \"multus-additional-cni-plugins-bfcvh\" (UID: \"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\") " pod="openshift-multus/multus-additional-cni-plugins-bfcvh" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.288631 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-run-systemd\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.288659 4669 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-host-run-ovn-kubernetes\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.288673 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-host-slash\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.288842 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-ovnkube-script-lib\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.288858 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-run-ovn\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.288902 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-etc-openvswitch\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.288950 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/2433400c-98f8-490f-a566-00a330a738fe-cnibin\") pod \"multus-klx9r\" (UID: \"2433400c-98f8-490f-a566-00a330a738fe\") " pod="openshift-multus/multus-klx9r" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.288981 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/39c9bcf2-9580-4534-8c7e-886bd4aff469-rootfs\") pod \"machine-config-daemon-hw2kf\" (UID: \"39c9bcf2-9580-4534-8c7e-886bd4aff469\") " pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.289277 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-env-overrides\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.289476 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-systemd-units\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.289513 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/39c9bcf2-9580-4534-8c7e-886bd4aff469-mcd-auth-proxy-config\") pod \"machine-config-daemon-hw2kf\" (UID: \"39c9bcf2-9580-4534-8c7e-886bd4aff469\") " pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.289561 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2433400c-98f8-490f-a566-00a330a738fe-hostroot\") pod \"multus-klx9r\" (UID: 
\"2433400c-98f8-490f-a566-00a330a738fe\") " pod="openshift-multus/multus-klx9r" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.289577 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-host-run-netns\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.289587 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2433400c-98f8-490f-a566-00a330a738fe-multus-daemon-config\") pod \"multus-klx9r\" (UID: \"2433400c-98f8-490f-a566-00a330a738fe\") " pod="openshift-multus/multus-klx9r" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.289594 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2433400c-98f8-490f-a566-00a330a738fe-host-run-netns\") pod \"multus-klx9r\" (UID: \"2433400c-98f8-490f-a566-00a330a738fe\") " pod="openshift-multus/multus-klx9r" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.289606 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-var-lib-openvswitch\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.289664 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bfcvh\" (UID: \"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\") " pod="openshift-multus/multus-additional-cni-plugins-bfcvh" Oct 08 20:45:04 crc 
kubenswrapper[4669]: I1008 20:45:04.289759 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-log-socket\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.289783 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-host-cni-netd\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.289867 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2433400c-98f8-490f-a566-00a330a738fe-os-release\") pod \"multus-klx9r\" (UID: \"2433400c-98f8-490f-a566-00a330a738fe\") " pod="openshift-multus/multus-klx9r" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.289884 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c-os-release\") pod \"multus-additional-cni-plugins-bfcvh\" (UID: \"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\") " pod="openshift-multus/multus-additional-cni-plugins-bfcvh" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.289891 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-node-log\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.289915 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/2433400c-98f8-490f-a566-00a330a738fe-host-var-lib-kubelet\") pod \"multus-klx9r\" (UID: \"2433400c-98f8-490f-a566-00a330a738fe\") " pod="openshift-multus/multus-klx9r" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.289944 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2433400c-98f8-490f-a566-00a330a738fe-host-run-k8s-cni-cncf-io\") pod \"multus-klx9r\" (UID: \"2433400c-98f8-490f-a566-00a330a738fe\") " pod="openshift-multus/multus-klx9r" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.289948 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2433400c-98f8-490f-a566-00a330a738fe-host-run-multus-certs\") pod \"multus-klx9r\" (UID: \"2433400c-98f8-490f-a566-00a330a738fe\") " pod="openshift-multus/multus-klx9r" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.289968 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-run-systemd\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.288629 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bfcvh\" (UID: \"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\") " pod="openshift-multus/multus-additional-cni-plugins-bfcvh" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.290002 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c-cnibin\") pod 
\"multus-additional-cni-plugins-bfcvh\" (UID: \"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\") " pod="openshift-multus/multus-additional-cni-plugins-bfcvh" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.290020 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2433400c-98f8-490f-a566-00a330a738fe-cni-binary-copy\") pod \"multus-klx9r\" (UID: \"2433400c-98f8-490f-a566-00a330a738fe\") " pod="openshift-multus/multus-klx9r" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.290054 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-host-run-ovn-kubernetes\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.290143 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-ovnkube-config\") pod \"ovnkube-node-gpzdw\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.291033 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/39c9bcf2-9580-4534-8c7e-886bd4aff469-proxy-tls\") pod \"machine-config-daemon-hw2kf\" (UID: \"39c9bcf2-9580-4534-8c7e-886bd4aff469\") " pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.291201 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-ovn-node-metrics-cert\") pod \"ovnkube-node-gpzdw\" (UID: 
\"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.305516 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vwq5\" (UniqueName: \"kubernetes.io/projected/39c9bcf2-9580-4534-8c7e-886bd4aff469-kube-api-access-9vwq5\") pod \"machine-config-daemon-hw2kf\" (UID: \"39c9bcf2-9580-4534-8c7e-886bd4aff469\") " pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.309732 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0319f7-8ee3-4392-a36a-419161391db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7f52f3d22574d0a01cdfd7b7a40caf1a6cf201dc719e35f40eae85a071286f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":
\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3064f5dde5317ed6c1dba4ecdcf4da81c2451262d83e3e2826c6ebbfe1487ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13697e6470d481451982948653db44d08baa70466d010442534eaa249e58bae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volume
Mounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c88256a72c667695563af6b37d01d958621c1ca6cbdaf474364bd6c8128c4409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6efb0bccc51deff2303655e7a8d3a6261a8b3c9425f6d94120cd1acf27fd7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"i
nitContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.310105 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkdcd\" (UniqueName: \"kubernetes.io/projected/2433400c-98f8-490f-a566-00a330a738fe-kube-api-access-gkdcd\") pod \"multus-klx9r\" (UID: \"2433400c-98f8-490f-a566-00a330a738fe\") " pod="openshift-multus/multus-klx9r" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.310353 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zqxk\" (UniqueName: \"kubernetes.io/projected/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-kube-api-access-4zqxk\") pod \"ovnkube-node-gpzdw\" (UID: 
\"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.311027 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jwbv\" (UniqueName: \"kubernetes.io/projected/b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c-kube-api-access-4jwbv\") pod \"multus-additional-cni-plugins-bfcvh\" (UID: \"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\") " pod="openshift-multus/multus-additional-cni-plugins-bfcvh" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.323506 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.327578 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-bfcvh" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.330008 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:45:04 crc kubenswrapper[4669]: E1008 20:45:04.330184 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.330285 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:45:04 crc kubenswrapper[4669]: E1008 20:45:04.330405 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.330418 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:45:04 crc kubenswrapper[4669]: E1008 20:45:04.330595 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.337117 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c2b44fd8fb3c01bbc8a1b2f5a3507af28b2aa79a3d6ab8e7de3945bbfd01e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://
a472679f03ab86aa0a31a2ff3affe48d8e289a76db949bcc6ea10446fd08fdd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.340498 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" Oct 08 20:45:04 crc kubenswrapper[4669]: W1008 20:45:04.342370 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3d00f76_e9e7_4a09_9be0_7ad67d4e8c0c.slice/crio-b924e924e0235c96608a66552fa7b2c5cb860a2e1708d89fb9f951303969d44f WatchSource:0}: Error finding container b924e924e0235c96608a66552fa7b2c5cb860a2e1708d89fb9f951303969d44f: Status 404 returned error can't find the container with id b924e924e0235c96608a66552fa7b2c5cb860a2e1708d89fb9f951303969d44f Oct 08 20:45:04 crc kubenswrapper[4669]: W1008 20:45:04.351431 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39c9bcf2_9580_4534_8c7e_886bd4aff469.slice/crio-886b684f712b9c714eb639c746fd5da581dc9a91db262bbaaa4fe220d541981b WatchSource:0}: Error finding container 886b684f712b9c714eb639c746fd5da581dc9a91db262bbaaa4fe220d541981b: Status 404 returned error can't find the container with id 886b684f712b9c714eb639c746fd5da581dc9a91db262bbaaa4fe220d541981b Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.351561 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.354558 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-klx9r" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.361466 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.366836 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:04 crc kubenswrapper[4669]: W1008 20:45:04.391958 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2433400c_98f8_490f_a566_00a330a738fe.slice/crio-e5869c0f776bf06b64d11043f168d491bde3de03c41346cbcf10b611dcea2553 WatchSource:0}: Error finding container e5869c0f776bf06b64d11043f168d491bde3de03c41346cbcf10b611dcea2553: Status 404 returned error can't find the container with id e5869c0f776bf06b64d11043f168d491bde3de03c41346cbcf10b611dcea2553 Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.392653 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zcf2d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a016bee1-2c29-46bb-b3b8-841c4a65e162\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flsl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zcf2d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.413857 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c9bcf2-9580-4534-8c7e-886bd4aff469\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw2kf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.472700 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zcf2d" event={"ID":"a016bee1-2c29-46bb-b3b8-841c4a65e162","Type":"ContainerStarted","Data":"0fa5b9befc8fb3a83cb6dd6097014bfe9fd0b905b4bf8fbdcccd4fdfb62ab410"} Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.472757 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zcf2d" event={"ID":"a016bee1-2c29-46bb-b3b8-841c4a65e162","Type":"ContainerStarted","Data":"12697f65b06abf476aa3b146ade1cb7e0555cc6956091fdbbe5350a3fd557ac8"} Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.474155 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" event={"ID":"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7","Type":"ContainerStarted","Data":"e9250f8f75d073de984775a81aaafd543292b62ba50216285cd6f0ae77ca9b8b"} Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.476013 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-klx9r" event={"ID":"2433400c-98f8-490f-a566-00a330a738fe","Type":"ContainerStarted","Data":"e5869c0f776bf06b64d11043f168d491bde3de03c41346cbcf10b611dcea2553"} Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.484326 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.487376 4669 scope.go:117] "RemoveContainer" containerID="8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec" Oct 08 20:45:04 crc kubenswrapper[4669]: E1008 20:45:04.487558 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.488588 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" event={"ID":"39c9bcf2-9580-4534-8c7e-886bd4aff469","Type":"ContainerStarted","Data":"886b684f712b9c714eb639c746fd5da581dc9a91db262bbaaa4fe220d541981b"} Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.493512 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bfcvh" event={"ID":"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c","Type":"ContainerStarted","Data":"b924e924e0235c96608a66552fa7b2c5cb860a2e1708d89fb9f951303969d44f"} Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.497790 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d080d327-7e4d-41af-aa15-0ce849523815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8127834da98ef46a594a74cbfcc6ef779b8429046327546560b7b37085572c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f61c1793e6c95085b6964298f29b5f896451784046a6aee1c73bbda234a3bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd3bc937fc2e56c3d332e4d3822a2749d040c57cd94f6e3bcdcfd83c126bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fef8d760c2a5682066865c2449fc9e7672fb796f76a1d8be961753e6d380b92a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T20:44:56Z\\\",\\\"message\\\":\\\"W1008 20:44:45.245390 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 20:44:45.246561 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759956285 cert, and key in /tmp/serving-cert-3157989699/serving-signer.crt, 
/tmp/serving-cert-3157989699/serving-signer.key\\\\nI1008 20:44:45.581654 1 observer_polling.go:159] Starting file observer\\\\nW1008 20:44:45.584661 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1008 20:44:45.585560 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 20:44:45.588793 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3157989699/tls.crt::/tmp/serving-cert-3157989699/tls.key\\\\\\\"\\\\nF1008 20:44:56.015244 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"W1008 20:44:56.871605 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 20:44:56.872089 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759956296 cert, and key in /tmp/serving-cert-3424828285/serving-signer.crt, /tmp/serving-cert-3424828285/serving-signer.key\\\\nI1008 20:44:57.365674 1 observer_polling.go:159] Starting file observer\\\\nW1008 20:45:02.381062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1008 20:45:02.381192 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 20:45:02.381876 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3424828285/tls.crt::/tmp/serving-cert-3424828285/tls.key\\\\\\\"\\\\nI1008 20:45:02.718633 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 20:45:02.726325 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 20:45:02.726358 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 20:45:02.726380 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 20:45:02.726384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 20:45:02.731456 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 20:45:02.731985 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 20:45:02.731867 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 20:45:02.733228 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d213380e32b3db218facfef313963d26689d2f0871d2a004a63380454fac8a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.517886 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c0e064d976a7c307fd13ec11ae76672cc1225b71a616f171626ee1f9a24531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.539597 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpzdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.556198 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b822af4b-b157-4b05-9af4-7798315f365f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d615b49ade5de43393d40344c1b71733acedb541841b3ec34d6dd293e62f96c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b08
4652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07c33fe9c40fb9b53e940940c3fe2b8b63a94b0f867aa804d215cb3ba90d01c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569ba2e70b947eea1e531ab7e8f1ac2e3441ade593dd48910407df766217d87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":
{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f2d8af11793121a84b4559833f410bd59a8bb122d88da0d3b55d7dcbbf57a9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.573187 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.591566 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bfcvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bfcvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.637319 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0319f7-8ee3-4392-a36a-419161391db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7f52f3d22574d0a01cdfd7b7a40caf1a6cf201dc719e35f40eae85a071286f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3064f5dde5317ed6c1dba4ecdcf4da81c2451262d83e3e2826c6ebbfe1487ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13697e6470d481451982948653db44d08baa70466d010442534eaa249e58bae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c88256a72c667695563af6b37d01d958621c1ca6cbdaf474364bd6c8128c4409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6efb0bccc51deff2303655e7a8d3a6261a8b3c9425f6d94120cd1acf27fd7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.670369 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.710990 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c2b44fd8fb3c01bbc8a1b2f5a3507af28b2aa79a3d6ab8e7de3945bbfd01e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a472679f03ab86aa0a31a2ff3affe48d8e289a76db949bcc6ea10446fd08fdd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.750171 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-klx9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2433400c-98f8-490f-a566-00a330a738fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-klx9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.791867 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.828206 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.870338 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zcf2d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a016bee1-2c29-46bb-b3b8-841c4a65e162\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fa5b9befc8fb3a83cb6dd6097014bfe9fd0b905b4bf8fbdcccd4fdfb62ab410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flsl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zcf2d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.909626 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c9bcf2-9580-4534-8c7e-886bd4aff469\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw2kf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.972961 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:04 crc kubenswrapper[4669]: I1008 20:45:04.994610 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:05 crc kubenswrapper[4669]: I1008 20:45:05.028247 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zcf2d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a016bee1-2c29-46bb-b3b8-841c4a65e162\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fa5b9befc8fb3a83cb6dd6097014bfe9fd0b905b4bf8fbdcccd4fdfb62ab410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flsl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zcf2d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:05Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:05 crc kubenswrapper[4669]: I1008 20:45:05.069709 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c9bcf2-9580-4534-8c7e-886bd4aff469\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw2kf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:05Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:05 crc kubenswrapper[4669]: I1008 20:45:05.111106 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d080d327-7e4d-41af-aa15-0ce849523815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8127834da98ef46a594a74cbfcc6ef779b8429046327546560b7b37085572c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f61c1793e6c95085b6964298f29b5f896451784046a6aee1c73bbda234a3bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e76fd3bc937fc2e56c3d332e4d3822a2749d040c57cd94f6e3bcdcfd83c126bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"W1008 20:44:56.871605 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 20:44:56.872089 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759956296 cert, and key in /tmp/serving-cert-3424828285/serving-signer.crt, /tmp/serving-cert-3424828285/serving-signer.key\\\\nI1008 20:44:57.365674 1 observer_polling.go:159] Starting file observer\\\\nW1008 20:45:02.381062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1008 
20:45:02.381192 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 20:45:02.381876 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3424828285/tls.crt::/tmp/serving-cert-3424828285/tls.key\\\\\\\"\\\\nI1008 20:45:02.718633 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 20:45:02.726325 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 20:45:02.726358 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 20:45:02.726380 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 20:45:02.726384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 20:45:02.731456 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 20:45:02.731985 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 20:45:02.731867 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 20:45:02.733228 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d213380e32b3db218facfef313963d26689d2f0871d2a004a63380454fac8a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:05Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:05 crc kubenswrapper[4669]: I1008 20:45:05.149585 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c0e064d976a7c307fd13ec11ae76672cc1225b71a616f171626ee1f9a24531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\
"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:05Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:05 crc kubenswrapper[4669]: I1008 20:45:05.194570 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpzdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:05Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:05 crc kubenswrapper[4669]: I1008 20:45:05.230497 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b822af4b-b157-4b05-9af4-7798315f365f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d615b49ade5de43393d40344c1b71733acedb541841b3ec34d6dd293e62f96c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372
b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07c33fe9c40fb9b53e940940c3fe2b8b63a94b0f867aa804d215cb3ba90d01c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569ba2e70b947eea1e531ab7e8f1ac2e3441ade593dd48910407df766217d87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\"
:true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f2d8af11793121a84b4559833f410bd59a8bb122d88da0d3b55d7dcbbf57a9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:05Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:05 crc kubenswrapper[4669]: I1008 20:45:05.270248 4669 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:05Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:05 crc kubenswrapper[4669]: I1008 20:45:05.309384 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bfcvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bfcvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:05Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:05 crc kubenswrapper[4669]: I1008 20:45:05.353627 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0319f7-8ee3-4392-a36a-419161391db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7f52f3d22574d0a01cdfd7b7a40caf1a6cf201dc719e35f40eae85a071286f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3064f5dde5317ed6c1dba4ecdcf4da81c2451262d83e3e2826c6ebbfe1487ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13697e6470d481451982948653db44d08baa70466d010442534eaa249e58bae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c88256a72c667695563af6b37d01d958621c1ca6cbdaf474364bd6c8128c4409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6efb0bccc51deff2303655e7a8d3a6261a8b3c9425f6d94120cd1acf27fd7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:05Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:05 crc kubenswrapper[4669]: I1008 20:45:05.389574 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:05Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:05 crc kubenswrapper[4669]: I1008 20:45:05.428653 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c2b44fd8fb3c01bbc8a1b2f5a3507af28b2aa79a3d6ab8e7de3945bbfd01e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a472679f03ab86aa0a31a2ff3affe48d8e289a76db949bcc6ea10446fd08fdd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:05Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:05 crc kubenswrapper[4669]: I1008 20:45:05.467941 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-klx9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2433400c-98f8-490f-a566-00a330a738fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-klx9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:05Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:05 crc kubenswrapper[4669]: I1008 20:45:05.497717 4669 generic.go:334] "Generic (PLEG): container finished" podID="b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c" containerID="bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600" exitCode=0 Oct 08 20:45:05 crc kubenswrapper[4669]: I1008 20:45:05.497866 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bfcvh" event={"ID":"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c","Type":"ContainerDied","Data":"bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600"} Oct 08 20:45:05 crc kubenswrapper[4669]: I1008 20:45:05.499160 4669 generic.go:334] "Generic (PLEG): container finished" podID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerID="714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793" exitCode=0 Oct 08 20:45:05 crc kubenswrapper[4669]: I1008 20:45:05.499190 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" event={"ID":"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7","Type":"ContainerDied","Data":"714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793"} Oct 08 20:45:05 crc kubenswrapper[4669]: I1008 20:45:05.501404 4669 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" event={"ID":"39c9bcf2-9580-4534-8c7e-886bd4aff469","Type":"ContainerStarted","Data":"9d8b81cfea1e9e0c9b30427e8b8cb07b463c6ef45afb8379aa006d71bccd82a9"} Oct 08 20:45:05 crc kubenswrapper[4669]: I1008 20:45:05.501510 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" event={"ID":"39c9bcf2-9580-4534-8c7e-886bd4aff469","Type":"ContainerStarted","Data":"9e1bd09b1fcc78173d03292522a284e68e59f374def13fd6830f24a31e1138c5"} Oct 08 20:45:05 crc kubenswrapper[4669]: I1008 20:45:05.502956 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-klx9r" event={"ID":"2433400c-98f8-490f-a566-00a330a738fe","Type":"ContainerStarted","Data":"863b0630ebde7534e93ebf2952dab729566760278539e87efa4412389803c5ee"} Oct 08 20:45:05 crc kubenswrapper[4669]: I1008 20:45:05.504573 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"2c397b74921593a42fb7626e545778d80c506f0bbce7bc425b75c77a222c770e"} Oct 08 20:45:05 crc kubenswrapper[4669]: I1008 20:45:05.523552 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d080d327-7e4d-41af-aa15-0ce849523815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8127834da98ef46a594a74cbfcc6ef779b8429046327546560b7b37085572c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f61c1793e6c95085b6964298f29b5f896451784046a6aee1c73bbda234a3bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd3bc937fc2e56c3d332e4d3822a2749d040c57cd94f6e3bcdcfd83c126bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"W1008 20:44:56.871605 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 20:44:56.872089 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759956296 cert, and key in /tmp/serving-cert-3424828285/serving-signer.crt, 
/tmp/serving-cert-3424828285/serving-signer.key\\\\nI1008 20:44:57.365674 1 observer_polling.go:159] Starting file observer\\\\nW1008 20:45:02.381062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1008 20:45:02.381192 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 20:45:02.381876 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3424828285/tls.crt::/tmp/serving-cert-3424828285/tls.key\\\\\\\"\\\\nI1008 20:45:02.718633 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 20:45:02.726325 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 20:45:02.726358 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 20:45:02.726380 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 20:45:02.726384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 20:45:02.731456 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 20:45:02.731985 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 20:45:02.731867 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 20:45:02.733228 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d213380e32b3db218facfef313963d26689d2f0871d2a004a63380454fac8a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:05Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:05 crc kubenswrapper[4669]: I1008 20:45:05.548115 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c0e064d976a7c307fd13ec11ae76672cc1225b71a616f171626ee1f9a24531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\
"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:05Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:05 crc kubenswrapper[4669]: I1008 20:45:05.595373 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpzdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:05Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:05 crc kubenswrapper[4669]: I1008 20:45:05.629509 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b822af4b-b157-4b05-9af4-7798315f365f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d615b49ade5de43393d40344c1b71733acedb541841b3ec34d6dd293e62f96c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372
b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07c33fe9c40fb9b53e940940c3fe2b8b63a94b0f867aa804d215cb3ba90d01c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569ba2e70b947eea1e531ab7e8f1ac2e3441ade593dd48910407df766217d87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\"
:true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f2d8af11793121a84b4559833f410bd59a8bb122d88da0d3b55d7dcbbf57a9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:05Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:05 crc kubenswrapper[4669]: I1008 20:45:05.667327 4669 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:05Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:05 crc kubenswrapper[4669]: I1008 20:45:05.711255 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bfcvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bfcvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:05Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:05 crc kubenswrapper[4669]: I1008 20:45:05.722081 4669 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 20:45:05 crc kubenswrapper[4669]: I1008 20:45:05.722621 4669 scope.go:117] "RemoveContainer" containerID="8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec" 
Oct 08 20:45:05 crc kubenswrapper[4669]: E1008 20:45:05.722851 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 08 20:45:05 crc kubenswrapper[4669]: I1008 20:45:05.753129 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0319f7-8ee3-4392-a36a-419161391db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7f52f3d22574d0a01cdfd7b7a40caf1a6cf201dc719e35f40eae85a071286f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCo
unt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3064f5dde5317ed6c1dba4ecdcf4da81c2451262d83e3e2826c6ebbfe1487ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13697e6470d481451982948653db44d08baa70466d010442534eaa249e58bae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/e
tcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c88256a72c667695563af6b37d01d958621c1ca6cbdaf474364bd6c8128c4409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6efb0bccc51deff2303655e7a8d3a6261a8b3c9425f6d94120cd1acf27fd7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68
e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:05Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:05 crc kubenswrapper[4669]: I1008 20:45:05.789502 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:05Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:05 crc kubenswrapper[4669]: I1008 20:45:05.829927 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c2b44fd8fb3c01bbc8a1b2f5a3507af28b2aa79a3d6ab8e7de3945bbfd01e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a472679f03ab86aa0a31a2ff3affe48d8e289a76db949bcc6ea10446fd08fdd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:05Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:05 crc kubenswrapper[4669]: I1008 20:45:05.856074 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-flswm"] Oct 08 20:45:05 crc kubenswrapper[4669]: I1008 20:45:05.856437 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-flswm" Oct 08 20:45:05 crc kubenswrapper[4669]: I1008 20:45:05.869465 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-klx9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2433400c-98f8-490f-a566-00a330a738fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-klx9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:05Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:05 crc kubenswrapper[4669]: I1008 20:45:05.880713 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 08 20:45:05 crc kubenswrapper[4669]: I1008 20:45:05.899510 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 08 20:45:05 crc kubenswrapper[4669]: I1008 20:45:05.902089 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:45:05 crc kubenswrapper[4669]: E1008 20:45:05.902176 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:45:09.90215769 +0000 UTC m=+29.594968363 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:45:05 crc kubenswrapper[4669]: I1008 20:45:05.919672 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 08 20:45:05 crc kubenswrapper[4669]: I1008 20:45:05.939787 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 08 20:45:05 crc kubenswrapper[4669]: I1008 20:45:05.992853 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:05Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:06 crc kubenswrapper[4669]: I1008 20:45:06.003405 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/609156f9-39b1-4330-83a2-eabf82f4228f-serviceca\") pod \"node-ca-flswm\" (UID: \"609156f9-39b1-4330-83a2-eabf82f4228f\") " pod="openshift-image-registry/node-ca-flswm" Oct 08 20:45:06 crc kubenswrapper[4669]: I1008 20:45:06.003480 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" 
(UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:45:06 crc kubenswrapper[4669]: I1008 20:45:06.003545 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtz47\" (UniqueName: \"kubernetes.io/projected/609156f9-39b1-4330-83a2-eabf82f4228f-kube-api-access-rtz47\") pod \"node-ca-flswm\" (UID: \"609156f9-39b1-4330-83a2-eabf82f4228f\") " pod="openshift-image-registry/node-ca-flswm" Oct 08 20:45:06 crc kubenswrapper[4669]: I1008 20:45:06.003583 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/609156f9-39b1-4330-83a2-eabf82f4228f-host\") pod \"node-ca-flswm\" (UID: \"609156f9-39b1-4330-83a2-eabf82f4228f\") " pod="openshift-image-registry/node-ca-flswm" Oct 08 20:45:06 crc kubenswrapper[4669]: I1008 20:45:06.003624 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:45:06 crc kubenswrapper[4669]: E1008 20:45:06.003671 4669 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 20:45:06 crc kubenswrapper[4669]: I1008 20:45:06.003702 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:45:06 crc kubenswrapper[4669]: E1008 20:45:06.003750 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 20:45:10.003728495 +0000 UTC m=+29.696539268 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 20:45:06 crc kubenswrapper[4669]: E1008 20:45:06.003798 4669 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 20:45:06 crc kubenswrapper[4669]: E1008 20:45:06.003823 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 20:45:06 crc kubenswrapper[4669]: E1008 20:45:06.003844 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 20:45:06 crc kubenswrapper[4669]: I1008 20:45:06.003777 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod 
\"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:45:06 crc kubenswrapper[4669]: E1008 20:45:06.003856 4669 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 20:45:06 crc kubenswrapper[4669]: E1008 20:45:06.003873 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 20:45:10.003852279 +0000 UTC m=+29.696663012 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 20:45:06 crc kubenswrapper[4669]: E1008 20:45:06.003948 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-08 20:45:10.003937631 +0000 UTC m=+29.696748294 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 20:45:06 crc kubenswrapper[4669]: E1008 20:45:06.003961 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 20:45:06 crc kubenswrapper[4669]: E1008 20:45:06.003985 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 20:45:06 crc kubenswrapper[4669]: E1008 20:45:06.004001 4669 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 20:45:06 crc kubenswrapper[4669]: E1008 20:45:06.004059 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-08 20:45:10.004041974 +0000 UTC m=+29.696852717 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 20:45:06 crc kubenswrapper[4669]: I1008 20:45:06.033977 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:06Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:06 crc kubenswrapper[4669]: I1008 20:45:06.070396 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zcf2d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a016bee1-2c29-46bb-b3b8-841c4a65e162\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fa5b9befc8fb3a83cb6dd6097014bfe9fd0b905b4bf8fbdcccd4fdfb62ab410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flsl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zcf2d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:06Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:06 crc kubenswrapper[4669]: I1008 20:45:06.104730 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtz47\" (UniqueName: \"kubernetes.io/projected/609156f9-39b1-4330-83a2-eabf82f4228f-kube-api-access-rtz47\") pod \"node-ca-flswm\" (UID: \"609156f9-39b1-4330-83a2-eabf82f4228f\") " pod="openshift-image-registry/node-ca-flswm" Oct 08 20:45:06 crc kubenswrapper[4669]: I1008 20:45:06.104767 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/609156f9-39b1-4330-83a2-eabf82f4228f-host\") pod \"node-ca-flswm\" (UID: \"609156f9-39b1-4330-83a2-eabf82f4228f\") " pod="openshift-image-registry/node-ca-flswm" Oct 08 20:45:06 crc kubenswrapper[4669]: I1008 20:45:06.104808 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/609156f9-39b1-4330-83a2-eabf82f4228f-serviceca\") pod \"node-ca-flswm\" (UID: \"609156f9-39b1-4330-83a2-eabf82f4228f\") " pod="openshift-image-registry/node-ca-flswm" Oct 08 20:45:06 crc kubenswrapper[4669]: I1008 20:45:06.104956 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/609156f9-39b1-4330-83a2-eabf82f4228f-host\") pod \"node-ca-flswm\" (UID: \"609156f9-39b1-4330-83a2-eabf82f4228f\") " pod="openshift-image-registry/node-ca-flswm" Oct 08 20:45:06 crc kubenswrapper[4669]: I1008 20:45:06.105707 4669 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/609156f9-39b1-4330-83a2-eabf82f4228f-serviceca\") pod \"node-ca-flswm\" (UID: \"609156f9-39b1-4330-83a2-eabf82f4228f\") " pod="openshift-image-registry/node-ca-flswm" Oct 08 20:45:06 crc kubenswrapper[4669]: I1008 20:45:06.110133 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c9bcf2-9580-4534-8c7e-886bd4aff469\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw2kf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:06Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:06 crc kubenswrapper[4669]: I1008 20:45:06.140016 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtz47\" (UniqueName: \"kubernetes.io/projected/609156f9-39b1-4330-83a2-eabf82f4228f-kube-api-access-rtz47\") pod \"node-ca-flswm\" (UID: \"609156f9-39b1-4330-83a2-eabf82f4228f\") " pod="openshift-image-registry/node-ca-flswm" Oct 08 20:45:06 crc kubenswrapper[4669]: I1008 20:45:06.171295 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-flswm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"609156f9-39b1-4330-83a2-eabf82f4228f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-flswm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:06Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:06 crc kubenswrapper[4669]: I1008 20:45:06.207587 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b822af4b-b157-4b05-9af4-7798315f365f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d615b49ade5de43393d40344c1b71733acedb541841b3ec34d6dd293e62f96c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07c33fe9c40fb9b53e940940c3fe2b8b63a94b0f867aa804d215cb3ba90d01c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569ba2e70b947eea1e531ab7e8f1ac2e3441ade593dd48910407df766217d87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f2d8af11793121a84b4559833f410bd59a8bb122d88da0d3b55d7dcbbf57a9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:06Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:06 crc kubenswrapper[4669]: I1008 20:45:06.249620 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:06Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:06 crc kubenswrapper[4669]: I1008 20:45:06.288847 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bfcvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bfcvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:06Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:06 crc kubenswrapper[4669]: I1008 20:45:06.294835 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-flswm" Oct 08 20:45:06 crc kubenswrapper[4669]: W1008 20:45:06.305206 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod609156f9_39b1_4330_83a2_eabf82f4228f.slice/crio-68382cb29ea021bd13d1b2a6228345411944ddca5b609b93224c65adf992ad4d WatchSource:0}: Error finding container 68382cb29ea021bd13d1b2a6228345411944ddca5b609b93224c65adf992ad4d: Status 404 returned error can't find the container with id 68382cb29ea021bd13d1b2a6228345411944ddca5b609b93224c65adf992ad4d Oct 08 20:45:06 crc kubenswrapper[4669]: I1008 20:45:06.329756 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-klx9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2433400c-98f8-490f-a566-00a330a738fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863b0630ebde7534e93ebf2952dab729566760278539e87efa4412389803c5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b2
6\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.1
68.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-klx9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:06Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:06 crc kubenswrapper[4669]: I1008 20:45:06.329865 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:45:06 crc kubenswrapper[4669]: I1008 20:45:06.329913 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:45:06 crc kubenswrapper[4669]: E1008 20:45:06.330200 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:45:06 crc kubenswrapper[4669]: E1008 20:45:06.330236 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:45:06 crc kubenswrapper[4669]: I1008 20:45:06.330009 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:45:06 crc kubenswrapper[4669]: E1008 20:45:06.330466 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:45:06 crc kubenswrapper[4669]: I1008 20:45:06.386492 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0319f7-8ee3-4392-a36a-419161391db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7f52f3d22574d0a01cdfd7b7a40caf1a6cf201dc719e35f40eae85a071286f0\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3064f5dde5317ed6c1dba4ecdcf4da81c2451262d83e3e2826c6ebbfe1487ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13697e6470d481451982948653db44d08baa70466d010442534eaa249e58bae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c88256a72c667695563af6b37d01d958621c1ca6cbdaf474364bd6c8128c4409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6efb0bccc51deff2303655e7a8d3a6261a8b3c9425f6d94120cd1acf27fd7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resource
s\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z
\\\"}}},{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:06Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:06 crc kubenswrapper[4669]: I1008 20:45:06.433202 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:06Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:06 crc kubenswrapper[4669]: I1008 20:45:06.476781 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c2b44fd8fb3c01bbc8a1b2f5a3507af28b2aa79a3d6ab8e7de3945bbfd01e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a472679f03ab86aa0a31a2ff3affe48d8e289a76db949bcc6ea10446fd08fdd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:06Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:06 crc kubenswrapper[4669]: I1008 20:45:06.489224 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:06Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:06 crc kubenswrapper[4669]: I1008 20:45:06.511291 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" event={"ID":"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7","Type":"ContainerStarted","Data":"c03e0c827468d80fa326ee46ee88ad6adfe4236f4df9843324d2b247d0716087"} Oct 08 20:45:06 crc kubenswrapper[4669]: I1008 
20:45:06.511339 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" event={"ID":"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7","Type":"ContainerStarted","Data":"334a09deac921308c4d6053bdcc2bbc096acc8ec24875190efb1c07b22d01c69"} Oct 08 20:45:06 crc kubenswrapper[4669]: I1008 20:45:06.511352 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" event={"ID":"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7","Type":"ContainerStarted","Data":"408dd840918000b1689c3d828a51173deebf8d00fc97450975b35e5149d3cfc7"} Oct 08 20:45:06 crc kubenswrapper[4669]: I1008 20:45:06.511363 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" event={"ID":"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7","Type":"ContainerStarted","Data":"92bc23ad705dcc8b8524159bc37254ce2306e7b502b914eaac7a6525fdd44f52"} Oct 08 20:45:06 crc kubenswrapper[4669]: I1008 20:45:06.511374 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" event={"ID":"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7","Type":"ContainerStarted","Data":"ed9a574189bcc7f84b93c5e821e944b0f94679084a30270d6634c7d19e67c470"} Oct 08 20:45:06 crc kubenswrapper[4669]: I1008 20:45:06.514050 4669 generic.go:334] "Generic (PLEG): container finished" podID="b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c" containerID="a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2" exitCode=0 Oct 08 20:45:06 crc kubenswrapper[4669]: I1008 20:45:06.514113 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bfcvh" event={"ID":"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c","Type":"ContainerDied","Data":"a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2"} Oct 08 20:45:06 crc kubenswrapper[4669]: I1008 20:45:06.515139 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-flswm" 
event={"ID":"609156f9-39b1-4330-83a2-eabf82f4228f","Type":"ContainerStarted","Data":"68382cb29ea021bd13d1b2a6228345411944ddca5b609b93224c65adf992ad4d"} Oct 08 20:45:06 crc kubenswrapper[4669]: I1008 20:45:06.526570 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c397b74921593a42fb7626e545778d80c506f0bbce7bc425b75c77a222c770e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:06Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:06 crc kubenswrapper[4669]: I1008 20:45:06.566829 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zcf2d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a016bee1-2c29-46bb-b3b8-841c4a65e162\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fa5b9befc8fb3a83cb6dd6097014bfe9fd0b905b4bf8fbdcccd4fdfb62ab410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flsl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zcf2d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:06Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:06 crc kubenswrapper[4669]: I1008 20:45:06.612143 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c9bcf2-9580-4534-8c7e-886bd4aff469\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d8b81cfea1e9e0c9b30427e8b8cb07b463c6ef45afb8379aa006d71bccd82a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bd09b1fcc78173d03292522a284e68e59f374
def13fd6830f24a31e1138c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw2kf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:06Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:06 crc kubenswrapper[4669]: I1008 20:45:06.651073 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d080d327-7e4d-41af-aa15-0ce849523815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8127834da98ef46a594a74cbfcc6ef779b8429046327546560b7b37085572c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f61c1793e6c95085b6964298f29b5f896451784046a6aee1c73bbda234a3bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd3bc937fc2e56c3d332e4d3822a2749d040c57cd94f6e3bcdcfd83c126bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"W1008 20:44:56.871605 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 20:44:56.872089 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759956296 cert, and key in /tmp/serving-cert-3424828285/serving-signer.crt, /tmp/serving-cert-3424828285/serving-signer.key\\\\nI1008 20:44:57.365674 1 observer_polling.go:159] Starting file observer\\\\nW1008 20:45:02.381062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1008 20:45:02.381192 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 20:45:02.381876 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3424828285/tls.crt::/tmp/serving-cert-3424828285/tls.key\\\\\\\"\\\\nI1008 20:45:02.718633 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 20:45:02.726325 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 20:45:02.726358 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 20:45:02.726380 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 20:45:02.726384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 20:45:02.731456 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 20:45:02.731985 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 20:45:02.731867 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 20:45:02.733228 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d213380e32b3db218facfef313963d26689d2f0871d2a004a63380454fac8a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:06Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:06 crc kubenswrapper[4669]: I1008 20:45:06.690520 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c0e064d976a7c307fd13ec11ae76672cc1225b71a616f171626ee1f9a24531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:06Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:06 crc kubenswrapper[4669]: I1008 20:45:06.735010 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpzdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:06Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:06 crc kubenswrapper[4669]: I1008 20:45:06.770604 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c2b44fd8fb3c01bbc8a1b2f5a3507af28b2aa79a3d6ab8e7de3945bbfd01e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a472679f03ab86aa0a31a2ff3affe48d8e289a76db949bcc6ea10446fd08fdd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:06Z is after 2025-08-24T17:21:41Z" Oct 08 
20:45:06 crc kubenswrapper[4669]: I1008 20:45:06.810581 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-klx9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2433400c-98f8-490f-a566-00a330a738fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863b0630ebde7534e93ebf2952dab729566760278539e87efa4412389803c5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"
mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-klx9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:06Z is after 2025-08-24T17:21:41Z" Oct 08 
20:45:06 crc kubenswrapper[4669]: I1008 20:45:06.854144 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0319f7-8ee3-4392-a36a-419161391db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7f52f3d22574d0a01cdfd7b7a40caf1a6cf201dc719e35f40eae85a071286f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"
data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3064f5dde5317ed6c1dba4ecdcf4da81c2451262d83e3e2826c6ebbfe1487ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13697e6470d481451982948653db44d08baa70466d010442534eaa249e58bae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c88256a72c667695563af6b37d01d958621c1ca6cbdaf474364bd6c8128c4409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26
702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6efb0bccc51deff2303655e7a8d3a6261a8b3c9425f6d94120cd1acf27fd7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\
\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:06Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:06 crc kubenswrapper[4669]: I1008 20:45:06.889028 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:06Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:06 crc kubenswrapper[4669]: I1008 20:45:06.926764 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c9bcf2-9580-4534-8c7e-886bd4aff469\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d8b81cfea1e9e0c9b30427e8b8cb07b463c6ef45afb8379aa006d71bccd82a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bd09b1fcc78173d03292522a284e68e59f374
def13fd6830f24a31e1138c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw2kf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:06Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:06 crc kubenswrapper[4669]: I1008 20:45:06.969276 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:06Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:07 crc kubenswrapper[4669]: I1008 20:45:07.009245 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c397b74921593a42fb7626e545778d80c506f0bbce7bc425b75c77a222c770e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T20:45:07Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:07 crc kubenswrapper[4669]: I1008 20:45:07.047408 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zcf2d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a016bee1-2c29-46bb-b3b8-841c4a65e162\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fa5b9befc8fb3a83cb6dd6097014bfe9fd0b905b4bf8fbdcccd4fdfb62ab410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-flsl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zcf2d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:07Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:07 crc kubenswrapper[4669]: I1008 20:45:07.091713 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d080d327-7e4d-41af-aa15-0ce849523815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8127834da98ef46a594a74cbfcc6ef779b8429046327546560b7b37085572c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f61c1793e6c95085b6964298f29b5f896451784046a6aee1c73bbda234a3bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e76fd3bc937fc2e56c3d332e4d3822a2749d040c57cd94f6e3bcdcfd83c126bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"W1008 20:44:56.871605 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 20:44:56.872089 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759956296 cert, and key in /tmp/serving-cert-3424828285/serving-signer.crt, /tmp/serving-cert-3424828285/serving-signer.key\\\\nI1008 20:44:57.365674 1 observer_polling.go:159] Starting file observer\\\\nW1008 20:45:02.381062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1008 
20:45:02.381192 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 20:45:02.381876 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3424828285/tls.crt::/tmp/serving-cert-3424828285/tls.key\\\\\\\"\\\\nI1008 20:45:02.718633 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 20:45:02.726325 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 20:45:02.726358 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 20:45:02.726380 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 20:45:02.726384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 20:45:02.731456 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 20:45:02.731985 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 20:45:02.731867 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 20:45:02.733228 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d213380e32b3db218facfef313963d26689d2f0871d2a004a63380454fac8a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:07Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:07 crc kubenswrapper[4669]: I1008 20:45:07.127946 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c0e064d976a7c307fd13ec11ae76672cc1225b71a616f171626ee1f9a24531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\
"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:07Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:07 crc kubenswrapper[4669]: I1008 20:45:07.171029 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpzdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:07Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:07 crc kubenswrapper[4669]: I1008 20:45:07.210253 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bfcvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bfcvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-08T20:45:07Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:07 crc kubenswrapper[4669]: I1008 20:45:07.246901 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-flswm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"609156f9-39b1-4330-83a2-eabf82f4228f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-flswm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:07Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:07 crc kubenswrapper[4669]: I1008 20:45:07.287157 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b822af4b-b157-4b05-9af4-7798315f365f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d615b49ade5de43393d40344c1b71733acedb541841b3ec34d6dd293e62f96c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07c33fe9c40fb9b53e940940c3fe2b8b63a94b0f867aa804d215cb3ba90d01c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569ba2e70b947eea1e531ab7e8f1ac2e3441ade593dd48910407df766217d87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f2d8af11793121a84b4559833f410bd59a8bb122d88da0d3b55d7dcbbf57a9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:07Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:07 crc kubenswrapper[4669]: I1008 20:45:07.326686 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:07Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:07 crc kubenswrapper[4669]: I1008 20:45:07.520242 4669 generic.go:334] "Generic (PLEG): container finished" podID="b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c" containerID="6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9" exitCode=0 Oct 08 20:45:07 crc kubenswrapper[4669]: I1008 20:45:07.520296 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-bfcvh" event={"ID":"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c","Type":"ContainerDied","Data":"6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9"} Oct 08 20:45:07 crc kubenswrapper[4669]: I1008 20:45:07.523968 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" event={"ID":"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7","Type":"ContainerStarted","Data":"b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9"} Oct 08 20:45:07 crc kubenswrapper[4669]: I1008 20:45:07.525483 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-flswm" event={"ID":"609156f9-39b1-4330-83a2-eabf82f4228f","Type":"ContainerStarted","Data":"749545f8f4b6269a70b747fee79dc8d419b62054f507b0d819b63aa68c44bb52"} Oct 08 20:45:07 crc kubenswrapper[4669]: I1008 20:45:07.541078 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0319f7-8ee3-4392-a36a-419161391db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7f52f3d22574d0a01cdfd7b7a40caf1a6cf201dc719e35f40eae85a071286f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3064f5dde5317ed6c1dba4ecdcf4da81c2451262d83e3e2826c6ebbfe1487ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13697e6470d481451982948653db44d08baa70466d010442534eaa249e58bae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c88256a72c667695563af6b37d01d958621c1ca6cbdaf474364bd6c8128c4409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6efb0bccc51deff2303655e7a8d3a6261a8b3c9425f6d94120cd1acf27fd7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:07Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:07 crc kubenswrapper[4669]: I1008 20:45:07.555260 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:07Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:07 crc kubenswrapper[4669]: I1008 20:45:07.573814 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c2b44fd8fb3c01bbc8a1b2f5a3507af28b2aa79a3d6ab8e7de3945bbfd01e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a472679f03ab86aa0a31a2ff3affe48d8e289a76db949bcc6ea10446fd08fdd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:07Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:07 crc kubenswrapper[4669]: I1008 20:45:07.589782 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-klx9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2433400c-98f8-490f-a566-00a330a738fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863b0630ebde7534e93ebf2952dab729566760278539e87efa4412389803c5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-klx9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:07Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:07 crc kubenswrapper[4669]: I1008 20:45:07.606317 4669 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:07Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:07 crc kubenswrapper[4669]: I1008 20:45:07.621164 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c397b74921593a42fb7626e545778d80c506f0bbce7bc425b75c77a222c770e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T20:45:07Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:07 crc kubenswrapper[4669]: I1008 20:45:07.632202 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zcf2d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a016bee1-2c29-46bb-b3b8-841c4a65e162\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fa5b9befc8fb3a83cb6dd6097014bfe9fd0b905b4bf8fbdcccd4fdfb62ab410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-flsl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zcf2d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:07Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:07 crc kubenswrapper[4669]: I1008 20:45:07.646972 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c9bcf2-9580-4534-8c7e-886bd4aff469\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d8b81cfea1e9e0c9b30427e8b8cb07
b463c6ef45afb8379aa006d71bccd82a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bd09b1fcc78173d03292522a284e68e59f374def13fd6830f24a31e1138c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
5-10-08T20:45:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw2kf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:07Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:07 crc kubenswrapper[4669]: I1008 20:45:07.690678 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d080d327-7e4d-41af-aa15-0ce849523815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8127834da98ef46a594a74cbfcc6ef779b8429046327546560b7b37085572c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f61c1793e6c95085b6964298f29b5f896451784046a6aee1c73bbda234a3bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e76fd3bc937fc2e56c3d332e4d3822a2749d040c57cd94f6e3bcdcfd83c126bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"W1008 20:44:56.871605 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 20:44:56.872089 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759956296 cert, and key in /tmp/serving-cert-3424828285/serving-signer.crt, /tmp/serving-cert-3424828285/serving-signer.key\\\\nI1008 20:44:57.365674 1 observer_polling.go:159] Starting file observer\\\\nW1008 20:45:02.381062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1008 
20:45:02.381192 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 20:45:02.381876 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3424828285/tls.crt::/tmp/serving-cert-3424828285/tls.key\\\\\\\"\\\\nI1008 20:45:02.718633 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 20:45:02.726325 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 20:45:02.726358 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 20:45:02.726380 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 20:45:02.726384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 20:45:02.731456 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 20:45:02.731985 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 20:45:02.731867 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 20:45:02.733228 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d213380e32b3db218facfef313963d26689d2f0871d2a004a63380454fac8a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:07Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:07 crc kubenswrapper[4669]: I1008 20:45:07.728675 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c0e064d976a7c307fd13ec11ae76672cc1225b71a616f171626ee1f9a24531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\
"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:07Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:07 crc kubenswrapper[4669]: I1008 20:45:07.773884 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpzdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:07Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:07 crc kubenswrapper[4669]: I1008 20:45:07.807936 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b822af4b-b157-4b05-9af4-7798315f365f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d615b49ade5de43393d40344c1b71733acedb541841b3ec34d6dd293e62f96c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07c33fe9c40fb9b53e940940c3fe2b8b63a94b0f867aa804d215cb3ba90d01c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569ba2e70b947eea1e531ab7e8f1ac2e3441ade593dd48910407df766217d87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f2d8af11793121a84b4559833f410bd59a8bb122d88da0d3b55d7dcbbf57a9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:07Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:07 crc kubenswrapper[4669]: I1008 20:45:07.847182 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:07Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:07 crc kubenswrapper[4669]: I1008 20:45:07.888214 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bfcvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bfcvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:07Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:07 crc kubenswrapper[4669]: I1008 
20:45:07.939574 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-flswm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"609156f9-39b1-4330-83a2-eabf82f4228f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-flswm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:07Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:07 crc kubenswrapper[4669]: I1008 20:45:07.967610 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d080d327-7e4d-41af-aa15-0ce849523815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8127834da98ef46a594a74cbfcc6ef779b8429046327546560b7b37085572c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f61c1793e6c95085b6964298f29b5f896451784046a6aee1c73bbda234a3bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd3bc937fc2e56c3d332e4d3822a2749d040c57cd94f6e3bcdcfd83c126bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"W1008 20:44:56.871605 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 20:44:56.872089 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759956296 cert, and key in /tmp/serving-cert-3424828285/serving-signer.crt, /tmp/serving-cert-3424828285/serving-signer.key\\\\nI1008 20:44:57.365674 1 observer_polling.go:159] Starting file observer\\\\nW1008 20:45:02.381062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1008 20:45:02.381192 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 20:45:02.381876 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3424828285/tls.crt::/tmp/serving-cert-3424828285/tls.key\\\\\\\"\\\\nI1008 20:45:02.718633 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 20:45:02.726325 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 20:45:02.726358 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 20:45:02.726380 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 20:45:02.726384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 20:45:02.731456 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 20:45:02.731985 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 20:45:02.731867 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 20:45:02.733228 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d213380e32b3db218facfef313963d26689d2f0871d2a004a63380454fac8a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:07Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.011925 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c0e064d976a7c307fd13ec11ae76672cc1225b71a616f171626ee1f9a24531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:08Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.058015 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpzdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:08Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.091120 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bfcvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b078
4370c7f56b83ceced6915c73fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bfcvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:08Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.130437 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-flswm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"609156f9-39b1-4330-83a2-eabf82f4228f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749545f8f4b6269a70b747fee79dc8d419b62054f507b0d819b63aa68c44bb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountP
ath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-flswm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:08Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.169653 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b822af4b-b157-4b05-9af4-7798315f365f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d615b49ade5de43393d40344c1b71733acedb541841b3ec34d6dd293e62f96c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07c33fe9c40fb9b53e940940c3fe2b8b63a94b0f867aa804d215cb3ba90d01c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569ba2e70b947eea1e531ab7e8f1ac2e3441ade593dd48910407df766217d87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f2d8af11793121a84b4559833f410bd59a8bb122d88da0d3b55d7dcbbf57a9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:08Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.208095 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:08Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.247612 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c2b44fd8fb3c01bbc8a1b2f5a3507af28b2aa79a3d6ab8e7de3945bbfd01e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a472679f03ab86aa0a31a2ff3affe48d8e289a76db949bcc6ea10446fd08fdd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:08Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.288456 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-klx9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2433400c-98f8-490f-a566-00a330a738fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863b0630ebde7534e93ebf2952dab729566760278539e87efa4412389803c5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-klx9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:08Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.330105 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.330151 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.330112 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:45:08 crc kubenswrapper[4669]: E1008 20:45:08.330245 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:45:08 crc kubenswrapper[4669]: E1008 20:45:08.330316 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:45:08 crc kubenswrapper[4669]: E1008 20:45:08.330401 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.335004 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0319f7-8ee3-4392-a36a-419161391db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7f52f3d22574d0a01cdfd7b7a40caf1a6cf201dc719e35f40eae85a071286f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3064f5dde5317ed6c1dba4ecdcf4da81c2451262d83e3e2826c6ebbfe1487ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13697e6470d481451982948653db44d08baa70466d010442534eaa249e58bae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c88256a72c667695563af6b37d01d958621c1ca6cbdaf474364bd6c8128c4409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8
ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6efb0bccc51deff2303655e7a8d3a6261a8b3c9425f6d94120cd1acf27fd7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314
731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-0
8T20:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:08Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.369673 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:08Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.410177 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c9bcf2-9580-4534-8c7e-886bd4aff469\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d8b81cfea1e9e0c9b30427e8b8cb07b463c6ef45afb8379aa006d71bccd82a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bd09b1fcc78173d03292522a284e68e59f374
def13fd6830f24a31e1138c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw2kf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:08Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.455513 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:08Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.487291 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c397b74921593a42fb7626e545778d80c506f0bbce7bc425b75c77a222c770e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T20:45:08Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.525636 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zcf2d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a016bee1-2c29-46bb-b3b8-841c4a65e162\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fa5b9befc8fb3a83cb6dd6097014bfe9fd0b905b4bf8fbdcccd4fdfb62ab410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-flsl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zcf2d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:08Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.530128 4669 generic.go:334] "Generic (PLEG): container finished" podID="b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c" containerID="3d12ef6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f" exitCode=0 Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.530186 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bfcvh" event={"ID":"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c","Type":"ContainerDied","Data":"3d12ef6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f"} Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.568973 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:08Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.608647 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c2b44fd8fb3c01bbc8a1b2f5a3507af28b2aa79a3d6ab8e7de3945bbfd01e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a472679f03ab86aa0a31a2ff3affe48d8e289a76db949bcc6ea10446fd08fdd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:08Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.621520 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.624051 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.624081 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.624089 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.624169 4669 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.647921 4669 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/multus-klx9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2433400c-98f8-490f-a566-00a330a738fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863b0630ebde7534e93ebf2952dab729566760278539e87efa4412389803c5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\
\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-klx9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:08Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.700953 4669 kubelet_node_status.go:115] "Node was 
previously registered" node="crc" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.701223 4669 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.702359 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.702402 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.702412 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.702430 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.702442 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:08Z","lastTransitionTime":"2025-10-08T20:45:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:08 crc kubenswrapper[4669]: E1008 20:45:08.717429 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf950064-edbb-4bec-8a75-ab8d963fcdb3\\\",\\\"systemUUID\\\":\\\"527fa759-e25f-4fb3-8304-f30dbff0c847\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:08Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.720757 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.720799 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.720809 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.720826 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.720837 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:08Z","lastTransitionTime":"2025-10-08T20:45:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:08 crc kubenswrapper[4669]: E1008 20:45:08.733247 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf950064-edbb-4bec-8a75-ab8d963fcdb3\\\",\\\"systemUUID\\\":\\\"527fa759-e25f-4fb3-8304-f30dbff0c847\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:08Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.734880 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0319f7-8ee3-4392-a36a-419161391db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7f52f3d22574d0a01cdfd7b7a40caf1a6cf201dc719e35f40eae85a071286f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3064f5dde5317ed6c1dba4ecdcf4da81c2451262d83e3e2826c6ebbfe1487ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13697e6470d481451982948653db44d08baa70466d010442534eaa249e58bae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://c88256a72c667695563af6b37d01d958621c1ca6cbdaf474364bd6c8128c4409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6efb0bccc51deff2303655e7a8d3a6261a8b3c9425f6d94120cd1acf27fd7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:08Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.737506 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.737571 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.737587 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.737606 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.737618 4669 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:08Z","lastTransitionTime":"2025-10-08T20:45:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:08 crc kubenswrapper[4669]: E1008 20:45:08.750896 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf950064-edbb-4bec-8a75-ab8d963fcdb3\\\",\\\"systemUUID\\\":\\\"527fa759-e25f-4fb3-8304-f30dbff0c847\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:08Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.754590 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.754625 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.754634 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.754649 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.754659 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:08Z","lastTransitionTime":"2025-10-08T20:45:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.765344 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zcf2d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a016bee1-2c29-46bb-b3b8-841c4a65e162\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fa5b9befc8fb3a83cb6dd6097014bfe9fd0b905b4bf8fbdcccd4fdfb62ab410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flsl6\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zcf2d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:08Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:08 crc kubenswrapper[4669]: E1008 20:45:08.766665 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:08Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf950064-edbb-4bec-8a75-ab8d963fcdb3\\\",\\\"systemUUID\\\":\\\"527fa759-e25f-4fb3-8304-f30dbff0c847\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:08Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.772896 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.772930 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.772939 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.772952 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.772961 4669 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:08Z","lastTransitionTime":"2025-10-08T20:45:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:08 crc kubenswrapper[4669]: E1008 20:45:08.784646 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf950064-edbb-4bec-8a75-ab8d963fcdb3\\\",\\\"systemUUID\\\":\\\"527fa759-e25f-4fb3-8304-f30dbff0c847\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:08Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:08 crc kubenswrapper[4669]: E1008 20:45:08.784830 4669 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.787790 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.787826 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.787835 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.787850 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.787860 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:08Z","lastTransitionTime":"2025-10-08T20:45:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.807147 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c9bcf2-9580-4534-8c7e-886bd4aff469\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d8b81cfea1e9e0c9b30427e8b8cb07b463c6ef45afb8379aa006d71bccd82a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bd09b1fcc78173d03292522a284e68e59f374def13fd6830f24a31e1138c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw2kf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:08Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.848162 4669 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:08Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.888706 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c397b74921593a42fb7626e545778d80c506f0bbce7bc425b75c77a222c770e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T20:45:08Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.890147 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.890174 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.890182 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.890199 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.890207 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:08Z","lastTransitionTime":"2025-10-08T20:45:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.932919 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpzdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:08Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.969808 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d080d327-7e4d-41af-aa15-0ce849523815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8127834da98ef46a594a74cbfcc6ef779b8429046327546560b7b37085572c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f61c1793e6c95085b6964298f29b5f896451784046a6aee1c73bbda234a3bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e76fd3bc937fc2e56c3d332e4d3822a2749d040c57cd94f6e3bcdcfd83c126bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"W1008 20:44:56.871605 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 20:44:56.872089 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759956296 cert, and key in /tmp/serving-cert-3424828285/serving-signer.crt, /tmp/serving-cert-3424828285/serving-signer.key\\\\nI1008 20:44:57.365674 1 observer_polling.go:159] Starting file observer\\\\nW1008 20:45:02.381062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1008 
20:45:02.381192 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 20:45:02.381876 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3424828285/tls.crt::/tmp/serving-cert-3424828285/tls.key\\\\\\\"\\\\nI1008 20:45:02.718633 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 20:45:02.726325 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 20:45:02.726358 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 20:45:02.726380 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 20:45:02.726384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 20:45:02.731456 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 20:45:02.731985 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 20:45:02.731867 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 20:45:02.733228 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d213380e32b3db218facfef313963d26689d2f0871d2a004a63380454fac8a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:08Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.992962 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.993002 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.993013 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.993028 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:08 crc kubenswrapper[4669]: I1008 20:45:08.993036 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:08Z","lastTransitionTime":"2025-10-08T20:45:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.010649 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c0e064d976a7c307fd13ec11ae76672cc1225b71a616f171626ee1f9a24531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:09Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.048317 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:09Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.091374 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bfcvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d12ef6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d12ef6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bfcvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:09Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.094938 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.095017 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.095035 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.095067 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.095089 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:09Z","lastTransitionTime":"2025-10-08T20:45:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.126291 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-flswm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"609156f9-39b1-4330-83a2-eabf82f4228f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749545f8f4b6269a70b747fee79dc8d419b62054f507b0d819b63aa68c44bb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"n
ame\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-flswm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:09Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.169802 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b822af4b-b157-4b05-9af4-7798315f365f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d615b49ade5de43393d40344c1b71733acedb541841b3ec34d6dd293e62f96c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07c33fe9c40fb9b53e940940c3fe2b8b63a94b0f867aa804d215cb3ba90d01c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569ba2e70b947eea1e531ab7e8f1ac2e3441ade593dd48910407df766217d87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f2d8af11793121a84b4559833f410bd59a8bb122d88da0d3b55d7dcbbf57a9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:09Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.198141 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.198172 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.198182 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.198201 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.198213 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:09Z","lastTransitionTime":"2025-10-08T20:45:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.300071 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.300112 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.300123 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.300136 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.300146 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:09Z","lastTransitionTime":"2025-10-08T20:45:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.406712 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.406774 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.406801 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.406816 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.406825 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:09Z","lastTransitionTime":"2025-10-08T20:45:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.509469 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.509509 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.509520 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.509551 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.509562 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:09Z","lastTransitionTime":"2025-10-08T20:45:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.537039 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" event={"ID":"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7","Type":"ContainerStarted","Data":"f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111"} Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.544730 4669 generic.go:334] "Generic (PLEG): container finished" podID="b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c" containerID="41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e" exitCode=0 Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.544785 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bfcvh" event={"ID":"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c","Type":"ContainerDied","Data":"41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e"} Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.567376 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:09Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.589737 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c2b44fd8fb3c01bbc8a1b2f5a3507af28b2aa79a3d6ab8e7de3945bbfd01e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a472679f03ab86aa0a31a2ff3affe48d8e289a76db949bcc6ea10446fd08fdd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:09Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.605885 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-klx9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2433400c-98f8-490f-a566-00a330a738fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863b0630ebde7534e93ebf2952dab729566760278539e87efa4412389803c5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-klx9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:09Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.611383 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:09 crc 
kubenswrapper[4669]: I1008 20:45:09.611436 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.611449 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.611467 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.611479 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:09Z","lastTransitionTime":"2025-10-08T20:45:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.626241 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0319f7-8ee3-4392-a36a-419161391db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7f52f3d22574d0a01cdfd7b7a40caf1a6cf201dc719e35f40eae85a071286f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3064f5dde5317ed6c1dba4ecdcf4da81c2451262d83e3e2826c6ebbfe1487ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13697e6470d481451982948653db44d08baa70466d010442534eaa249e58bae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c88256a72c667695563af6b37d01d958621c1ca6cbdaf474364bd6c8128c4409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6efb0bccc51deff2303655e7a8d3a6261a8b3c9425f6d94120cd1acf27fd7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:09Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.638177 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zcf2d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a016bee1-2c29-46bb-b3b8-841c4a65e162\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fa5b9befc8fb3a83cb6dd6097014bfe9fd0b905b4bf8fbdcccd4fdfb62ab410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flsl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zcf2d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:09Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.649496 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c9bcf2-9580-4534-8c7e-886bd4aff469\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d8b81cfea1e9e0c9b30427e8b8cb07b463c6ef45afb8379aa006d71bccd82a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bd09b1fcc78173d03292522a284e68e59f374
def13fd6830f24a31e1138c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw2kf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:09Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.661776 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:09Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.672098 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c397b74921593a42fb7626e545778d80c506f0bbce7bc425b75c77a222c770e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T20:45:09Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.692025 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpzdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:09Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.705375 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d080d327-7e4d-41af-aa15-0ce849523815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8127834da98ef46a594a74cbfcc6ef779b8429046327546560b7b37085572c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f61c1793e6c95085b6964298f29b5f896451784046a6aee1c73bbda234a3bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e76fd3bc937fc2e56c3d332e4d3822a2749d040c57cd94f6e3bcdcfd83c126bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"W1008 20:44:56.871605 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 20:44:56.872089 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759956296 cert, and key in /tmp/serving-cert-3424828285/serving-signer.crt, /tmp/serving-cert-3424828285/serving-signer.key\\\\nI1008 20:44:57.365674 1 observer_polling.go:159] Starting file observer\\\\nW1008 20:45:02.381062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1008 
20:45:02.381192 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 20:45:02.381876 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3424828285/tls.crt::/tmp/serving-cert-3424828285/tls.key\\\\\\\"\\\\nI1008 20:45:02.718633 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 20:45:02.726325 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 20:45:02.726358 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 20:45:02.726380 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 20:45:02.726384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 20:45:02.731456 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 20:45:02.731985 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 20:45:02.731867 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 20:45:02.733228 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d213380e32b3db218facfef313963d26689d2f0871d2a004a63380454fac8a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:09Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.713510 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.713561 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.713575 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.713590 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.713601 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:09Z","lastTransitionTime":"2025-10-08T20:45:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.722750 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c0e064d976a7c307fd13ec11ae76672cc1225b71a616f171626ee1f9a24531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:09Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.735407 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:09Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.750308 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bfcvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d12ef6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d12ef6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bfcvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:09Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.760101 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-flswm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"609156f9-39b1-4330-83a2-eabf82f4228f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749545f8f4b6269a70b747fee79dc8d419b62054f507b0d819b63aa68c44bb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-flswm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:09Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.772836 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b822af4b-b157-4b05-9af4-7798315f365f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d615b49ade5de43393d40344c1b71733acedb541841b3ec34d6dd293e62f96c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07c33fe9c40fb9b53e940940c3fe2b8b63a94b0f867aa804d215cb3ba90d01c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569ba2e70b947eea1e531ab7e8f1ac2e3441ade593dd48910407df766217d87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f2d8af11793121a84b4559833f410bd59a8bb122d88da0d3b55d7dcbbf57a9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:09Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.825738 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.825777 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.825785 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.825799 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.825808 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:09Z","lastTransitionTime":"2025-10-08T20:45:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.928786 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.928901 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.928917 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.928941 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.928958 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:09Z","lastTransitionTime":"2025-10-08T20:45:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:09 crc kubenswrapper[4669]: I1008 20:45:09.938865 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:45:09 crc kubenswrapper[4669]: E1008 20:45:09.939103 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:45:17.939076784 +0000 UTC m=+37.631887457 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.031904 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.031964 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.031986 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.032014 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:10 
crc kubenswrapper[4669]: I1008 20:45:10.032035 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:10Z","lastTransitionTime":"2025-10-08T20:45:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.039617 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.039720 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.039784 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.039841 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:45:10 crc kubenswrapper[4669]: E1008 20:45:10.039890 4669 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 20:45:10 crc kubenswrapper[4669]: E1008 20:45:10.039994 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 20:45:18.03996488 +0000 UTC m=+37.732775593 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 20:45:10 crc kubenswrapper[4669]: E1008 20:45:10.040041 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 20:45:10 crc kubenswrapper[4669]: E1008 20:45:10.040067 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 20:45:10 crc kubenswrapper[4669]: E1008 20:45:10.040087 4669 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 20:45:10 crc kubenswrapper[4669]: E1008 20:45:10.040144 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-08 20:45:18.040125565 +0000 UTC m=+37.732936278 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 20:45:10 crc kubenswrapper[4669]: E1008 20:45:10.040201 4669 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 20:45:10 crc kubenswrapper[4669]: E1008 20:45:10.040245 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 20:45:18.040231518 +0000 UTC m=+37.733042231 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 20:45:10 crc kubenswrapper[4669]: E1008 20:45:10.040322 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 20:45:10 crc kubenswrapper[4669]: E1008 20:45:10.040354 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 20:45:10 crc kubenswrapper[4669]: E1008 20:45:10.040369 4669 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 20:45:10 crc kubenswrapper[4669]: E1008 20:45:10.040409 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-08 20:45:18.040396382 +0000 UTC m=+37.733207085 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.135153 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.135213 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.135250 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.135282 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.135303 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:10Z","lastTransitionTime":"2025-10-08T20:45:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.238751 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.239192 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.239202 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.239218 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.239230 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:10Z","lastTransitionTime":"2025-10-08T20:45:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.330107 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.330137 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:45:10 crc kubenswrapper[4669]: E1008 20:45:10.330304 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.330149 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:45:10 crc kubenswrapper[4669]: E1008 20:45:10.330373 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:45:10 crc kubenswrapper[4669]: E1008 20:45:10.330552 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.341842 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.341890 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.341907 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.341929 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.341944 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:10Z","lastTransitionTime":"2025-10-08T20:45:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.443868 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.443922 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.443935 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.443956 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.443967 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:10Z","lastTransitionTime":"2025-10-08T20:45:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.547006 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.547054 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.547064 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.547078 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.547087 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:10Z","lastTransitionTime":"2025-10-08T20:45:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.553219 4669 generic.go:334] "Generic (PLEG): container finished" podID="b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c" containerID="762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758" exitCode=0 Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.553258 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bfcvh" event={"ID":"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c","Type":"ContainerDied","Data":"762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758"} Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.573272 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:10Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.594940 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c397b74921593a42fb7626e545778d80c506f0bbce7bc425b75c77a222c770e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T20:45:10Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.611179 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zcf2d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a016bee1-2c29-46bb-b3b8-841c4a65e162\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fa5b9befc8fb3a83cb6dd6097014bfe9fd0b905b4bf8fbdcccd4fdfb62ab410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-flsl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zcf2d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:10Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.624016 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c9bcf2-9580-4534-8c7e-886bd4aff469\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d8b81cfea1e9e0c9b30427e8b8cb07
b463c6ef45afb8379aa006d71bccd82a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bd09b1fcc78173d03292522a284e68e59f374def13fd6830f24a31e1138c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
5-10-08T20:45:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw2kf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:10Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.640218 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d080d327-7e4d-41af-aa15-0ce849523815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8127834da98ef46a594a74cbfcc6ef779b8429046327546560b7b37085572c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f61c1793e6c95085b6964298f29b5f896451784046a6aee1c73bbda234a3bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e76fd3bc937fc2e56c3d332e4d3822a2749d040c57cd94f6e3bcdcfd83c126bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"W1008 20:44:56.871605 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 20:44:56.872089 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759956296 cert, and key in /tmp/serving-cert-3424828285/serving-signer.crt, /tmp/serving-cert-3424828285/serving-signer.key\\\\nI1008 20:44:57.365674 1 observer_polling.go:159] Starting file observer\\\\nW1008 20:45:02.381062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1008 
20:45:02.381192 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 20:45:02.381876 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3424828285/tls.crt::/tmp/serving-cert-3424828285/tls.key\\\\\\\"\\\\nI1008 20:45:02.718633 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 20:45:02.726325 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 20:45:02.726358 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 20:45:02.726380 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 20:45:02.726384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 20:45:02.731456 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 20:45:02.731985 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 20:45:02.731867 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 20:45:02.733228 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d213380e32b3db218facfef313963d26689d2f0871d2a004a63380454fac8a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:10Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.648926 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.648981 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.648993 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.649011 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.649023 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:10Z","lastTransitionTime":"2025-10-08T20:45:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.654797 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c0e064d976a7c307fd13ec11ae76672cc1225b71a616f171626ee1f9a24531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:10Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.675285 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpzdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:10Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.687373 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-flswm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"609156f9-39b1-4330-83a2-eabf82f4228f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749545f8f4b6269a70b747fee79dc8d419b62054f507b0d819b63aa68c44bb52\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-flswm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:10Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.702023 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b822af4b-b157-4b05-9af4-7798315f365f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d615b49ade5de43393d40344c1b71733acedb541841b3ec34d6dd293e62f96c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07c33fe9c40fb9b53e940940c3fe2b8b63a94b0f867aa804d215cb3ba90d01c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569ba2e70b947eea1e531ab7e8f1ac2e3441ade593dd48910407df766217d87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f2d8af11793121a84b4559833f410bd59a8bb122d88da0d3b55d7dcbbf57a9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:10Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.713755 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:10Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.727201 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bfcvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d12ef6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d12ef6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bfcvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:10Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.739191 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-klx9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2433400c-98f8-490f-a566-00a330a738fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863b0630ebde7534e93ebf2952dab729566760278539e87efa4412389803c5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-klx9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:10Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.751717 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:10 crc 
kubenswrapper[4669]: I1008 20:45:10.751760 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.751797 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.751814 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.751825 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:10Z","lastTransitionTime":"2025-10-08T20:45:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.758266 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0319f7-8ee3-4392-a36a-419161391db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7f52f3d22574d0a01cdfd7b7a40caf1a6cf201dc719e35f40eae85a071286f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3064f5dde5317ed6c1dba4ecdcf4da81c2451262d83e3e2826c6ebbfe1487ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13697e6470d481451982948653db44d08baa70466d010442534eaa249e58bae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c88256a72c667695563af6b37d01d958621c1ca6cbdaf474364bd6c8128c4409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6efb0bccc51deff2303655e7a8d3a6261a8b3c9425f6d94120cd1acf27fd7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:10Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.768628 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:10Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.778233 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c2b44fd8fb3c01bbc8a1b2f5a3507af28b2aa79a3d6ab8e7de3945bbfd01e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a472679f03ab86aa0a31a2ff3affe48d8e289a76db949bcc6ea10446fd08fdd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:10Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.855158 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.855236 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.855273 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.855306 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.855333 4669 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:10Z","lastTransitionTime":"2025-10-08T20:45:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.957564 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.957603 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.957614 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.957630 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:10 crc kubenswrapper[4669]: I1008 20:45:10.957642 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:10Z","lastTransitionTime":"2025-10-08T20:45:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.059393 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.059740 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.059754 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.059771 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.059785 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:11Z","lastTransitionTime":"2025-10-08T20:45:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.161934 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.161973 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.161985 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.162002 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.162014 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:11Z","lastTransitionTime":"2025-10-08T20:45:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.264214 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.264246 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.264254 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.264268 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.264277 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:11Z","lastTransitionTime":"2025-10-08T20:45:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.342634 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b822af4b-b157-4b05-9af4-7798315f365f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d615b49ade5de43393d40344c1b71733acedb541841b3ec34d6dd293e62f96c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07c33fe9c40
fb9b53e940940c3fe2b8b63a94b0f867aa804d215cb3ba90d01c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569ba2e70b947eea1e531ab7e8f1ac2e3441ade593dd48910407df766217d87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f2d8af11793121a84b4559833f410bd59a8bb122d88da0d3b55d7dcbbf57a9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:11Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.356667 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:11Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.365972 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 
20:45:11.366012 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.366024 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.366043 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.366054 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:11Z","lastTransitionTime":"2025-10-08T20:45:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.370159 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bfcvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b078
4370c7f56b83ceced6915c73fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d12ef6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d12ef6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
0-08T20:45:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bfcvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:11Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.385073 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-flswm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"609156f9-39b1-4330-83a2-eabf82f4228f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749545f8f4b6269a70b747fee79dc8d419b62054f507b0d819b63aa68c44bb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-flswm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:11Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.405405 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0319f7-8ee3-4392-a36a-419161391db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7f52f3d22574d0a01cdfd7b7a40caf1a6cf201dc719e35f40eae85a071286f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3064f5dde5317ed6c1dba4ecdcf4da81c2451262d83e3e2826c6ebbfe1487ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13697e6470d481451982948653db44d08baa70466d010442534eaa249e58bae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20
:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c88256a72c667695563af6b37d01d958621c1ca6cbdaf474364bd6c8128c4409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6efb0bccc51deff2303655e7a8d3a6261a8b3c9425f6d94120cd1acf27fd7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:11Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.422143 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:11Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.437199 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c2b44fd8fb3c01bbc8a1b2f5a3507af28b2aa79a3d6ab8e7de3945bbfd01e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a472679f03ab86aa0a31a2ff3affe48d8e289a76db949bcc6ea10446fd08fdd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:11Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.452140 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-klx9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2433400c-98f8-490f-a566-00a330a738fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863b0630ebde7534e93ebf2952dab729566760278539e87efa4412389803c5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-klx9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:11Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.466225 4669 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:11Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.468104 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.468139 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.468149 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.468165 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.468175 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:11Z","lastTransitionTime":"2025-10-08T20:45:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.479905 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c397b74921593a42fb7626e545778d80c506f0bbce7bc425b75c77a222c770e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:11Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.489238 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zcf2d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a016bee1-2c29-46bb-b3b8-841c4a65e162\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fa5b9befc8fb3a83cb6dd6097014bfe9fd0b905b4bf8fbdcccd4fdfb62ab410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flsl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zcf2d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:11Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.501573 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c9bcf2-9580-4534-8c7e-886bd4aff469\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d8b81cfea1e9e0c9b30427e8b8cb07b463c6ef45afb8379aa006d71bccd82a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bd09b1fcc78173d03292522a284e68e59f374
def13fd6830f24a31e1138c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw2kf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:11Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.515353 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d080d327-7e4d-41af-aa15-0ce849523815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8127834da98ef46a594a74cbfcc6ef779b8429046327546560b7b37085572c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f61c1793e6c95085b6964298f29b5f896451784046a6aee1c73bbda234a3bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd3bc937fc2e56c3d332e4d3822a2749d040c57cd94f6e3bcdcfd83c126bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"W1008 20:44:56.871605 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 20:44:56.872089 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759956296 cert, and key in /tmp/serving-cert-3424828285/serving-signer.crt, /tmp/serving-cert-3424828285/serving-signer.key\\\\nI1008 20:44:57.365674 1 observer_polling.go:159] Starting file observer\\\\nW1008 20:45:02.381062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1008 20:45:02.381192 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 20:45:02.381876 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3424828285/tls.crt::/tmp/serving-cert-3424828285/tls.key\\\\\\\"\\\\nI1008 20:45:02.718633 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 20:45:02.726325 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 20:45:02.726358 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 20:45:02.726380 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 20:45:02.726384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 20:45:02.731456 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 20:45:02.731985 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 20:45:02.731867 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 20:45:02.733228 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d213380e32b3db218facfef313963d26689d2f0871d2a004a63380454fac8a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:11Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.527279 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c0e064d976a7c307fd13ec11ae76672cc1225b71a616f171626ee1f9a24531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:11Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.545177 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpzdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:11Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.561464 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" event={"ID":"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7","Type":"ContainerStarted","Data":"b6c98951504fc67f4ee155f081da1039b39da4d528ddd7b4c2ddba8bbfd06238"} Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.561916 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.562024 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.565679 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bfcvh" event={"ID":"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c","Type":"ContainerStarted","Data":"09be2dce24bba0d88a36f2d85e6280e6806f9b6cf59ec3950513e976c97429e3"} Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.570219 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.570252 4669 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.570260 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.570272 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.570282 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:11Z","lastTransitionTime":"2025-10-08T20:45:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.578931 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d080d327-7e4d-41af-aa15-0ce849523815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8127834da98ef46a594a74cbfcc6ef779b8429046327546560b7b37085572c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f61c1793e6c95085b6964298f29b5f896451784046a6aee1c73bbda234a3bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e76fd3bc937fc2e56c3d332e4d3822a2749d040c57cd94f6e3bcdcfd83c126bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"W1008 20:44:56.871605 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 20:44:56.872089 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759956296 cert, and key in /tmp/serving-cert-3424828285/serving-signer.crt, /tmp/serving-cert-3424828285/serving-signer.key\\\\nI1008 20:44:57.365674 1 observer_polling.go:159] Starting file observer\\\\nW1008 20:45:02.381062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1008 
20:45:02.381192 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 20:45:02.381876 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3424828285/tls.crt::/tmp/serving-cert-3424828285/tls.key\\\\\\\"\\\\nI1008 20:45:02.718633 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 20:45:02.726325 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 20:45:02.726358 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 20:45:02.726380 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 20:45:02.726384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 20:45:02.731456 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 20:45:02.731985 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 20:45:02.731867 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 20:45:02.733228 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d213380e32b3db218facfef313963d26689d2f0871d2a004a63380454fac8a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:11Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.594095 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c0e064d976a7c307fd13ec11ae76672cc1225b71a616f171626ee1f9a24531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\
"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:11Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.597834 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.613577 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://408dd840918000b1689c3d828a51173deebf8d00fc97450975b35e5149d3cfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://334a09deac921308c4d6053bdcc2bbc096acc8ec24875190efb1c07b22d01c69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c03e0c827468d80fa326ee46ee88ad6adfe4236f4df9843324d2b247d0716087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92bc23ad705dcc8b8524159bc37254ce2306e7b502b914eaac7a6525fdd44f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed9a574189bcc7f84b93c5e821e944b0f94679084a30270d6634c7d19e67c470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c98951504fc67f4ee155f081da1039b39da4d528ddd7b4c2ddba8bbfd06238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpzdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:11Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.623477 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-flswm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"609156f9-39b1-4330-83a2-eabf82f4228f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749545f8f4b6269a70b747fee79dc8d419b62054f507b0d819b63aa68c44bb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-flswm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:11Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.635511 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b822af4b-b157-4b05-9af4-7798315f365f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d615b49ade5de43393d40344c1b71733acedb541841b3ec34d6dd293e62f96c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07c33fe9c40fb9b53e940940c3fe2b8b63a94b0f867aa804d215cb3ba90d01c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569ba2e70b947eea1e531ab7e8f1ac2e3441ade593dd48910407df766217d87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f2d8af11793121a84b4559833f410bd59a8bb122d88da0d3b55d7dcbbf57a9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:11Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.649277 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:11Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.672898 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.672938 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.672949 4669 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.672966 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.672977 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:11Z","lastTransitionTime":"2025-10-08T20:45:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.674332 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bfcvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b078
4370c7f56b83ceced6915c73fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d12ef6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d12ef6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
0-08T20:45:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bfcvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:11Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.688638 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-klx9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2433400c-98f8-490f-a566-00a330a738fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863b0630ebde7534e93ebf2952dab729566760278539e87efa4412389803c5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-klx9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:11Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.710416 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0319f7-8ee3-4392-a36a-419161391db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7f52f3d22574d0a01cdfd7b7a40caf1a6cf201dc719e35f40eae85a071286f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3064f5dde5317ed6c1dba4ecdcf4da81c2451262d83e3e2826c6ebbfe1487ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13697e6470d481451982948653db44d08baa70466d010442534eaa249e58bae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c88256a72c667695563af6b37d01d958621c1ca6cbdaf474364bd6c8128c4409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6efb0bccc51deff2303655e7a8d3a6261a8b3c9425f6d94120cd1acf27fd7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:11Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.721615 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:11Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.732359 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c2b44fd8fb3c01bbc8a1b2f5a3507af28b2aa79a3d6ab8e7de3945bbfd01e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a472679f03ab86aa0a31a2ff3affe48d8e289a76db949bcc6ea10446fd08fdd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:11Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.746013 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:11Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.757026 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c397b74921593a42fb7626e545778d80c506f0bbce7bc425b75c77a222c770e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T20:45:11Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.770779 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zcf2d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a016bee1-2c29-46bb-b3b8-841c4a65e162\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fa5b9befc8fb3a83cb6dd6097014bfe9fd0b905b4bf8fbdcccd4fdfb62ab410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-flsl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zcf2d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:11Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.775308 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.775377 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.775396 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.775424 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.775443 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:11Z","lastTransitionTime":"2025-10-08T20:45:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.781627 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c9bcf2-9580-4534-8c7e-886bd4aff469\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d8b81cfea1e9e0c9b30427e8b8cb07b463c6ef45afb8379aa006d71bccd82a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bd09b1fcc78173d03292522a284e68e59f374def13fd6830f24a31e1138c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw2kf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:11Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.811505 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0319f7-8ee3-4392-a36a-419161391db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7f52f3d22574d0a01cdfd7b7a40caf1a6cf201dc719e35f40eae85a071286f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3064f5dde5317ed6c1dba4ecdcf4da81c2451262d83e3e2826c6ebbfe1487ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13697e6470d481451982948653db44d08baa70466d010442534eaa249e58bae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c88256a72c667695563af6b37d01d958621c1ca6cbdaf474364bd6c8128c4409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6efb0bccc51deff2303655e7a8d3a6261a8b3c9425f6d94120cd1acf27fd7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:11Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.823042 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:11Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.834485 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c2b44fd8fb3c01bbc8a1b2f5a3507af28b2aa79a3d6ab8e7de3945bbfd01e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a472679f03ab86aa0a31a2ff3affe48d8e289a76db949bcc6ea10446fd08fdd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:11Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.845882 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-klx9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2433400c-98f8-490f-a566-00a330a738fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863b0630ebde7534e93ebf2952dab729566760278539e87efa4412389803c5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-klx9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:11Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.858232 4669 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:11Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.870445 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c397b74921593a42fb7626e545778d80c506f0bbce7bc425b75c77a222c770e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T20:45:11Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.878040 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.878065 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.878074 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.878111 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.878124 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:11Z","lastTransitionTime":"2025-10-08T20:45:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.879684 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zcf2d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a016bee1-2c29-46bb-b3b8-841c4a65e162\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fa5b9befc8fb3a83cb6dd6097014bfe9fd0b905b4bf8fbdcccd4fdfb62ab410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flsl6\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zcf2d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:11Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.888970 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c9bcf2-9580-4534-8c7e-886bd4aff469\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d8b81cfea1e9e0c9b30427e8b8cb07b463c6ef45afb8379aa006d71bccd82a9\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bd09b1fcc78173d03292522a284e68e59f374def13fd6830f24a31e1138c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-hw2kf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:11Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.932089 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d080d327-7e4d-41af-aa15-0ce849523815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8127834da98ef46a594a74cbfcc6ef779b8429046327546560b7b37085572c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f61c1793e6c95085b6964298f29b5f896451784046a6aee1c73bbda234a3bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e76fd3bc937fc2e56c3d332e4d3822a2749d040c57cd94f6e3bcdcfd83c126bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"W1008 20:44:56.871605 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 20:44:56.872089 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759956296 cert, and key in /tmp/serving-cert-3424828285/serving-signer.crt, /tmp/serving-cert-3424828285/serving-signer.key\\\\nI1008 20:44:57.365674 1 observer_polling.go:159] Starting file observer\\\\nW1008 20:45:02.381062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1008 
20:45:02.381192 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 20:45:02.381876 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3424828285/tls.crt::/tmp/serving-cert-3424828285/tls.key\\\\\\\"\\\\nI1008 20:45:02.718633 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 20:45:02.726325 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 20:45:02.726358 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 20:45:02.726380 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 20:45:02.726384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 20:45:02.731456 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 20:45:02.731985 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 20:45:02.731867 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 20:45:02.733228 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d213380e32b3db218facfef313963d26689d2f0871d2a004a63380454fac8a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:11Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.974285 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c0e064d976a7c307fd13ec11ae76672cc1225b71a616f171626ee1f9a24531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\
"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:11Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.980949 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.980987 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.980998 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.981013 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:11 crc kubenswrapper[4669]: I1008 20:45:11.981022 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:11Z","lastTransitionTime":"2025-10-08T20:45:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.022282 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://408dd840918000b1689c3d828a51173deebf8d00fc97450975b35e5149d3cfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://334a09deac921308c4d6053bdcc2bbc096acc8ec24875190efb1c07b22d01c69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c03e0c827468d80fa326ee46ee88ad6adfe4236f4df9843324d2b247d0716087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92bc23ad705dcc8b8524159bc37254ce2306e7b502b914eaac7a6525fdd44f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed9a574189bcc7f84b93c5e821e944b0f94679084a30270d6634c7d19e67c470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c98951504fc67f4ee155f081da1039b39da4d528ddd7b4c2ddba8bbfd06238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpzdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:12Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.052793 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b822af4b-b157-4b05-9af4-7798315f365f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d615b49ade5de43393d40344c1b71733acedb541841b3ec34d6dd293e62f96c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026
b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07c33fe9c40fb9b53e940940c3fe2b8b63a94b0f867aa804d215cb3ba90d01c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569ba2e70b947eea1e531ab7e8f1ac2e3441ade593dd48910407df766217d87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f2d8af11793121a84b4559833f410bd59a8bb122d88da0d3b55d7dcbbf57a9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:12Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.083592 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.083663 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.083676 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.083695 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.083729 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:12Z","lastTransitionTime":"2025-10-08T20:45:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.092030 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:12Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.134439 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bfcvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09be2dce24bba0d88a36f2d85e6280e6806f9b6cf59ec3950513e976c97429e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d12e
f6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d12ef6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bfcvh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:12Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.166115 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-flswm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"609156f9-39b1-4330-83a2-eabf82f4228f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749545f8f4b6269a70b747fee79dc8d419b62054f507b0d819b63aa68c44bb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-flswm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:12Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.186320 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.186373 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.186391 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.186414 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.186432 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:12Z","lastTransitionTime":"2025-10-08T20:45:12Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.290006 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.290076 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.290100 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.290133 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.290157 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:12Z","lastTransitionTime":"2025-10-08T20:45:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.329848 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.329849 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.329863 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:45:12 crc kubenswrapper[4669]: E1008 20:45:12.330002 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:45:12 crc kubenswrapper[4669]: E1008 20:45:12.330322 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:45:12 crc kubenswrapper[4669]: E1008 20:45:12.330393 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.393002 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.393070 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.393083 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.393108 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.393123 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:12Z","lastTransitionTime":"2025-10-08T20:45:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.496301 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.496362 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.496383 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.496410 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.496428 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:12Z","lastTransitionTime":"2025-10-08T20:45:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.570050 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.592233 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.599007 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.599038 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.599047 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.599060 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.599068 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:12Z","lastTransitionTime":"2025-10-08T20:45:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.606332 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-klx9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2433400c-98f8-490f-a566-00a330a738fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863b0630ebde7534e93ebf2952dab729566760278539e87efa4412389803c5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-klx9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:12Z 
is after 2025-08-24T17:21:41Z" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.670837 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0319f7-8ee3-4392-a36a-419161391db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7f52f3d22574d0a01cdfd7b7a40caf1a6cf201dc719e35f40eae85a071286f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\
\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3064f5dde5317ed6c1dba4ecdcf4da81c2451262d83e3e2826c6ebbfe1487ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13697e6470d481451982948653db44d08baa70466d010442534eaa249e58bae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c88256a72c667695563af6b37d01d958621c1ca6cbdaf474364bd6c8128c4409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6efb0bccc51deff2303655e7a8d3a6261a8b3c9425f6d94120cd1acf27fd7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10
-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:12Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.684600 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:12Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.695657 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c2b44fd8fb3c01bbc8a1b2f5a3507af28b2aa79a3d6ab8e7de3945bbfd01e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a472679f03ab86aa0a31a2ff3affe48d8e289a76db949bcc6ea10446fd08fdd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:12Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.701418 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.701456 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.701464 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.701478 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.701487 4669 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:12Z","lastTransitionTime":"2025-10-08T20:45:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.708297 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:12Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.719433 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c397b74921593a42fb7626e545778d80c506f0bbce7bc425b75c77a222c770e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T20:45:12Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.729616 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zcf2d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a016bee1-2c29-46bb-b3b8-841c4a65e162\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fa5b9befc8fb3a83cb6dd6097014bfe9fd0b905b4bf8fbdcccd4fdfb62ab410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-flsl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zcf2d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:12Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.739962 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c9bcf2-9580-4534-8c7e-886bd4aff469\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d8b81cfea1e9e0c9b30427e8b8cb07
b463c6ef45afb8379aa006d71bccd82a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bd09b1fcc78173d03292522a284e68e59f374def13fd6830f24a31e1138c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
5-10-08T20:45:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw2kf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:12Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.754415 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d080d327-7e4d-41af-aa15-0ce849523815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8127834da98ef46a594a74cbfcc6ef779b8429046327546560b7b37085572c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f61c1793e6c95085b6964298f29b5f896451784046a6aee1c73bbda234a3bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e76fd3bc937fc2e56c3d332e4d3822a2749d040c57cd94f6e3bcdcfd83c126bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"W1008 20:44:56.871605 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 20:44:56.872089 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759956296 cert, and key in /tmp/serving-cert-3424828285/serving-signer.crt, /tmp/serving-cert-3424828285/serving-signer.key\\\\nI1008 20:44:57.365674 1 observer_polling.go:159] Starting file observer\\\\nW1008 20:45:02.381062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1008 
20:45:02.381192 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 20:45:02.381876 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3424828285/tls.crt::/tmp/serving-cert-3424828285/tls.key\\\\\\\"\\\\nI1008 20:45:02.718633 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 20:45:02.726325 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 20:45:02.726358 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 20:45:02.726380 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 20:45:02.726384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 20:45:02.731456 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 20:45:02.731985 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 20:45:02.731867 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 20:45:02.733228 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d213380e32b3db218facfef313963d26689d2f0871d2a004a63380454fac8a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:12Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.765798 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c0e064d976a7c307fd13ec11ae76672cc1225b71a616f171626ee1f9a24531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\
"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:12Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.785754 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://408dd840918000b1689c3d828a51173deebf8d00fc97450975b35e5149d3cfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://334a09deac921308c4d6053bdcc2bbc096acc8ec24875190efb1c07b22d01c69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c03e0c827468d80fa326ee46ee88ad6adfe4236f4df9843324d2b247d0716087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92bc23ad705dcc8b8524159bc37254ce2306e7b502b914eaac7a6525fdd44f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed9a574189bcc7f84b93c5e821e944b0f94679084a30270d6634c7d19e67c470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c98951504fc67f4ee155f081da1039b39da4d528ddd7b4c2ddba8bbfd06238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpzdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:12Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.800227 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-flswm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"609156f9-39b1-4330-83a2-eabf82f4228f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749545f8f4b6269a7
0b747fee79dc8d419b62054f507b0d819b63aa68c44bb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-flswm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:12Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.804082 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.804117 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.804127 4669 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.804140 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.804149 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:12Z","lastTransitionTime":"2025-10-08T20:45:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.817655 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b822af4b-b157-4b05-9af4-7798315f365f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d615b49ade5de43393d40344c1b71733acedb541841b3ec34d6dd293e62f96c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07
372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07c33fe9c40fb9b53e940940c3fe2b8b63a94b0f867aa804d215cb3ba90d01c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569ba2e70b947eea1e531ab7e8f1ac2e3441ade593dd48910407df766217d87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f2d8af11793121a84b4559833f410bd59a8bb122d88da0d3b55d7dcbbf57a9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:12Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.834013 4669 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:12Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.850232 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bfcvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09be2dce24bba0d88a36f2d85e6280e6806f9b6cf59ec3950513e976c97429e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d12e
f6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d12ef6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bfcvh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:12Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.906584 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.906649 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.906673 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.906700 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:12 crc kubenswrapper[4669]: I1008 20:45:12.906721 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:12Z","lastTransitionTime":"2025-10-08T20:45:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:13 crc kubenswrapper[4669]: I1008 20:45:13.009116 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:13 crc kubenswrapper[4669]: I1008 20:45:13.009192 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:13 crc kubenswrapper[4669]: I1008 20:45:13.009216 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:13 crc kubenswrapper[4669]: I1008 20:45:13.009245 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:13 crc kubenswrapper[4669]: I1008 20:45:13.009272 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:13Z","lastTransitionTime":"2025-10-08T20:45:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:13 crc kubenswrapper[4669]: I1008 20:45:13.112580 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:13 crc kubenswrapper[4669]: I1008 20:45:13.112624 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:13 crc kubenswrapper[4669]: I1008 20:45:13.112640 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:13 crc kubenswrapper[4669]: I1008 20:45:13.112661 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:13 crc kubenswrapper[4669]: I1008 20:45:13.112678 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:13Z","lastTransitionTime":"2025-10-08T20:45:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:13 crc kubenswrapper[4669]: I1008 20:45:13.214839 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:13 crc kubenswrapper[4669]: I1008 20:45:13.214884 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:13 crc kubenswrapper[4669]: I1008 20:45:13.214895 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:13 crc kubenswrapper[4669]: I1008 20:45:13.214911 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:13 crc kubenswrapper[4669]: I1008 20:45:13.214921 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:13Z","lastTransitionTime":"2025-10-08T20:45:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:13 crc kubenswrapper[4669]: I1008 20:45:13.316973 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:13 crc kubenswrapper[4669]: I1008 20:45:13.317004 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:13 crc kubenswrapper[4669]: I1008 20:45:13.317014 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:13 crc kubenswrapper[4669]: I1008 20:45:13.317031 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:13 crc kubenswrapper[4669]: I1008 20:45:13.317042 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:13Z","lastTransitionTime":"2025-10-08T20:45:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:13 crc kubenswrapper[4669]: I1008 20:45:13.420157 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:13 crc kubenswrapper[4669]: I1008 20:45:13.420212 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:13 crc kubenswrapper[4669]: I1008 20:45:13.420234 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:13 crc kubenswrapper[4669]: I1008 20:45:13.420264 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:13 crc kubenswrapper[4669]: I1008 20:45:13.420287 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:13Z","lastTransitionTime":"2025-10-08T20:45:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:13 crc kubenswrapper[4669]: I1008 20:45:13.524061 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:13 crc kubenswrapper[4669]: I1008 20:45:13.524138 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:13 crc kubenswrapper[4669]: I1008 20:45:13.524162 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:13 crc kubenswrapper[4669]: I1008 20:45:13.524192 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:13 crc kubenswrapper[4669]: I1008 20:45:13.524214 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:13Z","lastTransitionTime":"2025-10-08T20:45:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:13 crc kubenswrapper[4669]: I1008 20:45:13.626589 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:13 crc kubenswrapper[4669]: I1008 20:45:13.626633 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:13 crc kubenswrapper[4669]: I1008 20:45:13.626645 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:13 crc kubenswrapper[4669]: I1008 20:45:13.626665 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:13 crc kubenswrapper[4669]: I1008 20:45:13.626678 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:13Z","lastTransitionTime":"2025-10-08T20:45:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:13 crc kubenswrapper[4669]: I1008 20:45:13.728891 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:13 crc kubenswrapper[4669]: I1008 20:45:13.728940 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:13 crc kubenswrapper[4669]: I1008 20:45:13.728951 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:13 crc kubenswrapper[4669]: I1008 20:45:13.728967 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:13 crc kubenswrapper[4669]: I1008 20:45:13.728979 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:13Z","lastTransitionTime":"2025-10-08T20:45:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:13 crc kubenswrapper[4669]: I1008 20:45:13.831641 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:13 crc kubenswrapper[4669]: I1008 20:45:13.831716 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:13 crc kubenswrapper[4669]: I1008 20:45:13.831737 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:13 crc kubenswrapper[4669]: I1008 20:45:13.831764 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:13 crc kubenswrapper[4669]: I1008 20:45:13.831783 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:13Z","lastTransitionTime":"2025-10-08T20:45:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:13 crc kubenswrapper[4669]: I1008 20:45:13.934040 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:13 crc kubenswrapper[4669]: I1008 20:45:13.934075 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:13 crc kubenswrapper[4669]: I1008 20:45:13.934087 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:13 crc kubenswrapper[4669]: I1008 20:45:13.934105 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:13 crc kubenswrapper[4669]: I1008 20:45:13.934117 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:13Z","lastTransitionTime":"2025-10-08T20:45:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.036331 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.036417 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.036441 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.036472 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.036495 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:14Z","lastTransitionTime":"2025-10-08T20:45:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.139554 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.139624 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.139637 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.139655 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.139668 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:14Z","lastTransitionTime":"2025-10-08T20:45:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.242841 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.242901 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.242915 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.242933 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.243260 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:14Z","lastTransitionTime":"2025-10-08T20:45:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.330399 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:45:14 crc kubenswrapper[4669]: E1008 20:45:14.330577 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.330654 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.330707 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:45:14 crc kubenswrapper[4669]: E1008 20:45:14.330821 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:45:14 crc kubenswrapper[4669]: E1008 20:45:14.330924 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.346795 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.346839 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.346848 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.346863 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.346872 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:14Z","lastTransitionTime":"2025-10-08T20:45:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.449016 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.449052 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.449061 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.449074 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.449083 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:14Z","lastTransitionTime":"2025-10-08T20:45:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.550803 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.550855 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.550868 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.550885 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.550897 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:14Z","lastTransitionTime":"2025-10-08T20:45:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.577429 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gpzdw_cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7/ovnkube-controller/0.log" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.580243 4669 generic.go:334] "Generic (PLEG): container finished" podID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerID="b6c98951504fc67f4ee155f081da1039b39da4d528ddd7b4c2ddba8bbfd06238" exitCode=1 Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.580281 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" event={"ID":"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7","Type":"ContainerDied","Data":"b6c98951504fc67f4ee155f081da1039b39da4d528ddd7b4c2ddba8bbfd06238"} Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.580859 4669 scope.go:117] "RemoveContainer" containerID="b6c98951504fc67f4ee155f081da1039b39da4d528ddd7b4c2ddba8bbfd06238" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.605451 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0319f7-8ee3-4392-a36a-419161391db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7f52f3d22574d0a01cdfd7b7a40caf1a6cf201dc719e35f40eae85a071286f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3064f5dde5317ed6c1dba4ecdcf4da81c2451262d83e3e2826c6ebbfe1487ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13697e6470d481451982948653db44d08baa70466d010442534eaa249e58bae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c88256a72c667695563af6b37d01d958621c1ca6cbdaf474364bd6c8128c4409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6efb0bccc51deff2303655e7a8d3a6261a8b3c9425f6d94120cd1acf27fd7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:14Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.623587 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:14Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.638234 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c2b44fd8fb3c01bbc8a1b2f5a3507af28b2aa79a3d6ab8e7de3945bbfd01e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a472679f03ab86aa0a31a2ff3affe48d8e289a76db949bcc6ea10446fd08fdd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:14Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.651338 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-klx9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2433400c-98f8-490f-a566-00a330a738fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863b0630ebde7534e93ebf2952dab729566760278539e87efa4412389803c5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-klx9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:14Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.653342 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:14 crc 
kubenswrapper[4669]: I1008 20:45:14.653379 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.653393 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.653415 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.653430 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:14Z","lastTransitionTime":"2025-10-08T20:45:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.665026 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:14Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.676824 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c397b74921593a42fb7626e545778d80c506f0bbce7bc425b75c77a222c770e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T20:45:14Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.688314 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zcf2d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a016bee1-2c29-46bb-b3b8-841c4a65e162\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fa5b9befc8fb3a83cb6dd6097014bfe9fd0b905b4bf8fbdcccd4fdfb62ab410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-flsl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zcf2d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:14Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.700210 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c9bcf2-9580-4534-8c7e-886bd4aff469\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d8b81cfea1e9e0c9b30427e8b8cb07
b463c6ef45afb8379aa006d71bccd82a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bd09b1fcc78173d03292522a284e68e59f374def13fd6830f24a31e1138c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
5-10-08T20:45:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw2kf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:14Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.714145 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d080d327-7e4d-41af-aa15-0ce849523815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8127834da98ef46a594a74cbfcc6ef779b8429046327546560b7b37085572c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f61c1793e6c95085b6964298f29b5f896451784046a6aee1c73bbda234a3bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e76fd3bc937fc2e56c3d332e4d3822a2749d040c57cd94f6e3bcdcfd83c126bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"W1008 20:44:56.871605 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 20:44:56.872089 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759956296 cert, and key in /tmp/serving-cert-3424828285/serving-signer.crt, /tmp/serving-cert-3424828285/serving-signer.key\\\\nI1008 20:44:57.365674 1 observer_polling.go:159] Starting file observer\\\\nW1008 20:45:02.381062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1008 
20:45:02.381192 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 20:45:02.381876 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3424828285/tls.crt::/tmp/serving-cert-3424828285/tls.key\\\\\\\"\\\\nI1008 20:45:02.718633 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 20:45:02.726325 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 20:45:02.726358 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 20:45:02.726380 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 20:45:02.726384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 20:45:02.731456 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 20:45:02.731985 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 20:45:02.731867 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 20:45:02.733228 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d213380e32b3db218facfef313963d26689d2f0871d2a004a63380454fac8a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:14Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.726951 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c0e064d976a7c307fd13ec11ae76672cc1225b71a616f171626ee1f9a24531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\
"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:14Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.745018 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://408dd840918000b1689c3d828a51173deebf8d00fc97450975b35e5149d3cfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://334a09deac921308c4d6053bdcc2bbc096acc8ec24875190efb1c07b22d01c69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c03e0c827468d80fa326ee46ee88ad6adfe4236f4df9843324d2b247d0716087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92bc23ad705dcc8b8524159bc37254ce2306e7b502b914eaac7a6525fdd44f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed9a574189bcc7f84b93c5e821e944b0f94679084a30270d6634c7d19e67c470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c98951504fc67f4ee155f081da1039b39da4d528ddd7b4c2ddba8bbfd06238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6c98951504fc67f4ee155f081da1039b39da4d528ddd7b4c2ddba8bbfd06238\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T20
:45:14Z\\\",\\\"message\\\":\\\"o:140\\\\nI1008 20:45:14.483954 5980 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1008 20:45:14.484248 5980 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 20:45:14.484282 5980 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 20:45:14.484621 5980 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1008 20:45:14.484645 5980 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1008 20:45:14.484675 5980 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1008 20:45:14.484689 5980 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1008 20:45:14.484725 5980 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1008 20:45:14.484752 5980 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1008 20:45:14.484759 5980 factory.go:656] Stopping watch factory\\\\nI1008 20:45:14.484775 5980 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1008 20:45:14.484784 5980 ovnkube.go:599] Stopped ovnkube\\\\nI1008 
2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpzdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:14Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.755605 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.755637 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.755645 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.755660 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.755669 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:14Z","lastTransitionTime":"2025-10-08T20:45:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.759337 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b822af4b-b157-4b05-9af4-7798315f365f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d615b49ade5de43393d40344c1b71733acedb541841b3ec34d6dd293e62f96c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir
\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07c33fe9c40fb9b53e940940c3fe2b8b63a94b0f867aa804d215cb3ba90d01c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569ba2e70b947eea1e531ab7e8f1ac2e3441ade593dd48910407df766217d87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f2d8af11793121a84b4559833f410bd59a8bb122d88da0d3b55d7dcbbf57a9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b8
2799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:14Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.773069 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:14Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.786747 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bfcvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09be2dce24bba0d88a36f2d85e6280e6806f9b6cf59ec3950513e976c97429e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d12e
f6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d12ef6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bfcvh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:14Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.800070 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-flswm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"609156f9-39b1-4330-83a2-eabf82f4228f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749545f8f4b6269a70b747fee79dc8d419b62054f507b0d819b63aa68c44bb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-flswm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:14Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.860282 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.860334 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.860348 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.860368 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.860385 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:14Z","lastTransitionTime":"2025-10-08T20:45:14Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.962756 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.962802 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.962814 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.962832 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:14 crc kubenswrapper[4669]: I1008 20:45:14.962844 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:14Z","lastTransitionTime":"2025-10-08T20:45:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.064688 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.064719 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.064729 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.064741 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.064750 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:15Z","lastTransitionTime":"2025-10-08T20:45:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.166950 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.166989 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.166997 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.167011 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.167020 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:15Z","lastTransitionTime":"2025-10-08T20:45:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.269882 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.269920 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.269928 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.269940 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.269950 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:15Z","lastTransitionTime":"2025-10-08T20:45:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.372103 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.372176 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.372200 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.372227 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.372248 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:15Z","lastTransitionTime":"2025-10-08T20:45:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.474873 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.474918 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.474928 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.474942 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.474951 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:15Z","lastTransitionTime":"2025-10-08T20:45:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.579875 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.579929 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.579943 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.579962 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.579972 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:15Z","lastTransitionTime":"2025-10-08T20:45:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.585165 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gpzdw_cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7/ovnkube-controller/0.log" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.588124 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" event={"ID":"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7","Type":"ContainerStarted","Data":"79887114a7a0f726ed3bcb0b6d5e32e90cabfe64a4d5d13f6c1c0e2d01ae9f08"} Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.588698 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.600693 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zcf2d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a016bee1-2c29-46bb-b3b8-841c4a65e162\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fa5b9befc8fb3a83cb6dd6097014bfe9fd0
b905b4bf8fbdcccd4fdfb62ab410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flsl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zcf2d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:15Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.616273 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c9bcf2-9580-4534-8c7e-886bd4aff469\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d8b81cfea1e9e0c9b30427e8b8cb07b463c6ef45afb8379aa006d71bccd82a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bd09b1fcc78173d03292522a284e68e59f374
def13fd6830f24a31e1138c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw2kf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:15Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.634029 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:15Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.649854 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c397b74921593a42fb7626e545778d80c506f0bbce7bc425b75c77a222c770e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T20:45:15Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.671059 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://408dd840918000b1689c3d828a51173deebf8d00fc97450975b35e5149d3cfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://334a09deac921308c4d6053bdcc2bbc096acc8ec24875190efb1c07b22d01c69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c03e0c827468d80fa326ee46ee88ad6adfe4236f4df9843324d2b247d0716087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92bc23ad705dcc8b8524159bc37254ce2306e7b502b914eaac7a6525fdd44f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed9a574189bcc7f84b93c5e821e944b0f94679084a30270d6634c7d19e67c470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79887114a7a0f726ed3bcb0b6d5e32e90cabfe64a4d5d13f6c1c0e2d01ae9f08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6c98951504fc67f4ee155f081da1039b39da4d528ddd7b4c2ddba8bbfd06238\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T20:45:14Z\\\",\\\"message\\\":\\\"o:140\\\\nI1008 20:45:14.483954 5980 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1008 20:45:14.484248 5980 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 20:45:14.484282 5980 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 20:45:14.484621 5980 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1008 20:45:14.484645 5980 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1008 20:45:14.484675 5980 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1008 20:45:14.484689 5980 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1008 20:45:14.484725 5980 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1008 20:45:14.484752 5980 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1008 20:45:14.484759 5980 factory.go:656] Stopping watch factory\\\\nI1008 20:45:14.484775 5980 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1008 20:45:14.484784 5980 ovnkube.go:599] Stopped ovnkube\\\\nI1008 
2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpzdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:15Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.682619 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.682669 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.682684 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.682705 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.682720 4669 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:15Z","lastTransitionTime":"2025-10-08T20:45:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.690036 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d080d327-7e4d-41af-aa15-0ce849523815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8127834da98ef46a594a74cbfcc6ef779b8429046327546560b7b37085572c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f61c1793e6c95085b6964298f29b5f896451784046a6aee1c73bbda234a3bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e76fd3bc937fc2e56c3d332e4d3822a2749d040c57cd94f6e3bcdcfd83c126bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"W1008 20:44:56.871605 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 20:44:56.872089 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759956296 cert, and key in /tmp/serving-cert-3424828285/serving-signer.crt, /tmp/serving-cert-3424828285/serving-signer.key\\\\nI1008 20:44:57.365674 1 observer_polling.go:159] Starting file observer\\\\nW1008 20:45:02.381062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1008 
20:45:02.381192 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 20:45:02.381876 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3424828285/tls.crt::/tmp/serving-cert-3424828285/tls.key\\\\\\\"\\\\nI1008 20:45:02.718633 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 20:45:02.726325 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 20:45:02.726358 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 20:45:02.726380 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 20:45:02.726384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 20:45:02.731456 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 20:45:02.731985 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 20:45:02.731867 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 20:45:02.733228 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d213380e32b3db218facfef313963d26689d2f0871d2a004a63380454fac8a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:15Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.706742 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c0e064d976a7c307fd13ec11ae76672cc1225b71a616f171626ee1f9a24531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\
"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:15Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.724316 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:15Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.740262 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bfcvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09be2dce24bba0d88a36f2d85e6280e6806f9b6cf59ec3950513e976c97429e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d12e
f6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d12ef6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bfcvh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:15Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.750561 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-flswm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"609156f9-39b1-4330-83a2-eabf82f4228f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749545f8f4b6269a70b747fee79dc8d419b62054f507b0d819b63aa68c44bb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-flswm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:15Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.769500 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b822af4b-b157-4b05-9af4-7798315f365f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d615b49ade5de43393d40344c1b71733acedb541841b3ec34d6dd293e62f96c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07c33fe9c40fb9b53e940940c3fe2b8b63a94b0f867aa804d215cb3ba90d01c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569ba2e70b947eea1e531ab7e8f1ac2e3441ade593dd48910407df766217d87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f2d8af11793121a84b4559833f410bd59a8bb122d88da0d3b55d7dcbbf57a9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:15Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.783713 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:15Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.785321 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.785375 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.785389 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:15 crc 
kubenswrapper[4669]: I1008 20:45:15.785410 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.785422 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:15Z","lastTransitionTime":"2025-10-08T20:45:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.803494 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c2b44fd8fb3c01bbc8a1b2f5a3507af28b2aa79a3d6ab8e7de3945bbfd01e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a472679f03ab86aa0a31a2ff3affe48d8e289a76db949bcc6ea10446fd08fdd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:15Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 
20:45:15.821912 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-klx9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2433400c-98f8-490f-a566-00a330a738fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863b0630ebde7534e93ebf2952dab729566760278539e87efa4412389803c5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/n
et.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-klx9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:15Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 
20:45:15.877821 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0319f7-8ee3-4392-a36a-419161391db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7f52f3d22574d0a01cdfd7b7a40caf1a6cf201dc719e35f40eae85a071286f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://3064f5dde5317ed6c1dba4ecdcf4da81c2451262d83e3e2826c6ebbfe1487ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13697e6470d481451982948653db44d08baa70466d010442534eaa249e58bae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c88256a72c667695563af6b37d01d958621c1ca6cbdaf474364bd6c8128c4409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6efb0bccc51deff2303655e7a8d3a6261a8b3c9425f6d94120cd1acf27fd7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":
{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernet
es/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:15Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.887855 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.887898 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.887913 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.887932 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.887946 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:15Z","lastTransitionTime":"2025-10-08T20:45:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.989823 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.989873 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.989888 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.989910 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:15 crc kubenswrapper[4669]: I1008 20:45:15.989926 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:15Z","lastTransitionTime":"2025-10-08T20:45:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.092956 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.092998 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.093008 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.093025 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.093036 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:16Z","lastTransitionTime":"2025-10-08T20:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.196558 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.196637 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.196657 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.196685 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.196705 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:16Z","lastTransitionTime":"2025-10-08T20:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.299493 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.299559 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.299575 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.299593 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.299608 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:16Z","lastTransitionTime":"2025-10-08T20:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.329997 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.330096 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:45:16 crc kubenswrapper[4669]: E1008 20:45:16.330159 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.330002 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:45:16 crc kubenswrapper[4669]: E1008 20:45:16.330258 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:45:16 crc kubenswrapper[4669]: E1008 20:45:16.330337 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.331377 4669 scope.go:117] "RemoveContainer" containerID="8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.403623 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.403670 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.403684 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.403704 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.403718 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:16Z","lastTransitionTime":"2025-10-08T20:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.506616 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.506664 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.506675 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.506691 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.506702 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:16Z","lastTransitionTime":"2025-10-08T20:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.593424 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gpzdw_cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7/ovnkube-controller/1.log" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.593887 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gpzdw_cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7/ovnkube-controller/0.log" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.597426 4669 generic.go:334] "Generic (PLEG): container finished" podID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerID="79887114a7a0f726ed3bcb0b6d5e32e90cabfe64a4d5d13f6c1c0e2d01ae9f08" exitCode=1 Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.597496 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" event={"ID":"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7","Type":"ContainerDied","Data":"79887114a7a0f726ed3bcb0b6d5e32e90cabfe64a4d5d13f6c1c0e2d01ae9f08"} Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.597548 4669 scope.go:117] "RemoveContainer" containerID="b6c98951504fc67f4ee155f081da1039b39da4d528ddd7b4c2ddba8bbfd06238" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.597969 4669 scope.go:117] "RemoveContainer" containerID="79887114a7a0f726ed3bcb0b6d5e32e90cabfe64a4d5d13f6c1c0e2d01ae9f08" Oct 08 20:45:16 crc kubenswrapper[4669]: E1008 20:45:16.598099 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-gpzdw_openshift-ovn-kubernetes(cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.599709 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.601391 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"33aff5ef2ae82f810d3b3e66effb80087fa92081419227e4fb66a6aa80468ff7"} Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.601837 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.609156 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.609188 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.609197 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.609212 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.609223 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:16Z","lastTransitionTime":"2025-10-08T20:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.613873 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:16Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.625946 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c2b44fd8fb3c01bbc8a1b2f5a3507af28b2aa79a3d6ab8e7de3945bbfd01e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a472679f03ab86aa0a31a2ff3affe48d8e289a76db949bcc6ea10446fd08fdd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:16Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.638321 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-klx9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2433400c-98f8-490f-a566-00a330a738fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863b0630ebde7534e93ebf2952dab729566760278539e87efa4412389803c5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-klx9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:16Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.648860 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bl6pv"] Oct 08 
20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.649395 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bl6pv" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.651500 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.653228 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.660799 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0319f7-8ee3-4392-a36a-419161391db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7f52f3d22574d0a01cdfd7b7a40caf1a6cf201dc719e35f40eae85a071286f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5
646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3064f5dde5317ed6c1dba4ecdcf4da81c2451262d83e3e2826c6ebbfe1487ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13697e6470d481451982948653db44d08baa70466d010442534eaa249e58bae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c88256a72c667695563af6b37d01d958621c1ca6cbdaf474364bd6c8128c4409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6efb0bccc51deff2303655e7a8d3a6261a8b3c9425f6d94120cd1acf27fd7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\
\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9009
2272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:16Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.672750 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zcf2d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a016bee1-2c29-46bb-b3b8-841c4a65e162\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fa5b9befc8fb3a83cb6dd6097014bfe9fd0b905b4bf8fbdcccd4fdfb62ab410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flsl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zcf2d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:16Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.683708 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c9bcf2-9580-4534-8c7e-886bd4aff469\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d8b81cfea1e9e0c9b30427e8b8cb07b463c6ef45afb8379aa006d71bccd82a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bd09b1fcc78173d03292522a284e68e59f374def13fd6830f24a31e1138c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw2kf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:16Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.694630 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:16Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.704927 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c397b74921593a42fb7626e545778d80c506f0bbce7bc425b75c77a222c770e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T20:45:16Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.710190 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbw65\" (UniqueName: \"kubernetes.io/projected/5ac60c10-afa3-424e-9aa2-060e32f4a40f-kube-api-access-pbw65\") pod \"ovnkube-control-plane-749d76644c-bl6pv\" (UID: \"5ac60c10-afa3-424e-9aa2-060e32f4a40f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bl6pv" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.710390 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5ac60c10-afa3-424e-9aa2-060e32f4a40f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-bl6pv\" (UID: \"5ac60c10-afa3-424e-9aa2-060e32f4a40f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bl6pv" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.710446 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5ac60c10-afa3-424e-9aa2-060e32f4a40f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-bl6pv\" (UID: \"5ac60c10-afa3-424e-9aa2-060e32f4a40f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bl6pv" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.710512 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5ac60c10-afa3-424e-9aa2-060e32f4a40f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-bl6pv\" (UID: \"5ac60c10-afa3-424e-9aa2-060e32f4a40f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bl6pv" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.711367 4669 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.711417 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.711429 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.711446 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.711460 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:16Z","lastTransitionTime":"2025-10-08T20:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.726067 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://408dd840918000b1689c3d828a51173deebf8d00fc97450975b35e5149d3cfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://334a09deac921308c4d6053bdcc2bbc096acc8ec24875190efb1c07b22d01c69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c03e0c827468d80fa326ee46ee88ad6adfe4236f4df9843324d2b247d0716087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92bc23ad705dcc8b8524159bc37254ce2306e7b502b914eaac7a6525fdd44f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed9a574189bcc7f84b93c5e821e944b0f94679084a30270d6634c7d19e67c470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79887114a7a0f726ed3bcb0b6d5e32e90cabfe64a4d5d13f6c1c0e2d01ae9f08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6c98951504fc67f4ee155f081da1039b39da4d528ddd7b4c2ddba8bbfd06238\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T20:45:14Z\\\",\\\"message\\\":\\\"o:140\\\\nI1008 20:45:14.483954 5980 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1008 20:45:14.484248 5980 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 20:45:14.484282 5980 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 20:45:14.484621 5980 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1008 20:45:14.484645 5980 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1008 20:45:14.484675 5980 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1008 20:45:14.484689 5980 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1008 20:45:14.484725 5980 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1008 20:45:14.484752 5980 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1008 20:45:14.484759 5980 factory.go:656] Stopping watch factory\\\\nI1008 20:45:14.484775 5980 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1008 20:45:14.484784 5980 ovnkube.go:599] Stopped ovnkube\\\\nI1008 2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79887114a7a0f726ed3bcb0b6d5e32e90cabfe64a4d5d13f6c1c0e2d01ae9f08\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T20:45:15Z\\\",\\\"message\\\":\\\"to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:15Z is after 2025-08-24T17:21:41Z]\\\\nI1008 20:45:15.384024 6116 
obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI1008 20:45:15.383454 6116 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-gpzdw after 0 failed attempt(s)\\\\nI1008 20:45:15.384045 6116 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1008 20:45:15.384049 6116 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-gpzdw\\\\nI1008 20:45:15.384024 6116 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cn\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\
\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-override
s\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpzdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:16Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.740648 
4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d080d327-7e4d-41af-aa15-0ce849523815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8127834da98ef46a594a74cbfcc6ef779b8429046327546560b7b37085572c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\
\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f61c1793e6c95085b6964298f29b5f896451784046a6aee1c73bbda234a3bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd3bc937fc2e56c3d332e4d3822a2749d040c57cd94f6e3bcdcfd83c126bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814
a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"W1008 20:44:56.871605 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 20:44:56.872089 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759956296 cert, and key in /tmp/serving-cert-3424828285/serving-signer.crt, /tmp/serving-cert-3424828285/serving-signer.key\\\\nI1008 20:44:57.365674 1 observer_polling.go:159] Starting file observer\\\\nW1008 20:45:02.381062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1008 20:45:02.381192 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 20:45:02.381876 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3424828285/tls.crt::/tmp/serving-cert-3424828285/tls.key\\\\\\\"\\\\nI1008 20:45:02.718633 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 20:45:02.726325 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 20:45:02.726358 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 20:45:02.726380 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 20:45:02.726384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 20:45:02.731456 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 20:45:02.731985 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 20:45:02.731867 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 20:45:02.733228 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d213380e32b3db218facfef313963d26689d2f0871d2a004a63380454fac8a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",
\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:16Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.752936 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c0e064d976a7c307fd13ec11ae76672cc1225b71a616f171626ee1f9a24531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:16Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.764664 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:16Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.777750 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bfcvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09be2dce24bba0d88a36f2d85e6280e6806f9b6cf59ec3950513e976c97429e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d12e
f6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d12ef6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bfcvh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:16Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.786623 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-flswm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"609156f9-39b1-4330-83a2-eabf82f4228f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749545f8f4b6269a70b747fee79dc8d419b62054f507b0d819b63aa68c44bb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-flswm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:16Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.801940 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b822af4b-b157-4b05-9af4-7798315f365f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d615b49ade5de43393d40344c1b71733acedb541841b3ec34d6dd293e62f96c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07c33fe9c40fb9b53e940940c3fe2b8b63a94b0f867aa804d215cb3ba90d01c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569ba2e70b947eea1e531ab7e8f1ac2e3441ade593dd48910407df766217d87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f2d8af11793121a84b4559833f410bd59a8bb122d88da0d3b55d7dcbbf57a9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:16Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.811551 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbw65\" (UniqueName: \"kubernetes.io/projected/5ac60c10-afa3-424e-9aa2-060e32f4a40f-kube-api-access-pbw65\") pod \"ovnkube-control-plane-749d76644c-bl6pv\" (UID: \"5ac60c10-afa3-424e-9aa2-060e32f4a40f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bl6pv" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.811663 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5ac60c10-afa3-424e-9aa2-060e32f4a40f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-bl6pv\" (UID: \"5ac60c10-afa3-424e-9aa2-060e32f4a40f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bl6pv" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.811717 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/5ac60c10-afa3-424e-9aa2-060e32f4a40f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-bl6pv\" (UID: \"5ac60c10-afa3-424e-9aa2-060e32f4a40f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bl6pv" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.811787 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5ac60c10-afa3-424e-9aa2-060e32f4a40f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-bl6pv\" (UID: \"5ac60c10-afa3-424e-9aa2-060e32f4a40f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bl6pv" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.812448 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5ac60c10-afa3-424e-9aa2-060e32f4a40f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-bl6pv\" (UID: \"5ac60c10-afa3-424e-9aa2-060e32f4a40f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bl6pv" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.814378 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.814417 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.814430 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.814446 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.814461 4669 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:16Z","lastTransitionTime":"2025-10-08T20:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.815346 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5ac60c10-afa3-424e-9aa2-060e32f4a40f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-bl6pv\" (UID: \"5ac60c10-afa3-424e-9aa2-060e32f4a40f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bl6pv" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.816799 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b822af4b-b157-4b05-9af4-7798315f365f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d615b49ade5de43393d40344c1b71733acedb541841b3ec34d6dd293e62f96c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07c33fe9c40fb9b53e940940c3fe2b8b63a94b0f867aa804d215cb3ba90d01c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569ba2e70b947eea1e531ab7e8f1ac2e3441ade593dd48910407df766217d87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f2d8af11793121a84b4559833f410bd59a8bb122d88da0d3b55d7dcbbf57a9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:16Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.817885 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5ac60c10-afa3-424e-9aa2-060e32f4a40f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-bl6pv\" (UID: \"5ac60c10-afa3-424e-9aa2-060e32f4a40f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bl6pv" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.835342 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:16Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.837746 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbw65\" (UniqueName: \"kubernetes.io/projected/5ac60c10-afa3-424e-9aa2-060e32f4a40f-kube-api-access-pbw65\") pod \"ovnkube-control-plane-749d76644c-bl6pv\" (UID: \"5ac60c10-afa3-424e-9aa2-060e32f4a40f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bl6pv" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.857843 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bfcvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09be2dce24bba0d88a36f2d85e6280e6806f9b6cf59ec3950513e976c97429e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d12e
f6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d12ef6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bfcvh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:16Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.867277 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-flswm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"609156f9-39b1-4330-83a2-eabf82f4228f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749545f8f4b6269a70b747fee79dc8d419b62054f507b0d819b63aa68c44bb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-flswm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:16Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.887890 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0319f7-8ee3-4392-a36a-419161391db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7f52f3d22574d0a01cdfd7b7a40caf1a6cf201dc719e35f40eae85a071286f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3064f5dde5317ed6c1dba4ecdcf4da81c2451262d83e3e2826c6ebbfe1487ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13697e6470d481451982948653db44d08baa70466d010442534eaa249e58bae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c88256a72c667695563af6b37d01d958621c1ca6cbdaf474364bd6c8128c4409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6efb0bccc51deff2303655e7a8d3a6261a8b3c9425f6d94120cd1acf27fd7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:16Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.900735 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:16Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.914912 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c2b44fd8fb3c01bbc8a1b2f5a3507af28b2aa79a3d6ab8e7de3945bbfd01e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a472679f03ab86aa0a31a2ff3affe48d8e289a76db949bcc6ea10446fd08fdd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:16Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.916605 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.916637 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.916647 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.916664 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.916674 4669 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:16Z","lastTransitionTime":"2025-10-08T20:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.930777 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-klx9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2433400c-98f8-490f-a566-00a330a738fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863b0630ebde7534e93ebf2952dab729566760278539e87efa4412389803c5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:
45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-klx9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:16Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.942018 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:16Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.953331 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c397b74921593a42fb7626e545778d80c506f0bbce7bc425b75c77a222c770e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T20:45:16Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.961978 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zcf2d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a016bee1-2c29-46bb-b3b8-841c4a65e162\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fa5b9befc8fb3a83cb6dd6097014bfe9fd0b905b4bf8fbdcccd4fdfb62ab410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-flsl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zcf2d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:16Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.964878 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bl6pv" Oct 08 20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.977076 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c9bcf2-9580-4534-8c7e-886bd4aff469\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d8b81cfea1e9e0c9b30427e8b8cb07b463c6ef45afb8379aa006d71bccd82a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bd09b1fcc78173d03292522a284e68e59f374
def13fd6830f24a31e1138c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw2kf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:16Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:16 crc kubenswrapper[4669]: W1008 20:45:16.977916 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ac60c10_afa3_424e_9aa2_060e32f4a40f.slice/crio-850aec39a232b961a47dccbc708b134c2b652c3b605610827f1fa5e8497fc954 WatchSource:0}: Error finding container 850aec39a232b961a47dccbc708b134c2b652c3b605610827f1fa5e8497fc954: Status 404 returned error can't find the container with id 850aec39a232b961a47dccbc708b134c2b652c3b605610827f1fa5e8497fc954 Oct 08 
20:45:16 crc kubenswrapper[4669]: I1008 20:45:16.989752 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bl6pv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ac60c10-afa3-424e-9aa2-060e32f4a40f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbw65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbw65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bl6pv\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:16Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.005770 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d080d327-7e4d-41af-aa15-0ce849523815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8127834da98ef46a594a74cbfcc6ef779b8429046327546560b7b37085572c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f61c1793e6c95085b6964298f29b5f896451784046a6aee1c73bbda234a3bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e76fd3bc937fc2e56c3d332e4d3822a2749d040c57cd94f6e3bcdcfd83c126bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33aff5ef2ae82f810d3b3e66effb80087fa92081419227e4fb66a6aa80468ff7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"W1008 20:44:56.871605 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 20:44:56.872089 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759956296 cert, and key in /tmp/serving-cert-3424828285/serving-signer.crt, /tmp/serving-cert-3424828285/serving-signer.key\\\\nI1008 20:44:57.365674 1 observer_polling.go:159] Starting file observer\\\\nW1008 20:45:02.381062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1008 
20:45:02.381192 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 20:45:02.381876 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3424828285/tls.crt::/tmp/serving-cert-3424828285/tls.key\\\\\\\"\\\\nI1008 20:45:02.718633 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 20:45:02.726325 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 20:45:02.726358 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 20:45:02.726380 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 20:45:02.726384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 20:45:02.731456 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 20:45:02.731985 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 20:45:02.731867 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 20:45:02.733228 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d213380e32b3db218facfef313963d26689d2f0871d2a004a63380454fac8a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:17Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.018782 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c0e064d976a7c307fd13ec11ae76672cc1225b71a616f171626ee1f9a24531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:17Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.019401 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.019502 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.019615 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.019692 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.019769 4669 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:17Z","lastTransitionTime":"2025-10-08T20:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.040587 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://408dd840918000b1689c3d828a51173deebf8d00fc97450975b35e5149d3cfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://334a09deac921308c4d6053bdcc2bbc096acc8ec24875190efb1c07b22d01c69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c03e0c827468d80fa326ee46ee88ad6adfe4236f4df9843324d2b247d0716087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92bc23ad705dcc8b8524159bc37254ce2306e7b502b914eaac7a6525fdd44f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed9a574189bcc7f84b93c5e821e944b0f94679084a30270d6634c7d19e67c470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79887114a7a0f726ed3bcb0b6d5e32e90cabfe64a4d5d13f6c1c0e2d01ae9f08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6c98951504fc67f4ee155f081da1039b39da4d528ddd7b4c2ddba8bbfd06238\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T20:45:14Z\\\",\\\"message\\\":\\\"o:140\\\\nI1008 20:45:14.483954 5980 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1008 20:45:14.484248 5980 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 20:45:14.484282 5980 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 20:45:14.484621 5980 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1008 20:45:14.484645 5980 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1008 20:45:14.484675 5980 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1008 20:45:14.484689 5980 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1008 20:45:14.484725 5980 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1008 20:45:14.484752 5980 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1008 20:45:14.484759 5980 factory.go:656] Stopping watch factory\\\\nI1008 20:45:14.484775 5980 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1008 20:45:14.484784 5980 ovnkube.go:599] Stopped ovnkube\\\\nI1008 2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79887114a7a0f726ed3bcb0b6d5e32e90cabfe64a4d5d13f6c1c0e2d01ae9f08\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T20:45:15Z\\\",\\\"message\\\":\\\"to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:15Z is after 2025-08-24T17:21:41Z]\\\\nI1008 20:45:15.384024 6116 
obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI1008 20:45:15.383454 6116 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-gpzdw after 0 failed attempt(s)\\\\nI1008 20:45:15.384045 6116 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1008 20:45:15.384049 6116 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-gpzdw\\\\nI1008 20:45:15.384024 6116 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cn\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\
\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-override
s\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpzdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:17Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.121887 
4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.121918 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.121930 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.121946 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.121957 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:17Z","lastTransitionTime":"2025-10-08T20:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.224318 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.224359 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.224372 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.224390 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.224405 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:17Z","lastTransitionTime":"2025-10-08T20:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.326610 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.326660 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.326671 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.326687 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.326698 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:17Z","lastTransitionTime":"2025-10-08T20:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.428599 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.428644 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.428657 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.428673 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.428684 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:17Z","lastTransitionTime":"2025-10-08T20:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.530713 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.530755 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.530767 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.530782 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.530792 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:17Z","lastTransitionTime":"2025-10-08T20:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.605643 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bl6pv" event={"ID":"5ac60c10-afa3-424e-9aa2-060e32f4a40f","Type":"ContainerStarted","Data":"7e8bd9bc559c9623c06aa2f0324a6679f5d241e881db918904f3e1e97d56a20f"} Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.605705 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bl6pv" event={"ID":"5ac60c10-afa3-424e-9aa2-060e32f4a40f","Type":"ContainerStarted","Data":"91277ee733ac8aee89b1a7716b6dcebf57e7d24e5cab5615d88ac8ff90f6f5ec"} Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.605720 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bl6pv" event={"ID":"5ac60c10-afa3-424e-9aa2-060e32f4a40f","Type":"ContainerStarted","Data":"850aec39a232b961a47dccbc708b134c2b652c3b605610827f1fa5e8497fc954"} Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.607378 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gpzdw_cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7/ovnkube-controller/1.log" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.611541 4669 scope.go:117] "RemoveContainer" containerID="79887114a7a0f726ed3bcb0b6d5e32e90cabfe64a4d5d13f6c1c0e2d01ae9f08" Oct 08 20:45:17 crc kubenswrapper[4669]: E1008 20:45:17.611688 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-gpzdw_openshift-ovn-kubernetes(cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.620922 4669 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c0e064d976a7c307fd13ec11ae76672cc1225b71a616f171626ee1f9a24531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:17Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.633269 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.633329 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.633341 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.633356 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.633367 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:17Z","lastTransitionTime":"2025-10-08T20:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.637710 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://408dd840918000b1689c3d828a51173deebf8d00fc97450975b35e5149d3cfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://334a09deac921308c4d6053bdcc2bbc096acc8ec24875190efb1c07b22d01c69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c03e0c827468d80fa326ee46ee88ad6adfe4236f4df9843324d2b247d0716087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92bc23ad705dcc8b8524159bc37254ce2306e7b502b914eaac7a6525fdd44f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed9a574189bcc7f84b93c5e821e944b0f94679084a30270d6634c7d19e67c470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79887114a7a0f726ed3bcb0b6d5e32e90cabfe64a4d5d13f6c1c0e2d01ae9f08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6c98951504fc67f4ee155f081da1039b39da4d528ddd7b4c2ddba8bbfd06238\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T20:45:14Z\\\",\\\"message\\\":\\\"o:140\\\\nI1008 20:45:14.483954 5980 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1008 20:45:14.484248 5980 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 20:45:14.484282 5980 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 20:45:14.484621 5980 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1008 20:45:14.484645 5980 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1008 20:45:14.484675 5980 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1008 20:45:14.484689 5980 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1008 20:45:14.484725 5980 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1008 20:45:14.484752 5980 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1008 20:45:14.484759 5980 factory.go:656] Stopping watch factory\\\\nI1008 20:45:14.484775 5980 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1008 20:45:14.484784 5980 ovnkube.go:599] Stopped ovnkube\\\\nI1008 2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79887114a7a0f726ed3bcb0b6d5e32e90cabfe64a4d5d13f6c1c0e2d01ae9f08\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T20:45:15Z\\\",\\\"message\\\":\\\"to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:15Z is after 2025-08-24T17:21:41Z]\\\\nI1008 20:45:15.384024 6116 
obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI1008 20:45:15.383454 6116 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-gpzdw after 0 failed attempt(s)\\\\nI1008 20:45:15.384045 6116 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1008 20:45:15.384049 6116 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-gpzdw\\\\nI1008 20:45:15.384024 6116 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cn\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\
\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-override
s\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpzdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:17Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.651958 
4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d080d327-7e4d-41af-aa15-0ce849523815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8127834da98ef46a594a74cbfcc6ef779b8429046327546560b7b37085572c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\
\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f61c1793e6c95085b6964298f29b5f896451784046a6aee1c73bbda234a3bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd3bc937fc2e56c3d332e4d3822a2749d040c57cd94f6e3bcdcfd83c126bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33aff5ef2ae82f810d3b3e66effb80087fa92081419227e4fb66a6aa80468ff7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814
a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"W1008 20:44:56.871605 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 20:44:56.872089 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759956296 cert, and key in /tmp/serving-cert-3424828285/serving-signer.crt, /tmp/serving-cert-3424828285/serving-signer.key\\\\nI1008 20:44:57.365674 1 observer_polling.go:159] Starting file observer\\\\nW1008 20:45:02.381062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1008 20:45:02.381192 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 20:45:02.381876 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3424828285/tls.crt::/tmp/serving-cert-3424828285/tls.key\\\\\\\"\\\\nI1008 20:45:02.718633 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 20:45:02.726325 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 20:45:02.726358 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 20:45:02.726380 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 20:45:02.726384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 20:45:02.731456 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 20:45:02.731985 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 20:45:02.731867 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 20:45:02.733228 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d213380e32b3db218facfef313963d26689d2f0871d2a004a63380454fac8a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name
\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:17Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.664878 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:17Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.678714 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bfcvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09be2dce24bba0d88a36f2d85e6280e6806f9b6cf59ec3950513e976c97429e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d12e
f6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d12ef6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bfcvh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:17Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.687980 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-flswm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"609156f9-39b1-4330-83a2-eabf82f4228f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749545f8f4b6269a70b747fee79dc8d419b62054f507b0d819b63aa68c44bb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-flswm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:17Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.700212 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b822af4b-b157-4b05-9af4-7798315f365f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d615b49ade5de43393d40344c1b71733acedb541841b3ec34d6dd293e62f96c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07c33fe9c40fb9b53e940940c3fe2b8b63a94b0f867aa804d215cb3ba90d01c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569ba2e70b947eea1e531ab7e8f1ac2e3441ade593dd48910407df766217d87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f2d8af11793121a84b4559833f410bd59a8bb122d88da0d3b55d7dcbbf57a9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:17Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.735738 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.735788 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.735805 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.735830 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.735846 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:17Z","lastTransitionTime":"2025-10-08T20:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.747697 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0319f7-8ee3-4392-a36a-419161391db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7f52f3d22574d0a01cdfd7b7a40caf1a6cf201dc719e35f40eae85a071286f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3064f5dde5317ed6c1dba4ecdcf4da81c2451262d83e3e2826c6ebbfe1487ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13697e6470d481451982948653db44d08baa70466d010442534eaa249e58bae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c88256a72c667695563af6b37d01d958621c1ca6cbdaf474364bd6c8128c4409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be
30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6efb0bccc51deff2303655e7a8d3a6261a8b3c9425f6d94120cd1acf27fd7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\
",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:
44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:17Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.759973 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-ml9vv"] Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.760473 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:45:17 crc kubenswrapper[4669]: E1008 20:45:17.760570 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ml9vv" podUID="f90eed21-8bc2-4723-b6be-a672669a36fb" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.771050 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:17Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.783951 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c2b44fd8fb3c01bbc8a1b2f5a3507af28b2aa79a3d6ab8e7de3945bbfd01e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a472679f03ab86aa0a31a2ff3affe48d8e289a76db949bcc6ea10446fd08fdd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:17Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.797115 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-klx9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2433400c-98f8-490f-a566-00a330a738fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863b0630ebde7534e93ebf2952dab729566760278539e87efa4412389803c5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-klx9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:17Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.815770 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c397b74921593a42fb7626e545778d80c506f0bbce7bc425b75c77a222c770e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-10-08T20:45:17Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.821638 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bh59\" (UniqueName: \"kubernetes.io/projected/f90eed21-8bc2-4723-b6be-a672669a36fb-kube-api-access-5bh59\") pod \"network-metrics-daemon-ml9vv\" (UID: \"f90eed21-8bc2-4723-b6be-a672669a36fb\") " pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.821688 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f90eed21-8bc2-4723-b6be-a672669a36fb-metrics-certs\") pod \"network-metrics-daemon-ml9vv\" (UID: \"f90eed21-8bc2-4723-b6be-a672669a36fb\") " pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.826769 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zcf2d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a016bee1-2c29-46bb-b3b8-841c4a65e162\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fa5b9befc8fb3a83cb6dd6097014bfe9fd0b905b4bf8fbdcccd4fdfb62ab410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flsl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zcf2d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:17Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.837971 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.838010 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.838024 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.838041 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.838053 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:17Z","lastTransitionTime":"2025-10-08T20:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.839031 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c9bcf2-9580-4534-8c7e-886bd4aff469\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d8b81cfea1e9e0c9b30427e8b8cb07b463c6ef45afb8379aa006d71bccd82a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bd09b1fcc78173d03292522a284e68e59f374def13fd6830f24a31e1138c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw2kf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:17Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.848897 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bl6pv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ac60c10-afa3-424e-9aa2-060e32f4a40f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91277ee733ac8aee89b1a7716b6dcebf57e7d24e5cab5615d88ac8ff90f6f5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbw65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e8bd9bc559c9623c06aa2f0324a6679f5d24
1e881db918904f3e1e97d56a20f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbw65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bl6pv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:17Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.860670 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:17Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.871056 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b822af4b-b157-4b05-9af4-7798315f365f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d615b49ade5de43393d40344c1b71733acedb541841b3ec34d6dd293e62f96c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07c33fe9c40fb9b53e940940c3fe2b8b63a94b0f867aa804d215cb3ba90d01c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569ba2e70b947eea1e531ab7e8f1ac2e3441ade593dd48910407df766217d87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f2d8af11793121a84b4559833f410bd59a8bb122d88da0d3b55d7dcbbf57a9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:17Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.883504 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:17Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.904053 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bfcvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09be2dce24bba0d88a36f2d85e6280e6806f9b6cf59ec3950513e976c97429e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d12e
f6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d12ef6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bfcvh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:17Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.916447 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-flswm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"609156f9-39b1-4330-83a2-eabf82f4228f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749545f8f4b6269a70b747fee79dc8d419b62054f507b0d819b63aa68c44bb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-flswm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:17Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.922600 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bh59\" (UniqueName: \"kubernetes.io/projected/f90eed21-8bc2-4723-b6be-a672669a36fb-kube-api-access-5bh59\") pod \"network-metrics-daemon-ml9vv\" (UID: \"f90eed21-8bc2-4723-b6be-a672669a36fb\") " pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.922668 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f90eed21-8bc2-4723-b6be-a672669a36fb-metrics-certs\") pod \"network-metrics-daemon-ml9vv\" (UID: \"f90eed21-8bc2-4723-b6be-a672669a36fb\") " pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:45:17 crc kubenswrapper[4669]: E1008 20:45:17.922787 4669 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Oct 08 20:45:17 crc kubenswrapper[4669]: E1008 20:45:17.922843 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f90eed21-8bc2-4723-b6be-a672669a36fb-metrics-certs podName:f90eed21-8bc2-4723-b6be-a672669a36fb nodeName:}" failed. No retries permitted until 2025-10-08 20:45:18.422826078 +0000 UTC m=+38.115636761 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f90eed21-8bc2-4723-b6be-a672669a36fb-metrics-certs") pod "network-metrics-daemon-ml9vv" (UID: "f90eed21-8bc2-4723-b6be-a672669a36fb") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.937388 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0319f7-8ee3-4392-a36a-419161391db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7f52f3d22574d0a01cdfd7b7a40caf1a6cf201dc719e35f40eae85a071286f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3064f5dde5317ed6c1dba4ecdcf4da81c2451262d83e3e2826c6ebbfe1487ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13697e6470d481451982948653db44d08baa70466d010442534eaa249e58bae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be
30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c88256a72c667695563af6b37d01d958621c1ca6cbdaf474364bd6c8128c4409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6efb0bccc51deff2303655e7a8d3a6261a8b3c9425f6d94120cd1acf27fd7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-
pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a
5f8ddc100358df8992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:17Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.939401 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bh59\" (UniqueName: \"kubernetes.io/projected/f90eed21-8bc2-4723-b6be-a672669a36fb-kube-api-access-5bh59\") pod \"network-metrics-daemon-ml9vv\" (UID: \"f90eed21-8bc2-4723-b6be-a672669a36fb\") " pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 
20:45:17.940081 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.940142 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.940163 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.940188 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.940207 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:17Z","lastTransitionTime":"2025-10-08T20:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.952459 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:17Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.965282 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c2b44fd8fb3c01bbc8a1b2f5a3507af28b2aa79a3d6ab8e7de3945bbfd01e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a472679f03ab86aa0a31a2ff3affe48d8e289a76db949bcc6ea10446fd08fdd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:17Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.977957 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-klx9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2433400c-98f8-490f-a566-00a330a738fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863b0630ebde7534e93ebf2952dab729566760278539e87efa4412389803c5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-klx9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:17Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:17 crc kubenswrapper[4669]: I1008 20:45:17.989150 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ml9vv" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f90eed21-8bc2-4723-b6be-a672669a36fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bh59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bh59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ml9vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:17Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:18 crc 
kubenswrapper[4669]: I1008 20:45:18.006019 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:18Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.019890 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c397b74921593a42fb7626e545778d80c506f0bbce7bc425b75c77a222c770e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T20:45:18Z is after 2025-08-24T17:21:41Z"
Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.023092 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 08 20:45:18 crc kubenswrapper[4669]: E1008 20:45:18.023334 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:45:34.023315092 +0000 UTC m=+53.716125785 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.033369 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zcf2d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a016bee1-2c29-46bb-b3b8-841c4a65e162\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fa5b9befc8fb3a83cb6dd6097014bfe9fd0b905b4bf8fbdcccd4fdfb62ab410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flsl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zcf2d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:18Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.042631 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.042678 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.042695 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.042717 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.042731 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:18Z","lastTransitionTime":"2025-10-08T20:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.046238 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c9bcf2-9580-4534-8c7e-886bd4aff469\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d8b81cfea1e9e0c9b30427e8b8cb07b463c6ef45afb8379aa006d71bccd82a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bd09b1fcc78173d03292522a284e68e59f374def13fd6830f24a31e1138c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw2kf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:18Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.056876 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bl6pv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ac60c10-afa3-424e-9aa2-060e32f4a40f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91277ee733ac8aee89b1a7716b6dcebf57e7d24e5cab5615d88ac8ff90f6f5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbw65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e8bd9bc559c9623c06aa2f0324a6679f5d24
1e881db918904f3e1e97d56a20f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbw65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bl6pv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:18Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.068515 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d080d327-7e4d-41af-aa15-0ce849523815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8127834da98ef46a594a74cbfcc6ef779b8429046327546560b7b37085572c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f61c1793e6c95085b6964298f29b5f896451784046a6aee1c73bbda234a3bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd3bc937fc2e56c3d332e4d3822a2749d040c57cd94f6e3bcdcfd83c126bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33aff5ef2ae82f810d3b3e66effb80087fa92081419227e4fb66a6aa80468ff7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"W1008 20:44:56.871605 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 20:44:56.872089 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759956296 cert, and key in /tmp/serving-cert-3424828285/serving-signer.crt, /tmp/serving-cert-3424828285/serving-signer.key\\\\nI1008 20:44:57.365674 1 observer_polling.go:159] Starting file observer\\\\nW1008 20:45:02.381062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1008 20:45:02.381192 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 20:45:02.381876 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3424828285/tls.crt::/tmp/serving-cert-3424828285/tls.key\\\\\\\"\\\\nI1008 20:45:02.718633 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 20:45:02.726325 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 20:45:02.726358 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 20:45:02.726380 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 20:45:02.726384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 20:45:02.731456 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 20:45:02.731985 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 20:45:02.731867 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 20:45:02.733228 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d213380e32b3db218facfef313963d26689d2f0871d2a004a63380454fac8a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://
78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:18Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.081725 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c0e064d976a7c307fd13ec11ae76672cc1225b71a616f171626ee1f9a24531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:18Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.099660 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://408dd840918000b1689c3d828a51173deebf8d00fc97450975b35e5149d3cfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://334a09deac921308c4d6053bdcc2bbc096acc8ec24875190efb1c07b22d01c69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c03e0c827468d80fa326ee46ee88ad6adfe4236f4df9843324d2b247d0716087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92bc23ad705dcc8b8524159bc37254ce2306e7b502b914eaac7a6525fdd44f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed9a574189bcc7f84b93c5e821e944b0f94679084a30270d6634c7d19e67c470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79887114a7a0f726ed3bcb0b6d5e32e90cabfe64a4d5d13f6c1c0e2d01ae9f08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79887114a7a0f726ed3bcb0b6d5e32e90cabfe64a4d5d13f6c1c0e2d01ae9f08\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T20:45:15Z\\\",\\\"message\\\":\\\"to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-10-08T20:45:15Z is after 2025-08-24T17:21:41Z]\\\\nI1008 20:45:15.384024 6116 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI1008 20:45:15.383454 6116 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-gpzdw after 0 failed attempt(s)\\\\nI1008 20:45:15.384045 6116 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1008 20:45:15.384049 6116 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-gpzdw\\\\nI1008 20:45:15.384024 6116 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cn\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gpzdw_openshift-ovn-kubernetes(cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40a
c3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpzdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:18Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.124512 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.124790 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:45:18 crc kubenswrapper[4669]: E1008 20:45:18.124729 4669 secret.go:188] 
Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.125008 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:45:18 crc kubenswrapper[4669]: E1008 20:45:18.125116 4669 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 20:45:18 crc kubenswrapper[4669]: E1008 20:45:18.125162 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 20:45:34.125129136 +0000 UTC m=+53.817939809 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.125427 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:45:18 crc kubenswrapper[4669]: E1008 20:45:18.125194 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 20:45:18 crc kubenswrapper[4669]: E1008 20:45:18.125729 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 20:45:18 crc kubenswrapper[4669]: E1008 20:45:18.125840 4669 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 20:45:18 crc kubenswrapper[4669]: E1008 20:45:18.125517 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-08 20:45:34.125505086 +0000 UTC m=+53.818315759 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 20:45:18 crc kubenswrapper[4669]: E1008 20:45:18.126021 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-08 20:45:34.126011721 +0000 UTC m=+53.818822394 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 20:45:18 crc kubenswrapper[4669]: E1008 20:45:18.125640 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 20:45:18 crc kubenswrapper[4669]: E1008 20:45:18.126162 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 20:45:18 crc kubenswrapper[4669]: E1008 20:45:18.126214 4669 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 20:45:18 crc kubenswrapper[4669]: E1008 20:45:18.126295 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-08 20:45:34.126287529 +0000 UTC m=+53.819098202 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.144917 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.145056 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.145258 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.145448 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.145641 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:18Z","lastTransitionTime":"2025-10-08T20:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.248687 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.248986 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.249101 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.249242 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.249360 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:18Z","lastTransitionTime":"2025-10-08T20:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.330184 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.330238 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.330274 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:45:18 crc kubenswrapper[4669]: E1008 20:45:18.330305 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:45:18 crc kubenswrapper[4669]: E1008 20:45:18.330434 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:45:18 crc kubenswrapper[4669]: E1008 20:45:18.330652 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.351751 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.351787 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.351796 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.351811 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.351822 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:18Z","lastTransitionTime":"2025-10-08T20:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.429126 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f90eed21-8bc2-4723-b6be-a672669a36fb-metrics-certs\") pod \"network-metrics-daemon-ml9vv\" (UID: \"f90eed21-8bc2-4723-b6be-a672669a36fb\") " pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:45:18 crc kubenswrapper[4669]: E1008 20:45:18.429294 4669 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 20:45:18 crc kubenswrapper[4669]: E1008 20:45:18.429391 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f90eed21-8bc2-4723-b6be-a672669a36fb-metrics-certs podName:f90eed21-8bc2-4723-b6be-a672669a36fb nodeName:}" failed. No retries permitted until 2025-10-08 20:45:19.429369118 +0000 UTC m=+39.122179801 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f90eed21-8bc2-4723-b6be-a672669a36fb-metrics-certs") pod "network-metrics-daemon-ml9vv" (UID: "f90eed21-8bc2-4723-b6be-a672669a36fb") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.453807 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.453836 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.453848 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.453862 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.453872 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:18Z","lastTransitionTime":"2025-10-08T20:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.556959 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.557022 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.557040 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.557062 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.557086 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:18Z","lastTransitionTime":"2025-10-08T20:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.659701 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.659798 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.659818 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.659876 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.659896 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:18Z","lastTransitionTime":"2025-10-08T20:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.762908 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.762944 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.762955 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.762971 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.762982 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:18Z","lastTransitionTime":"2025-10-08T20:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.865834 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.865914 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.865925 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.865945 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.865964 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:18Z","lastTransitionTime":"2025-10-08T20:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.968404 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.968434 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.968442 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.968455 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:18 crc kubenswrapper[4669]: I1008 20:45:18.968463 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:18Z","lastTransitionTime":"2025-10-08T20:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.071315 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.071358 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.071370 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.071393 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.071406 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:19Z","lastTransitionTime":"2025-10-08T20:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.161500 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.161579 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.161592 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.161613 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.161625 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:19Z","lastTransitionTime":"2025-10-08T20:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:19 crc kubenswrapper[4669]: E1008 20:45:19.174614 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf950064-edbb-4bec-8a75-ab8d963fcdb3\\\",\\\"systemUUID\\\":\\\"527fa759-e25f-4fb3-8304-f30dbff0c847\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:19Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.178907 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.178958 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.178970 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.178990 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.179002 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:19Z","lastTransitionTime":"2025-10-08T20:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:19 crc kubenswrapper[4669]: E1008 20:45:19.194011 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf950064-edbb-4bec-8a75-ab8d963fcdb3\\\",\\\"systemUUID\\\":\\\"527fa759-e25f-4fb3-8304-f30dbff0c847\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:19Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.197331 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.197371 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.197383 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.197400 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.197412 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:19Z","lastTransitionTime":"2025-10-08T20:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:19 crc kubenswrapper[4669]: E1008 20:45:19.207837 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf950064-edbb-4bec-8a75-ab8d963fcdb3\\\",\\\"systemUUID\\\":\\\"527fa759-e25f-4fb3-8304-f30dbff0c847\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:19Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.211295 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.211330 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.211348 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.211364 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.211374 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:19Z","lastTransitionTime":"2025-10-08T20:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:19 crc kubenswrapper[4669]: E1008 20:45:19.226022 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf950064-edbb-4bec-8a75-ab8d963fcdb3\\\",\\\"systemUUID\\\":\\\"527fa759-e25f-4fb3-8304-f30dbff0c847\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:19Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.229816 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.229874 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.229886 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.229900 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.229910 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:19Z","lastTransitionTime":"2025-10-08T20:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:19 crc kubenswrapper[4669]: E1008 20:45:19.243551 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf950064-edbb-4bec-8a75-ab8d963fcdb3\\\",\\\"systemUUID\\\":\\\"527fa759-e25f-4fb3-8304-f30dbff0c847\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:19Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:19 crc kubenswrapper[4669]: E1008 20:45:19.243677 4669 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.245361 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.245393 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.245402 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.245417 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.245430 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:19Z","lastTransitionTime":"2025-10-08T20:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.330448 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:45:19 crc kubenswrapper[4669]: E1008 20:45:19.330734 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ml9vv" podUID="f90eed21-8bc2-4723-b6be-a672669a36fb" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.347758 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.347790 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.347815 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.347827 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.347836 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:19Z","lastTransitionTime":"2025-10-08T20:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.439971 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f90eed21-8bc2-4723-b6be-a672669a36fb-metrics-certs\") pod \"network-metrics-daemon-ml9vv\" (UID: \"f90eed21-8bc2-4723-b6be-a672669a36fb\") " pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:45:19 crc kubenswrapper[4669]: E1008 20:45:19.440115 4669 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 20:45:19 crc kubenswrapper[4669]: E1008 20:45:19.440162 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f90eed21-8bc2-4723-b6be-a672669a36fb-metrics-certs podName:f90eed21-8bc2-4723-b6be-a672669a36fb nodeName:}" failed. No retries permitted until 2025-10-08 20:45:21.4401482 +0000 UTC m=+41.132958873 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f90eed21-8bc2-4723-b6be-a672669a36fb-metrics-certs") pod "network-metrics-daemon-ml9vv" (UID: "f90eed21-8bc2-4723-b6be-a672669a36fb") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.449754 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.449791 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.449799 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.449811 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.449821 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:19Z","lastTransitionTime":"2025-10-08T20:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.553112 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.553194 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.553222 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.553255 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.553279 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:19Z","lastTransitionTime":"2025-10-08T20:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.656439 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.656511 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.656563 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.656617 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.656638 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:19Z","lastTransitionTime":"2025-10-08T20:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.759274 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.759332 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.759351 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.759373 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.759390 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:19Z","lastTransitionTime":"2025-10-08T20:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.862221 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.862269 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.862283 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.862302 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.862316 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:19Z","lastTransitionTime":"2025-10-08T20:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.965092 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.965144 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.965155 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.965171 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:19 crc kubenswrapper[4669]: I1008 20:45:19.965182 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:19Z","lastTransitionTime":"2025-10-08T20:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:20 crc kubenswrapper[4669]: I1008 20:45:20.068665 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:20 crc kubenswrapper[4669]: I1008 20:45:20.068730 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:20 crc kubenswrapper[4669]: I1008 20:45:20.068752 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:20 crc kubenswrapper[4669]: I1008 20:45:20.068775 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:20 crc kubenswrapper[4669]: I1008 20:45:20.068826 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:20Z","lastTransitionTime":"2025-10-08T20:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:20 crc kubenswrapper[4669]: I1008 20:45:20.171316 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:20 crc kubenswrapper[4669]: I1008 20:45:20.171361 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:20 crc kubenswrapper[4669]: I1008 20:45:20.171372 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:20 crc kubenswrapper[4669]: I1008 20:45:20.171389 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:20 crc kubenswrapper[4669]: I1008 20:45:20.171402 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:20Z","lastTransitionTime":"2025-10-08T20:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:20 crc kubenswrapper[4669]: I1008 20:45:20.274407 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:20 crc kubenswrapper[4669]: I1008 20:45:20.274482 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:20 crc kubenswrapper[4669]: I1008 20:45:20.274501 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:20 crc kubenswrapper[4669]: I1008 20:45:20.274582 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:20 crc kubenswrapper[4669]: I1008 20:45:20.274617 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:20Z","lastTransitionTime":"2025-10-08T20:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:20 crc kubenswrapper[4669]: I1008 20:45:20.330336 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:45:20 crc kubenswrapper[4669]: I1008 20:45:20.330422 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:45:20 crc kubenswrapper[4669]: E1008 20:45:20.330494 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:45:20 crc kubenswrapper[4669]: I1008 20:45:20.330574 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:45:20 crc kubenswrapper[4669]: E1008 20:45:20.330660 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:45:20 crc kubenswrapper[4669]: E1008 20:45:20.330826 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:45:20 crc kubenswrapper[4669]: I1008 20:45:20.377656 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:20 crc kubenswrapper[4669]: I1008 20:45:20.377721 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:20 crc kubenswrapper[4669]: I1008 20:45:20.377745 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:20 crc kubenswrapper[4669]: I1008 20:45:20.377777 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:20 crc kubenswrapper[4669]: I1008 20:45:20.377799 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:20Z","lastTransitionTime":"2025-10-08T20:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:20 crc kubenswrapper[4669]: I1008 20:45:20.480260 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:20 crc kubenswrapper[4669]: I1008 20:45:20.480331 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:20 crc kubenswrapper[4669]: I1008 20:45:20.480352 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:20 crc kubenswrapper[4669]: I1008 20:45:20.480377 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:20 crc kubenswrapper[4669]: I1008 20:45:20.480397 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:20Z","lastTransitionTime":"2025-10-08T20:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:20 crc kubenswrapper[4669]: I1008 20:45:20.582519 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:20 crc kubenswrapper[4669]: I1008 20:45:20.582608 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:20 crc kubenswrapper[4669]: I1008 20:45:20.582622 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:20 crc kubenswrapper[4669]: I1008 20:45:20.582665 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:20 crc kubenswrapper[4669]: I1008 20:45:20.582677 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:20Z","lastTransitionTime":"2025-10-08T20:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:20 crc kubenswrapper[4669]: I1008 20:45:20.685470 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:20 crc kubenswrapper[4669]: I1008 20:45:20.685579 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:20 crc kubenswrapper[4669]: I1008 20:45:20.685593 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:20 crc kubenswrapper[4669]: I1008 20:45:20.685608 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:20 crc kubenswrapper[4669]: I1008 20:45:20.685621 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:20Z","lastTransitionTime":"2025-10-08T20:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:20 crc kubenswrapper[4669]: I1008 20:45:20.788426 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:20 crc kubenswrapper[4669]: I1008 20:45:20.788569 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:20 crc kubenswrapper[4669]: I1008 20:45:20.788605 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:20 crc kubenswrapper[4669]: I1008 20:45:20.788645 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:20 crc kubenswrapper[4669]: I1008 20:45:20.788664 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:20Z","lastTransitionTime":"2025-10-08T20:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:20 crc kubenswrapper[4669]: I1008 20:45:20.891363 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:20 crc kubenswrapper[4669]: I1008 20:45:20.891391 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:20 crc kubenswrapper[4669]: I1008 20:45:20.891401 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:20 crc kubenswrapper[4669]: I1008 20:45:20.891415 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:20 crc kubenswrapper[4669]: I1008 20:45:20.891425 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:20Z","lastTransitionTime":"2025-10-08T20:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:20 crc kubenswrapper[4669]: I1008 20:45:20.994283 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:20 crc kubenswrapper[4669]: I1008 20:45:20.994332 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:20 crc kubenswrapper[4669]: I1008 20:45:20.994348 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:20 crc kubenswrapper[4669]: I1008 20:45:20.994364 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:20 crc kubenswrapper[4669]: I1008 20:45:20.994375 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:20Z","lastTransitionTime":"2025-10-08T20:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.096161 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.096228 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.096242 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.096260 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.096272 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:21Z","lastTransitionTime":"2025-10-08T20:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.199019 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.199056 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.199065 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.199080 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.199089 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:21Z","lastTransitionTime":"2025-10-08T20:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.301951 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.302000 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.302010 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.302028 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.302039 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:21Z","lastTransitionTime":"2025-10-08T20:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.330435 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:45:21 crc kubenswrapper[4669]: E1008 20:45:21.330609 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ml9vv" podUID="f90eed21-8bc2-4723-b6be-a672669a36fb" Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.347204 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c2b44fd8fb3c01bbc8a1b2f5a3507af28b2aa79a3d6ab8e7de3945bbfd01e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a472679f03a
b86aa0a31a2ff3affe48d8e289a76db949bcc6ea10446fd08fdd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:21Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.361625 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-klx9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2433400c-98f8-490f-a566-00a330a738fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863b0630ebde7534e93ebf2952dab729566760278539e87efa4412389803c5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-klx9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:21Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.384007 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0319f7-8ee3-4392-a36a-419161391db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7f52f3d22574d0a01cdfd7b7a40caf1a6cf201dc719e35f40eae85a071286f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3064f5dde5317ed6c1dba4ecdcf4da81c2451262d83e3e2826c6ebbfe1487ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13697e6470d481451982948653db44d08baa70466d010442534eaa249e58bae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c88256a72c667695563af6b37d01d958621c1ca6cbdaf474364bd6c8128c4409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6efb0bccc51deff2303655e7a8d3a6261a8b3c9425f6d94120cd1acf27fd7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:21Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.396617 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:21Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.404440 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.404484 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.404497 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:21 crc 
kubenswrapper[4669]: I1008 20:45:21.404517 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.404554 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:21Z","lastTransitionTime":"2025-10-08T20:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.408653 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c9bcf2-9580-4534-8c7e-886bd4aff469\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d8b81cfea1e9e0c9b30427e8b8cb07b463c6ef45afb8379aa006d71bccd82a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bd09b1fcc78173d03292522a284e68e59f374def13fd6830f24a31e1138c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-hw2kf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:21Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.425120 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bl6pv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ac60c10-afa3-424e-9aa2-060e32f4a40f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91277ee733ac8aee89b1a7716b6dcebf57e7d24e5cab5615d88ac8ff90f6f5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"k
ube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbw65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e8bd9bc559c9623c06aa2f0324a6679f5d241e881db918904f3e1e97d56a20f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbw65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bl6pv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:21Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.434636 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ml9vv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f90eed21-8bc2-4723-b6be-a672669a36fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bh59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bh59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ml9vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:21Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:21 crc 
kubenswrapper[4669]: I1008 20:45:21.445709 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:21Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.457123 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c397b74921593a42fb7626e545778d80c506f0bbce7bc425b75c77a222c770e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T20:45:21Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.461511 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f90eed21-8bc2-4723-b6be-a672669a36fb-metrics-certs\") pod \"network-metrics-daemon-ml9vv\" (UID: \"f90eed21-8bc2-4723-b6be-a672669a36fb\") " pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:45:21 crc kubenswrapper[4669]: E1008 20:45:21.461736 4669 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 20:45:21 crc kubenswrapper[4669]: E1008 20:45:21.461793 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f90eed21-8bc2-4723-b6be-a672669a36fb-metrics-certs podName:f90eed21-8bc2-4723-b6be-a672669a36fb nodeName:}" failed. No retries permitted until 2025-10-08 20:45:25.461778206 +0000 UTC m=+45.154588879 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f90eed21-8bc2-4723-b6be-a672669a36fb-metrics-certs") pod "network-metrics-daemon-ml9vv" (UID: "f90eed21-8bc2-4723-b6be-a672669a36fb") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.466939 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zcf2d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a016bee1-2c29-46bb-b3b8-841c4a65e162\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fa5b9befc8fb3a83cb6dd6097014bfe9fd0b905b4bf8fbdcccd4fdfb62ab410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"st
artedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flsl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zcf2d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:21Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.481837 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d080d327-7e4d-41af-aa15-0ce849523815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8127834da98ef46a594a74cbfcc6ef779b8429046327546560b7b37085572c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f61c1793e6c95085b6964298f29b5f896451784046a6aee1c73bbda234a3bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e76fd3bc937fc2e56c3d332e4d3822a2749d040c57cd94f6e3bcdcfd83c126bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33aff5ef2ae82f810d3b3e66effb80087fa92081419227e4fb66a6aa80468ff7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"W1008 20:44:56.871605 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 20:44:56.872089 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759956296 cert, and key in /tmp/serving-cert-3424828285/serving-signer.crt, /tmp/serving-cert-3424828285/serving-signer.key\\\\nI1008 20:44:57.365674 1 observer_polling.go:159] Starting file observer\\\\nW1008 20:45:02.381062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1008 
20:45:02.381192 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 20:45:02.381876 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3424828285/tls.crt::/tmp/serving-cert-3424828285/tls.key\\\\\\\"\\\\nI1008 20:45:02.718633 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 20:45:02.726325 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 20:45:02.726358 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 20:45:02.726380 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 20:45:02.726384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 20:45:02.731456 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 20:45:02.731985 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 20:45:02.731867 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 20:45:02.733228 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d213380e32b3db218facfef313963d26689d2f0871d2a004a63380454fac8a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:21Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.495698 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c0e064d976a7c307fd13ec11ae76672cc1225b71a616f171626ee1f9a24531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:21Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.506336 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.506383 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.506399 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.506421 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.506435 4669 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:21Z","lastTransitionTime":"2025-10-08T20:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.511956 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://408dd840918000b1689c3d828a51173deebf8d00fc97450975b35e5149d3cfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://334a09deac921308c4d6053bdcc2bbc096acc8ec24875190efb1c07b22d01c69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c03e0c827468d80fa326ee46ee88ad6adfe4236f4df9843324d2b247d0716087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92bc23ad705dcc8b8524159bc37254ce2306e7b502b914eaac7a6525fdd44f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed9a574189bcc7f84b93c5e821e944b0f94679084a30270d6634c7d19e67c470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79887114a7a0f726ed3bcb0b6d5e32e90cabfe64a4d5d13f6c1c0e2d01ae9f08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79887114a7a0f726ed3bcb0b6d5e32e90cabfe64a4d5d13f6c1c0e2d01ae9f08\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T20:45:15Z\\\",\\\"message\\\":\\\"to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-10-08T20:45:15Z is after 2025-08-24T17:21:41Z]\\\\nI1008 20:45:15.384024 6116 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI1008 20:45:15.383454 6116 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-gpzdw after 0 failed attempt(s)\\\\nI1008 20:45:15.384045 6116 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1008 20:45:15.384049 6116 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-gpzdw\\\\nI1008 20:45:15.384024 6116 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cn\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gpzdw_openshift-ovn-kubernetes(cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40a
c3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpzdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:21Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.529157 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bfcvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09be2dce24bba0d88a36f2d85e6280e6806f9b6cf59ec3950513e976c97429e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d12e
f6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d12ef6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bfcvh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:21Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.540880 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-flswm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"609156f9-39b1-4330-83a2-eabf82f4228f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749545f8f4b6269a70b747fee79dc8d419b62054f507b0d819b63aa68c44bb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-flswm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:21Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.553960 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b822af4b-b157-4b05-9af4-7798315f365f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d615b49ade5de43393d40344c1b71733acedb541841b3ec34d6dd293e62f96c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07c33fe9c40fb9b53e940940c3fe2b8b63a94b0f867aa804d215cb3ba90d01c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569ba2e70b947eea1e531ab7e8f1ac2e3441ade593dd48910407df766217d87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f2d8af11793121a84b4559833f410bd59a8bb122d88da0d3b55d7dcbbf57a9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:21Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.564865 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:21Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.609006 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.609046 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.609058 4669 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.609072 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.609082 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:21Z","lastTransitionTime":"2025-10-08T20:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.712015 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.712056 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.712073 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.712095 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.712111 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:21Z","lastTransitionTime":"2025-10-08T20:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.814188 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.814218 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.814227 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.814240 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.814254 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:21Z","lastTransitionTime":"2025-10-08T20:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.916499 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.916562 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.916572 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.916588 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:21 crc kubenswrapper[4669]: I1008 20:45:21.916598 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:21Z","lastTransitionTime":"2025-10-08T20:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:22 crc kubenswrapper[4669]: I1008 20:45:22.020132 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:22 crc kubenswrapper[4669]: I1008 20:45:22.020184 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:22 crc kubenswrapper[4669]: I1008 20:45:22.020196 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:22 crc kubenswrapper[4669]: I1008 20:45:22.020210 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:22 crc kubenswrapper[4669]: I1008 20:45:22.020220 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:22Z","lastTransitionTime":"2025-10-08T20:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:22 crc kubenswrapper[4669]: I1008 20:45:22.122708 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:22 crc kubenswrapper[4669]: I1008 20:45:22.122740 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:22 crc kubenswrapper[4669]: I1008 20:45:22.122750 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:22 crc kubenswrapper[4669]: I1008 20:45:22.122763 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:22 crc kubenswrapper[4669]: I1008 20:45:22.122791 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:22Z","lastTransitionTime":"2025-10-08T20:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:22 crc kubenswrapper[4669]: I1008 20:45:22.224988 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:22 crc kubenswrapper[4669]: I1008 20:45:22.225046 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:22 crc kubenswrapper[4669]: I1008 20:45:22.225066 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:22 crc kubenswrapper[4669]: I1008 20:45:22.225088 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:22 crc kubenswrapper[4669]: I1008 20:45:22.225105 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:22Z","lastTransitionTime":"2025-10-08T20:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:22 crc kubenswrapper[4669]: I1008 20:45:22.327997 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:22 crc kubenswrapper[4669]: I1008 20:45:22.328052 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:22 crc kubenswrapper[4669]: I1008 20:45:22.328060 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:22 crc kubenswrapper[4669]: I1008 20:45:22.328080 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:22 crc kubenswrapper[4669]: I1008 20:45:22.328089 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:22Z","lastTransitionTime":"2025-10-08T20:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:22 crc kubenswrapper[4669]: I1008 20:45:22.330314 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:45:22 crc kubenswrapper[4669]: E1008 20:45:22.330436 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:45:22 crc kubenswrapper[4669]: I1008 20:45:22.330574 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:45:22 crc kubenswrapper[4669]: I1008 20:45:22.330737 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:45:22 crc kubenswrapper[4669]: E1008 20:45:22.330824 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:45:22 crc kubenswrapper[4669]: E1008 20:45:22.331014 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:45:22 crc kubenswrapper[4669]: I1008 20:45:22.431180 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:22 crc kubenswrapper[4669]: I1008 20:45:22.431225 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:22 crc kubenswrapper[4669]: I1008 20:45:22.431234 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:22 crc kubenswrapper[4669]: I1008 20:45:22.431248 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:22 crc kubenswrapper[4669]: I1008 20:45:22.431261 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:22Z","lastTransitionTime":"2025-10-08T20:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:22 crc kubenswrapper[4669]: I1008 20:45:22.534561 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:22 crc kubenswrapper[4669]: I1008 20:45:22.534611 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:22 crc kubenswrapper[4669]: I1008 20:45:22.534621 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:22 crc kubenswrapper[4669]: I1008 20:45:22.534638 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:22 crc kubenswrapper[4669]: I1008 20:45:22.534651 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:22Z","lastTransitionTime":"2025-10-08T20:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:22 crc kubenswrapper[4669]: I1008 20:45:22.636893 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:22 crc kubenswrapper[4669]: I1008 20:45:22.636935 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:22 crc kubenswrapper[4669]: I1008 20:45:22.636947 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:22 crc kubenswrapper[4669]: I1008 20:45:22.636964 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:22 crc kubenswrapper[4669]: I1008 20:45:22.636976 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:22Z","lastTransitionTime":"2025-10-08T20:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:22 crc kubenswrapper[4669]: I1008 20:45:22.739097 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:22 crc kubenswrapper[4669]: I1008 20:45:22.739135 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:22 crc kubenswrapper[4669]: I1008 20:45:22.739148 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:22 crc kubenswrapper[4669]: I1008 20:45:22.739165 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:22 crc kubenswrapper[4669]: I1008 20:45:22.739176 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:22Z","lastTransitionTime":"2025-10-08T20:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:22 crc kubenswrapper[4669]: I1008 20:45:22.842515 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:22 crc kubenswrapper[4669]: I1008 20:45:22.842572 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:22 crc kubenswrapper[4669]: I1008 20:45:22.842581 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:22 crc kubenswrapper[4669]: I1008 20:45:22.842596 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:22 crc kubenswrapper[4669]: I1008 20:45:22.842607 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:22Z","lastTransitionTime":"2025-10-08T20:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:22 crc kubenswrapper[4669]: I1008 20:45:22.945863 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:22 crc kubenswrapper[4669]: I1008 20:45:22.945910 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:22 crc kubenswrapper[4669]: I1008 20:45:22.945926 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:22 crc kubenswrapper[4669]: I1008 20:45:22.945951 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:22 crc kubenswrapper[4669]: I1008 20:45:22.945967 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:22Z","lastTransitionTime":"2025-10-08T20:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:23 crc kubenswrapper[4669]: I1008 20:45:23.048570 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:23 crc kubenswrapper[4669]: I1008 20:45:23.048622 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:23 crc kubenswrapper[4669]: I1008 20:45:23.048633 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:23 crc kubenswrapper[4669]: I1008 20:45:23.048649 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:23 crc kubenswrapper[4669]: I1008 20:45:23.048662 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:23Z","lastTransitionTime":"2025-10-08T20:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:23 crc kubenswrapper[4669]: I1008 20:45:23.150771 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:23 crc kubenswrapper[4669]: I1008 20:45:23.150835 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:23 crc kubenswrapper[4669]: I1008 20:45:23.150852 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:23 crc kubenswrapper[4669]: I1008 20:45:23.150872 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:23 crc kubenswrapper[4669]: I1008 20:45:23.150888 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:23Z","lastTransitionTime":"2025-10-08T20:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:23 crc kubenswrapper[4669]: I1008 20:45:23.253100 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:23 crc kubenswrapper[4669]: I1008 20:45:23.253157 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:23 crc kubenswrapper[4669]: I1008 20:45:23.253175 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:23 crc kubenswrapper[4669]: I1008 20:45:23.253198 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:23 crc kubenswrapper[4669]: I1008 20:45:23.253218 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:23Z","lastTransitionTime":"2025-10-08T20:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:23 crc kubenswrapper[4669]: I1008 20:45:23.330256 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:45:23 crc kubenswrapper[4669]: E1008 20:45:23.330421 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ml9vv" podUID="f90eed21-8bc2-4723-b6be-a672669a36fb" Oct 08 20:45:23 crc kubenswrapper[4669]: I1008 20:45:23.355969 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:23 crc kubenswrapper[4669]: I1008 20:45:23.356004 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:23 crc kubenswrapper[4669]: I1008 20:45:23.356014 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:23 crc kubenswrapper[4669]: I1008 20:45:23.356030 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:23 crc kubenswrapper[4669]: I1008 20:45:23.356041 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:23Z","lastTransitionTime":"2025-10-08T20:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:23 crc kubenswrapper[4669]: I1008 20:45:23.458885 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:23 crc kubenswrapper[4669]: I1008 20:45:23.458927 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:23 crc kubenswrapper[4669]: I1008 20:45:23.458938 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:23 crc kubenswrapper[4669]: I1008 20:45:23.458955 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:23 crc kubenswrapper[4669]: I1008 20:45:23.458966 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:23Z","lastTransitionTime":"2025-10-08T20:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:23 crc kubenswrapper[4669]: I1008 20:45:23.561601 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:23 crc kubenswrapper[4669]: I1008 20:45:23.561650 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:23 crc kubenswrapper[4669]: I1008 20:45:23.561661 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:23 crc kubenswrapper[4669]: I1008 20:45:23.561679 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:23 crc kubenswrapper[4669]: I1008 20:45:23.561692 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:23Z","lastTransitionTime":"2025-10-08T20:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:23 crc kubenswrapper[4669]: I1008 20:45:23.664101 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:23 crc kubenswrapper[4669]: I1008 20:45:23.664160 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:23 crc kubenswrapper[4669]: I1008 20:45:23.664175 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:23 crc kubenswrapper[4669]: I1008 20:45:23.664193 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:23 crc kubenswrapper[4669]: I1008 20:45:23.664205 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:23Z","lastTransitionTime":"2025-10-08T20:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:23 crc kubenswrapper[4669]: I1008 20:45:23.767079 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:23 crc kubenswrapper[4669]: I1008 20:45:23.767141 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:23 crc kubenswrapper[4669]: I1008 20:45:23.767151 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:23 crc kubenswrapper[4669]: I1008 20:45:23.767165 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:23 crc kubenswrapper[4669]: I1008 20:45:23.767174 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:23Z","lastTransitionTime":"2025-10-08T20:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:23 crc kubenswrapper[4669]: I1008 20:45:23.869786 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:23 crc kubenswrapper[4669]: I1008 20:45:23.869838 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:23 crc kubenswrapper[4669]: I1008 20:45:23.869852 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:23 crc kubenswrapper[4669]: I1008 20:45:23.869874 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:23 crc kubenswrapper[4669]: I1008 20:45:23.869887 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:23Z","lastTransitionTime":"2025-10-08T20:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:23 crc kubenswrapper[4669]: I1008 20:45:23.972376 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:23 crc kubenswrapper[4669]: I1008 20:45:23.972415 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:23 crc kubenswrapper[4669]: I1008 20:45:23.972426 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:23 crc kubenswrapper[4669]: I1008 20:45:23.972443 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:23 crc kubenswrapper[4669]: I1008 20:45:23.972454 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:23Z","lastTransitionTime":"2025-10-08T20:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:24 crc kubenswrapper[4669]: I1008 20:45:24.075022 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:24 crc kubenswrapper[4669]: I1008 20:45:24.075049 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:24 crc kubenswrapper[4669]: I1008 20:45:24.075056 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:24 crc kubenswrapper[4669]: I1008 20:45:24.075071 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:24 crc kubenswrapper[4669]: I1008 20:45:24.075079 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:24Z","lastTransitionTime":"2025-10-08T20:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:24 crc kubenswrapper[4669]: I1008 20:45:24.177083 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:24 crc kubenswrapper[4669]: I1008 20:45:24.177158 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:24 crc kubenswrapper[4669]: I1008 20:45:24.177167 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:24 crc kubenswrapper[4669]: I1008 20:45:24.177181 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:24 crc kubenswrapper[4669]: I1008 20:45:24.177191 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:24Z","lastTransitionTime":"2025-10-08T20:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:24 crc kubenswrapper[4669]: I1008 20:45:24.279195 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:24 crc kubenswrapper[4669]: I1008 20:45:24.279239 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:24 crc kubenswrapper[4669]: I1008 20:45:24.279247 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:24 crc kubenswrapper[4669]: I1008 20:45:24.279262 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:24 crc kubenswrapper[4669]: I1008 20:45:24.279271 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:24Z","lastTransitionTime":"2025-10-08T20:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:24 crc kubenswrapper[4669]: I1008 20:45:24.329724 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:45:24 crc kubenswrapper[4669]: I1008 20:45:24.329776 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:45:24 crc kubenswrapper[4669]: I1008 20:45:24.329799 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:45:24 crc kubenswrapper[4669]: E1008 20:45:24.329865 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:45:24 crc kubenswrapper[4669]: E1008 20:45:24.329934 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:45:24 crc kubenswrapper[4669]: E1008 20:45:24.330010 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:45:24 crc kubenswrapper[4669]: I1008 20:45:24.382573 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:24 crc kubenswrapper[4669]: I1008 20:45:24.382644 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:24 crc kubenswrapper[4669]: I1008 20:45:24.382658 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:24 crc kubenswrapper[4669]: I1008 20:45:24.382702 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:24 crc kubenswrapper[4669]: I1008 20:45:24.382714 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:24Z","lastTransitionTime":"2025-10-08T20:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:24 crc kubenswrapper[4669]: I1008 20:45:24.485996 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:24 crc kubenswrapper[4669]: I1008 20:45:24.486060 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:24 crc kubenswrapper[4669]: I1008 20:45:24.486076 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:24 crc kubenswrapper[4669]: I1008 20:45:24.486102 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:24 crc kubenswrapper[4669]: I1008 20:45:24.486119 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:24Z","lastTransitionTime":"2025-10-08T20:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:24 crc kubenswrapper[4669]: I1008 20:45:24.588188 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:24 crc kubenswrapper[4669]: I1008 20:45:24.588234 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:24 crc kubenswrapper[4669]: I1008 20:45:24.588244 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:24 crc kubenswrapper[4669]: I1008 20:45:24.588261 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:24 crc kubenswrapper[4669]: I1008 20:45:24.588272 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:24Z","lastTransitionTime":"2025-10-08T20:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:24 crc kubenswrapper[4669]: I1008 20:45:24.690834 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:24 crc kubenswrapper[4669]: I1008 20:45:24.690898 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:24 crc kubenswrapper[4669]: I1008 20:45:24.690908 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:24 crc kubenswrapper[4669]: I1008 20:45:24.690923 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:24 crc kubenswrapper[4669]: I1008 20:45:24.690933 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:24Z","lastTransitionTime":"2025-10-08T20:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:24 crc kubenswrapper[4669]: I1008 20:45:24.793776 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:24 crc kubenswrapper[4669]: I1008 20:45:24.793812 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:24 crc kubenswrapper[4669]: I1008 20:45:24.793822 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:24 crc kubenswrapper[4669]: I1008 20:45:24.793837 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:24 crc kubenswrapper[4669]: I1008 20:45:24.793848 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:24Z","lastTransitionTime":"2025-10-08T20:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:24 crc kubenswrapper[4669]: I1008 20:45:24.895734 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:24 crc kubenswrapper[4669]: I1008 20:45:24.895780 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:24 crc kubenswrapper[4669]: I1008 20:45:24.895792 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:24 crc kubenswrapper[4669]: I1008 20:45:24.895810 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:24 crc kubenswrapper[4669]: I1008 20:45:24.895821 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:24Z","lastTransitionTime":"2025-10-08T20:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:24 crc kubenswrapper[4669]: I1008 20:45:24.997847 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:24 crc kubenswrapper[4669]: I1008 20:45:24.997892 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:24 crc kubenswrapper[4669]: I1008 20:45:24.997904 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:24 crc kubenswrapper[4669]: I1008 20:45:24.997922 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:24 crc kubenswrapper[4669]: I1008 20:45:24.997938 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:24Z","lastTransitionTime":"2025-10-08T20:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:25 crc kubenswrapper[4669]: I1008 20:45:25.101069 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:25 crc kubenswrapper[4669]: I1008 20:45:25.101138 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:25 crc kubenswrapper[4669]: I1008 20:45:25.101147 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:25 crc kubenswrapper[4669]: I1008 20:45:25.101164 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:25 crc kubenswrapper[4669]: I1008 20:45:25.101177 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:25Z","lastTransitionTime":"2025-10-08T20:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:25 crc kubenswrapper[4669]: I1008 20:45:25.203277 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:25 crc kubenswrapper[4669]: I1008 20:45:25.203318 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:25 crc kubenswrapper[4669]: I1008 20:45:25.203327 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:25 crc kubenswrapper[4669]: I1008 20:45:25.203341 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:25 crc kubenswrapper[4669]: I1008 20:45:25.203351 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:25Z","lastTransitionTime":"2025-10-08T20:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:25 crc kubenswrapper[4669]: I1008 20:45:25.304995 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:25 crc kubenswrapper[4669]: I1008 20:45:25.305050 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:25 crc kubenswrapper[4669]: I1008 20:45:25.305061 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:25 crc kubenswrapper[4669]: I1008 20:45:25.305075 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:25 crc kubenswrapper[4669]: I1008 20:45:25.305086 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:25Z","lastTransitionTime":"2025-10-08T20:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:25 crc kubenswrapper[4669]: I1008 20:45:25.330558 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:45:25 crc kubenswrapper[4669]: E1008 20:45:25.330736 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ml9vv" podUID="f90eed21-8bc2-4723-b6be-a672669a36fb" Oct 08 20:45:25 crc kubenswrapper[4669]: I1008 20:45:25.407425 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:25 crc kubenswrapper[4669]: I1008 20:45:25.407504 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:25 crc kubenswrapper[4669]: I1008 20:45:25.407571 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:25 crc kubenswrapper[4669]: I1008 20:45:25.407605 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:25 crc kubenswrapper[4669]: I1008 20:45:25.407672 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:25Z","lastTransitionTime":"2025-10-08T20:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:25 crc kubenswrapper[4669]: I1008 20:45:25.499453 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f90eed21-8bc2-4723-b6be-a672669a36fb-metrics-certs\") pod \"network-metrics-daemon-ml9vv\" (UID: \"f90eed21-8bc2-4723-b6be-a672669a36fb\") " pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:45:25 crc kubenswrapper[4669]: E1008 20:45:25.499695 4669 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 20:45:25 crc kubenswrapper[4669]: E1008 20:45:25.499758 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f90eed21-8bc2-4723-b6be-a672669a36fb-metrics-certs podName:f90eed21-8bc2-4723-b6be-a672669a36fb nodeName:}" failed. No retries permitted until 2025-10-08 20:45:33.499743253 +0000 UTC m=+53.192553926 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f90eed21-8bc2-4723-b6be-a672669a36fb-metrics-certs") pod "network-metrics-daemon-ml9vv" (UID: "f90eed21-8bc2-4723-b6be-a672669a36fb") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 20:45:25 crc kubenswrapper[4669]: I1008 20:45:25.510982 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:25 crc kubenswrapper[4669]: I1008 20:45:25.511047 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:25 crc kubenswrapper[4669]: I1008 20:45:25.511064 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:25 crc kubenswrapper[4669]: I1008 20:45:25.511087 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:25 crc kubenswrapper[4669]: I1008 20:45:25.511104 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:25Z","lastTransitionTime":"2025-10-08T20:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:25 crc kubenswrapper[4669]: I1008 20:45:25.613854 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:25 crc kubenswrapper[4669]: I1008 20:45:25.613945 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:25 crc kubenswrapper[4669]: I1008 20:45:25.613983 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:25 crc kubenswrapper[4669]: I1008 20:45:25.614014 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:25 crc kubenswrapper[4669]: I1008 20:45:25.614038 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:25Z","lastTransitionTime":"2025-10-08T20:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:25 crc kubenswrapper[4669]: I1008 20:45:25.717292 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:25 crc kubenswrapper[4669]: I1008 20:45:25.717380 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:25 crc kubenswrapper[4669]: I1008 20:45:25.717416 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:25 crc kubenswrapper[4669]: I1008 20:45:25.717448 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:25 crc kubenswrapper[4669]: I1008 20:45:25.717472 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:25Z","lastTransitionTime":"2025-10-08T20:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:25 crc kubenswrapper[4669]: I1008 20:45:25.820510 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:25 crc kubenswrapper[4669]: I1008 20:45:25.820567 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:25 crc kubenswrapper[4669]: I1008 20:45:25.820582 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:25 crc kubenswrapper[4669]: I1008 20:45:25.820600 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:25 crc kubenswrapper[4669]: I1008 20:45:25.820612 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:25Z","lastTransitionTime":"2025-10-08T20:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:25 crc kubenswrapper[4669]: I1008 20:45:25.923804 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:25 crc kubenswrapper[4669]: I1008 20:45:25.923863 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:25 crc kubenswrapper[4669]: I1008 20:45:25.923921 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:25 crc kubenswrapper[4669]: I1008 20:45:25.923952 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:25 crc kubenswrapper[4669]: I1008 20:45:25.923978 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:25Z","lastTransitionTime":"2025-10-08T20:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:26 crc kubenswrapper[4669]: I1008 20:45:26.026353 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:26 crc kubenswrapper[4669]: I1008 20:45:26.026405 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:26 crc kubenswrapper[4669]: I1008 20:45:26.026425 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:26 crc kubenswrapper[4669]: I1008 20:45:26.026443 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:26 crc kubenswrapper[4669]: I1008 20:45:26.026454 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:26Z","lastTransitionTime":"2025-10-08T20:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:26 crc kubenswrapper[4669]: I1008 20:45:26.128466 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:26 crc kubenswrapper[4669]: I1008 20:45:26.128509 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:26 crc kubenswrapper[4669]: I1008 20:45:26.128540 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:26 crc kubenswrapper[4669]: I1008 20:45:26.128558 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:26 crc kubenswrapper[4669]: I1008 20:45:26.128570 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:26Z","lastTransitionTime":"2025-10-08T20:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:26 crc kubenswrapper[4669]: I1008 20:45:26.230867 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:26 crc kubenswrapper[4669]: I1008 20:45:26.230910 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:26 crc kubenswrapper[4669]: I1008 20:45:26.230922 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:26 crc kubenswrapper[4669]: I1008 20:45:26.230938 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:26 crc kubenswrapper[4669]: I1008 20:45:26.230950 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:26Z","lastTransitionTime":"2025-10-08T20:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:26 crc kubenswrapper[4669]: I1008 20:45:26.329849 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:45:26 crc kubenswrapper[4669]: I1008 20:45:26.329886 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:45:26 crc kubenswrapper[4669]: I1008 20:45:26.329849 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:45:26 crc kubenswrapper[4669]: E1008 20:45:26.329996 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:45:26 crc kubenswrapper[4669]: E1008 20:45:26.330187 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:45:26 crc kubenswrapper[4669]: E1008 20:45:26.330286 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:45:26 crc kubenswrapper[4669]: I1008 20:45:26.334255 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:26 crc kubenswrapper[4669]: I1008 20:45:26.334295 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:26 crc kubenswrapper[4669]: I1008 20:45:26.334321 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:26 crc kubenswrapper[4669]: I1008 20:45:26.334341 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:26 crc kubenswrapper[4669]: I1008 20:45:26.334365 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:26Z","lastTransitionTime":"2025-10-08T20:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:26 crc kubenswrapper[4669]: I1008 20:45:26.436558 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:26 crc kubenswrapper[4669]: I1008 20:45:26.436598 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:26 crc kubenswrapper[4669]: I1008 20:45:26.436609 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:26 crc kubenswrapper[4669]: I1008 20:45:26.436626 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:26 crc kubenswrapper[4669]: I1008 20:45:26.436637 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:26Z","lastTransitionTime":"2025-10-08T20:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:26 crc kubenswrapper[4669]: I1008 20:45:26.539825 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:26 crc kubenswrapper[4669]: I1008 20:45:26.539864 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:26 crc kubenswrapper[4669]: I1008 20:45:26.539873 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:26 crc kubenswrapper[4669]: I1008 20:45:26.539892 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:26 crc kubenswrapper[4669]: I1008 20:45:26.539907 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:26Z","lastTransitionTime":"2025-10-08T20:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:26 crc kubenswrapper[4669]: I1008 20:45:26.642254 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:26 crc kubenswrapper[4669]: I1008 20:45:26.642303 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:26 crc kubenswrapper[4669]: I1008 20:45:26.642317 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:26 crc kubenswrapper[4669]: I1008 20:45:26.642335 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:26 crc kubenswrapper[4669]: I1008 20:45:26.642348 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:26Z","lastTransitionTime":"2025-10-08T20:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:26 crc kubenswrapper[4669]: I1008 20:45:26.744755 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:26 crc kubenswrapper[4669]: I1008 20:45:26.744825 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:26 crc kubenswrapper[4669]: I1008 20:45:26.744847 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:26 crc kubenswrapper[4669]: I1008 20:45:26.744876 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:26 crc kubenswrapper[4669]: I1008 20:45:26.744899 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:26Z","lastTransitionTime":"2025-10-08T20:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:26 crc kubenswrapper[4669]: I1008 20:45:26.847160 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:26 crc kubenswrapper[4669]: I1008 20:45:26.847210 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:26 crc kubenswrapper[4669]: I1008 20:45:26.847222 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:26 crc kubenswrapper[4669]: I1008 20:45:26.847238 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:26 crc kubenswrapper[4669]: I1008 20:45:26.847249 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:26Z","lastTransitionTime":"2025-10-08T20:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:26 crc kubenswrapper[4669]: I1008 20:45:26.949779 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:26 crc kubenswrapper[4669]: I1008 20:45:26.949822 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:26 crc kubenswrapper[4669]: I1008 20:45:26.949834 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:26 crc kubenswrapper[4669]: I1008 20:45:26.949849 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:26 crc kubenswrapper[4669]: I1008 20:45:26.949860 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:26Z","lastTransitionTime":"2025-10-08T20:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.052928 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.052997 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.053011 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.053030 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.053044 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:27Z","lastTransitionTime":"2025-10-08T20:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.156053 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.156126 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.156146 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.156171 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.156187 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:27Z","lastTransitionTime":"2025-10-08T20:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.258969 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.259027 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.259045 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.259068 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.259084 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:27Z","lastTransitionTime":"2025-10-08T20:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.330757 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:45:27 crc kubenswrapper[4669]: E1008 20:45:27.330999 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ml9vv" podUID="f90eed21-8bc2-4723-b6be-a672669a36fb" Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.362327 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.362378 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.362390 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.362410 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.362423 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:27Z","lastTransitionTime":"2025-10-08T20:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.466047 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.466109 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.466131 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.466160 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.466179 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:27Z","lastTransitionTime":"2025-10-08T20:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.568133 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.568376 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.568444 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.568514 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.568639 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:27Z","lastTransitionTime":"2025-10-08T20:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.670807 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.671068 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.671145 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.671206 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.671270 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:27Z","lastTransitionTime":"2025-10-08T20:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.773674 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.773708 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.773719 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.773734 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.773745 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:27Z","lastTransitionTime":"2025-10-08T20:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.862271 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.875902 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.875974 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.875997 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.876028 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.876051 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:27Z","lastTransitionTime":"2025-10-08T20:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.879581 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c0e064d976a7c307fd13ec11ae76672cc1225b71a616f171626ee1f9a24531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:27Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.899825 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://408dd840918000b1689c3d828a51173deebf8d00fc97450975b35e5149d3cfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://334a09deac921308c4d6053bdcc2bbc096acc8ec24875190efb1c07b22d01c69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c03e0c827468d80fa326ee46ee88ad6adfe4236f4df9843324d2b247d0716087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92bc23ad705dcc8b8524159bc37254ce2306e7b502b914eaac7a6525fdd44f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed9a574189bcc7f84b93c5e821e944b0f94679084a30270d6634c7d19e67c470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79887114a7a0f726ed3bcb0b6d5e32e90cabfe64a4d5d13f6c1c0e2d01ae9f08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79887114a7a0f726ed3bcb0b6d5e32e90cabfe64a4d5d13f6c1c0e2d01ae9f08\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T20:45:15Z\\\",\\\"message\\\":\\\"to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-10-08T20:45:15Z is after 2025-08-24T17:21:41Z]\\\\nI1008 20:45:15.384024 6116 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI1008 20:45:15.383454 6116 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-gpzdw after 0 failed attempt(s)\\\\nI1008 20:45:15.384045 6116 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1008 20:45:15.384049 6116 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-gpzdw\\\\nI1008 20:45:15.384024 6116 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cn\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gpzdw_openshift-ovn-kubernetes(cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40a
c3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpzdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:27Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.917572 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d080d327-7e4d-41af-aa15-0ce849523815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8127834da98ef46a594a74cbfcc6ef779b8429046327546560b7b37085572c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f61c1793e6c95085b6964298f29b5f896451784046a6aee1c73bbda234a3bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd3bc937fc2e56c3d332e4d3822a2749d040c57cd94f6e3bcdcfd83c126bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33aff5ef2ae82f810d3b3e66effb80087fa92081419227e4fb66a6aa80468ff7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T20:45:02Z\\\"
,\\\"message\\\":\\\"W1008 20:44:56.871605 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 20:44:56.872089 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759956296 cert, and key in /tmp/serving-cert-3424828285/serving-signer.crt, /tmp/serving-cert-3424828285/serving-signer.key\\\\nI1008 20:44:57.365674 1 observer_polling.go:159] Starting file observer\\\\nW1008 20:45:02.381062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1008 20:45:02.381192 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 20:45:02.381876 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3424828285/tls.crt::/tmp/serving-cert-3424828285/tls.key\\\\\\\"\\\\nI1008 20:45:02.718633 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 20:45:02.726325 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 20:45:02.726358 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 20:45:02.726380 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 20:45:02.726384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 20:45:02.731456 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 20:45:02.731985 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 20:45:02.731867 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 20:45:02.733228 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d213380e32b3db218facfef313963d26689d2f0871d2a004a63380454fac8a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:27Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.934071 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:27Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.956793 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bfcvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09be2dce24bba0d88a36f2d85e6280e6806f9b6cf59ec3950513e976c97429e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d12e
f6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d12ef6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bfcvh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:27Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.966276 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-flswm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"609156f9-39b1-4330-83a2-eabf82f4228f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749545f8f4b6269a70b747fee79dc8d419b62054f507b0d819b63aa68c44bb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-flswm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:27Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.977744 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b822af4b-b157-4b05-9af4-7798315f365f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d615b49ade5de43393d40344c1b71733acedb541841b3ec34d6dd293e62f96c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07c33fe9c40fb9b53e940940c3fe2b8b63a94b0f867aa804d215cb3ba90d01c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569ba2e70b947eea1e531ab7e8f1ac2e3441ade593dd48910407df766217d87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f2d8af11793121a84b4559833f410bd59a8bb122d88da0d3b55d7dcbbf57a9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:27Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.978437 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.978475 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.978483 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.978498 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.978508 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:27Z","lastTransitionTime":"2025-10-08T20:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:27 crc kubenswrapper[4669]: I1008 20:45:27.996657 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0319f7-8ee3-4392-a36a-419161391db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7f52f3d22574d0a01cdfd7b7a40caf1a6cf201dc719e35f40eae85a071286f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3064f5dde5317ed6c1dba4ecdcf4da81c2451262d83e3e2826c6ebbfe1487ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13697e6470d481451982948653db44d08baa70466d010442534eaa249e58bae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c88256a72c667695563af6b37d01d958621c1ca6cbdaf474364bd6c8128c4409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be
30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6efb0bccc51deff2303655e7a8d3a6261a8b3c9425f6d94120cd1acf27fd7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\
",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:
44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:27Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.009073 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:28Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.020397 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c2b44fd8fb3c01bbc8a1b2f5a3507af28b2aa79a3d6ab8e7de3945bbfd01e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a472679f03ab86aa0a31a2ff3affe48d8e289a76db949bcc6ea10446fd08fdd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:28Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.035155 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-klx9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2433400c-98f8-490f-a566-00a330a738fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863b0630ebde7534e93ebf2952dab729566760278539e87efa4412389803c5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-klx9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:28Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.046110 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c397b74921593a42fb7626e545778d80c506f0bbce7bc425b75c77a222c770e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-10-08T20:45:28Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.054984 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zcf2d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a016bee1-2c29-46bb-b3b8-841c4a65e162\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fa5b9befc8fb3a83cb6dd6097014bfe9fd0b905b4bf8fbdcccd4fdfb62ab410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernete
s.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flsl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zcf2d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:28Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.063926 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c9bcf2-9580-4534-8c7e-886bd4aff469\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d8b81cfea1e9e0
c9b30427e8b8cb07b463c6ef45afb8379aa006d71bccd82a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bd09b1fcc78173d03292522a284e68e59f374def13fd6830f24a31e1138c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"start
Time\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw2kf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:28Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.073594 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bl6pv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ac60c10-afa3-424e-9aa2-060e32f4a40f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91277ee733ac8aee89b1a7716b6dcebf57e7d24e5cab5615d88ac8ff90f6f5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c0
8aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbw65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e8bd9bc559c9623c06aa2f0324a6679f5d241e881db918904f3e1e97d56a20f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbw65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bl6pv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:28Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.081214 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.081254 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.081266 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.081282 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.081293 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:28Z","lastTransitionTime":"2025-10-08T20:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.083305 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ml9vv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f90eed21-8bc2-4723-b6be-a672669a36fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bh59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bh59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ml9vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:28Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:28 crc 
kubenswrapper[4669]: I1008 20:45:28.094102 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:28Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.183757 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.183797 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.183806 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.183821 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.183833 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:28Z","lastTransitionTime":"2025-10-08T20:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.286421 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.286464 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.286477 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.286499 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.286512 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:28Z","lastTransitionTime":"2025-10-08T20:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.329704 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.329765 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.329704 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:45:28 crc kubenswrapper[4669]: E1008 20:45:28.329827 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:45:28 crc kubenswrapper[4669]: E1008 20:45:28.329973 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:45:28 crc kubenswrapper[4669]: E1008 20:45:28.330054 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.389622 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.389692 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.389704 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.389720 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.389732 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:28Z","lastTransitionTime":"2025-10-08T20:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.492401 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.492467 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.492484 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.492507 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.492545 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:28Z","lastTransitionTime":"2025-10-08T20:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.594349 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.594386 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.594396 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.594408 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.594474 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:28Z","lastTransitionTime":"2025-10-08T20:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.696427 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.696503 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.696514 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.696546 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.696555 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:28Z","lastTransitionTime":"2025-10-08T20:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.798642 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.798730 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.798744 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.798763 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.798776 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:28Z","lastTransitionTime":"2025-10-08T20:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.902200 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.902260 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.902269 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.902290 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:28 crc kubenswrapper[4669]: I1008 20:45:28.902300 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:28Z","lastTransitionTime":"2025-10-08T20:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.005082 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.005134 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.005145 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.005166 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.005178 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:29Z","lastTransitionTime":"2025-10-08T20:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.108189 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.108238 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.108249 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.108270 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.108283 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:29Z","lastTransitionTime":"2025-10-08T20:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.211631 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.211687 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.211702 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.211726 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.211743 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:29Z","lastTransitionTime":"2025-10-08T20:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.287237 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.287328 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.287345 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.287371 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.287387 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:29Z","lastTransitionTime":"2025-10-08T20:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:29 crc kubenswrapper[4669]: E1008 20:45:29.303111 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf950064-edbb-4bec-8a75-ab8d963fcdb3\\\",\\\"systemUUID\\\":\\\"527fa759-e25f-4fb3-8304-f30dbff0c847\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:29Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.308560 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.308634 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.308646 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.308667 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.308681 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:29Z","lastTransitionTime":"2025-10-08T20:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:29 crc kubenswrapper[4669]: E1008 20:45:29.322728 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf950064-edbb-4bec-8a75-ab8d963fcdb3\\\",\\\"systemUUID\\\":\\\"527fa759-e25f-4fb3-8304-f30dbff0c847\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:29Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.327413 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.327475 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.327487 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.327507 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.327521 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:29Z","lastTransitionTime":"2025-10-08T20:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.330473 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:45:29 crc kubenswrapper[4669]: E1008 20:45:29.330639 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ml9vv" podUID="f90eed21-8bc2-4723-b6be-a672669a36fb" Oct 08 20:45:29 crc kubenswrapper[4669]: E1008 20:45:29.342419 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf950064-edbb-4bec-8a75-ab8d963fcdb3\\\",\\\"systemUUID\\\":\\\"527fa759-e25f-4fb3-8304-f30dbff0c847\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:29Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.347311 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.347368 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.347382 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.347403 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.347416 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:29Z","lastTransitionTime":"2025-10-08T20:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:29 crc kubenswrapper[4669]: E1008 20:45:29.365321 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf950064-edbb-4bec-8a75-ab8d963fcdb3\\\",\\\"systemUUID\\\":\\\"527fa759-e25f-4fb3-8304-f30dbff0c847\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:29Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.370243 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.370303 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.370314 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.370336 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.370351 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:29Z","lastTransitionTime":"2025-10-08T20:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:29 crc kubenswrapper[4669]: E1008 20:45:29.391138 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf950064-edbb-4bec-8a75-ab8d963fcdb3\\\",\\\"systemUUID\\\":\\\"527fa759-e25f-4fb3-8304-f30dbff0c847\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:29Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:29 crc kubenswrapper[4669]: E1008 20:45:29.391302 4669 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.393325 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.393373 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.393385 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.393408 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.393422 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:29Z","lastTransitionTime":"2025-10-08T20:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.496688 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.496746 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.496758 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.496775 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.496789 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:29Z","lastTransitionTime":"2025-10-08T20:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.599578 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.599623 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.599634 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.599650 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.599662 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:29Z","lastTransitionTime":"2025-10-08T20:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.702468 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.702507 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.702518 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.702567 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.702582 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:29Z","lastTransitionTime":"2025-10-08T20:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.805470 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.805609 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.805637 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.805666 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.805687 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:29Z","lastTransitionTime":"2025-10-08T20:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.908013 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.908091 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.908110 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.908134 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:29 crc kubenswrapper[4669]: I1008 20:45:29.908150 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:29Z","lastTransitionTime":"2025-10-08T20:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:30 crc kubenswrapper[4669]: I1008 20:45:30.011276 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:30 crc kubenswrapper[4669]: I1008 20:45:30.011333 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:30 crc kubenswrapper[4669]: I1008 20:45:30.011345 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:30 crc kubenswrapper[4669]: I1008 20:45:30.011367 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:30 crc kubenswrapper[4669]: I1008 20:45:30.011380 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:30Z","lastTransitionTime":"2025-10-08T20:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:30 crc kubenswrapper[4669]: I1008 20:45:30.114578 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:30 crc kubenswrapper[4669]: I1008 20:45:30.114646 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:30 crc kubenswrapper[4669]: I1008 20:45:30.114659 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:30 crc kubenswrapper[4669]: I1008 20:45:30.114680 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:30 crc kubenswrapper[4669]: I1008 20:45:30.114692 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:30Z","lastTransitionTime":"2025-10-08T20:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:30 crc kubenswrapper[4669]: I1008 20:45:30.218711 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:30 crc kubenswrapper[4669]: I1008 20:45:30.218778 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:30 crc kubenswrapper[4669]: I1008 20:45:30.218792 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:30 crc kubenswrapper[4669]: I1008 20:45:30.218814 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:30 crc kubenswrapper[4669]: I1008 20:45:30.218830 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:30Z","lastTransitionTime":"2025-10-08T20:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:30 crc kubenswrapper[4669]: I1008 20:45:30.321745 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:30 crc kubenswrapper[4669]: I1008 20:45:30.321828 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:30 crc kubenswrapper[4669]: I1008 20:45:30.321842 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:30 crc kubenswrapper[4669]: I1008 20:45:30.321864 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:30 crc kubenswrapper[4669]: I1008 20:45:30.321879 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:30Z","lastTransitionTime":"2025-10-08T20:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:30 crc kubenswrapper[4669]: I1008 20:45:30.330072 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:45:30 crc kubenswrapper[4669]: I1008 20:45:30.330116 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:45:30 crc kubenswrapper[4669]: I1008 20:45:30.330237 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:45:30 crc kubenswrapper[4669]: E1008 20:45:30.330347 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:45:30 crc kubenswrapper[4669]: E1008 20:45:30.330249 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:45:30 crc kubenswrapper[4669]: E1008 20:45:30.330498 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:45:30 crc kubenswrapper[4669]: I1008 20:45:30.424274 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:30 crc kubenswrapper[4669]: I1008 20:45:30.424978 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:30 crc kubenswrapper[4669]: I1008 20:45:30.424991 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:30 crc kubenswrapper[4669]: I1008 20:45:30.425009 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:30 crc kubenswrapper[4669]: I1008 20:45:30.425022 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:30Z","lastTransitionTime":"2025-10-08T20:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:30 crc kubenswrapper[4669]: I1008 20:45:30.527207 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:30 crc kubenswrapper[4669]: I1008 20:45:30.527265 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:30 crc kubenswrapper[4669]: I1008 20:45:30.527275 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:30 crc kubenswrapper[4669]: I1008 20:45:30.527287 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:30 crc kubenswrapper[4669]: I1008 20:45:30.527296 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:30Z","lastTransitionTime":"2025-10-08T20:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:30 crc kubenswrapper[4669]: I1008 20:45:30.629843 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:30 crc kubenswrapper[4669]: I1008 20:45:30.629892 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:30 crc kubenswrapper[4669]: I1008 20:45:30.629910 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:30 crc kubenswrapper[4669]: I1008 20:45:30.629926 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:30 crc kubenswrapper[4669]: I1008 20:45:30.629937 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:30Z","lastTransitionTime":"2025-10-08T20:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:30 crc kubenswrapper[4669]: I1008 20:45:30.732063 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:30 crc kubenswrapper[4669]: I1008 20:45:30.732148 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:30 crc kubenswrapper[4669]: I1008 20:45:30.732182 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:30 crc kubenswrapper[4669]: I1008 20:45:30.732212 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:30 crc kubenswrapper[4669]: I1008 20:45:30.732232 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:30Z","lastTransitionTime":"2025-10-08T20:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:30 crc kubenswrapper[4669]: I1008 20:45:30.834126 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:30 crc kubenswrapper[4669]: I1008 20:45:30.834197 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:30 crc kubenswrapper[4669]: I1008 20:45:30.834208 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:30 crc kubenswrapper[4669]: I1008 20:45:30.834225 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:30 crc kubenswrapper[4669]: I1008 20:45:30.834235 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:30Z","lastTransitionTime":"2025-10-08T20:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:30 crc kubenswrapper[4669]: I1008 20:45:30.936811 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:30 crc kubenswrapper[4669]: I1008 20:45:30.936879 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:30 crc kubenswrapper[4669]: I1008 20:45:30.936901 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:30 crc kubenswrapper[4669]: I1008 20:45:30.936931 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:30 crc kubenswrapper[4669]: I1008 20:45:30.936953 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:30Z","lastTransitionTime":"2025-10-08T20:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.039792 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.039852 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.039870 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.039894 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.039910 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:31Z","lastTransitionTime":"2025-10-08T20:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.142279 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.142318 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.142327 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.142342 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.142353 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:31Z","lastTransitionTime":"2025-10-08T20:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.244945 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.245053 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.245074 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.245119 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.245143 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:31Z","lastTransitionTime":"2025-10-08T20:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.330443 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:45:31 crc kubenswrapper[4669]: E1008 20:45:31.330602 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ml9vv" podUID="f90eed21-8bc2-4723-b6be-a672669a36fb" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.348971 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.349057 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.349076 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.349101 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.349125 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:31Z","lastTransitionTime":"2025-10-08T20:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.352698 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-klx9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2433400c-98f8-490f-a566-00a330a738fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863b0630ebde7534e93ebf2952dab729566760278539e87efa4412389803c5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-klx9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:31Z 
is after 2025-08-24T17:21:41Z" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.382402 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0319f7-8ee3-4392-a36a-419161391db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7f52f3d22574d0a01cdfd7b7a40caf1a6cf201dc719e35f40eae85a071286f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\
\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3064f5dde5317ed6c1dba4ecdcf4da81c2451262d83e3e2826c6ebbfe1487ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13697e6470d481451982948653db44d08baa70466d010442534eaa249e58bae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c88256a72c667695563af6b37d01d958621c1ca6cbdaf474364bd6c8128c4409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6efb0bccc51deff2303655e7a8d3a6261a8b3c9425f6d94120cd1acf27fd7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10
-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:31Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.402558 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:31Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.420573 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c2b44fd8fb3c01bbc8a1b2f5a3507af28b2aa79a3d6ab8e7de3945bbfd01e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a472679f03ab86aa0a31a2ff3affe48d8e289a76db949bcc6ea10446fd08fdd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:31Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.441061 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bl6pv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ac60c10-afa3-424e-9aa2-060e32f4a40f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91277ee733ac8aee89b1a7716b6dcebf57e7d24e5cab5615d88ac8ff90f6f5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbw65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e8bd9bc559c9623c06aa2f0324a6679f5d24
1e881db918904f3e1e97d56a20f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbw65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bl6pv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:31Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.451427 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.451483 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.451496 4669 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.451517 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.451749 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:31Z","lastTransitionTime":"2025-10-08T20:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.455105 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ml9vv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f90eed21-8bc2-4723-b6be-a672669a36fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bh59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bh59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ml9vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:31Z is after 2025-08-24T17:21:41Z" Oct 
08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.468126 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:31Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.483255 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c397b74921593a42fb7626e545778d80c506f0bbce7bc425b75c77a222c770e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T20:45:31Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.494340 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zcf2d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a016bee1-2c29-46bb-b3b8-841c4a65e162\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fa5b9befc8fb3a83cb6dd6097014bfe9fd0b905b4bf8fbdcccd4fdfb62ab410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-flsl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zcf2d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:31Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.507201 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c9bcf2-9580-4534-8c7e-886bd4aff469\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d8b81cfea1e9e0c9b30427e8b8cb07
b463c6ef45afb8379aa006d71bccd82a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bd09b1fcc78173d03292522a284e68e59f374def13fd6830f24a31e1138c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
5-10-08T20:45:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw2kf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:31Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.522573 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d080d327-7e4d-41af-aa15-0ce849523815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8127834da98ef46a594a74cbfcc6ef779b8429046327546560b7b37085572c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f61c1793e6c95085b6964298f29b5f896451784046a6aee1c73bbda234a3bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd3bc937fc2e56c3d332e4d3822a2749d040c57cd94f6e3bcdcfd83c126bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]
},{\\\"containerID\\\":\\\"cri-o://33aff5ef2ae82f810d3b3e66effb80087fa92081419227e4fb66a6aa80468ff7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"W1008 20:44:56.871605 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 20:44:56.872089 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759956296 cert, and key in /tmp/serving-cert-3424828285/serving-signer.crt, /tmp/serving-cert-3424828285/serving-signer.key\\\\nI1008 20:44:57.365674 1 observer_polling.go:159] Starting file observer\\\\nW1008 20:45:02.381062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1008 20:45:02.381192 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 20:45:02.381876 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3424828285/tls.crt::/tmp/serving-cert-3424828285/tls.key\\\\\\\"\\\\nI1008 20:45:02.718633 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 20:45:02.726325 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 20:45:02.726358 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 20:45:02.726380 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 
20:45:02.726384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 20:45:02.731456 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 20:45:02.731985 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 20:45:02.731867 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 20:45:02.733228 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d213380e32b3db218facfef313963d26689d2f0871d2a004a63380454fac8a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4
ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:31Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.533765 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c0e064d976a7c307fd13ec11ae76672cc1225b71a616f171626ee1f9a24531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:31Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.549632 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://408dd840918000b1689c3d828a51173deebf8d00fc97450975b35e5149d3cfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://334a09deac921308c4d6053bdcc2bbc096acc8ec24875190efb1c07b22d01c69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c03e0c827468d80fa326ee46ee88ad6adfe4236f4df9843324d2b247d0716087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92bc23ad705dcc8b8524159bc37254ce2306e7b502b914eaac7a6525fdd44f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed9a574189bcc7f84b93c5e821e944b0f94679084a30270d6634c7d19e67c470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79887114a7a0f726ed3bcb0b6d5e32e90cabfe64a4d5d13f6c1c0e2d01ae9f08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79887114a7a0f726ed3bcb0b6d5e32e90cabfe64a4d5d13f6c1c0e2d01ae9f08\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T20:45:15Z\\\",\\\"message\\\":\\\"to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-10-08T20:45:15Z is after 2025-08-24T17:21:41Z]\\\\nI1008 20:45:15.384024 6116 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI1008 20:45:15.383454 6116 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-gpzdw after 0 failed attempt(s)\\\\nI1008 20:45:15.384045 6116 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1008 20:45:15.384049 6116 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-gpzdw\\\\nI1008 20:45:15.384024 6116 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cn\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gpzdw_openshift-ovn-kubernetes(cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40a
c3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpzdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:31Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.554592 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.554669 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.554692 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.554725 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.554749 4669 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:31Z","lastTransitionTime":"2025-10-08T20:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.559927 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-flswm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"609156f9-39b1-4330-83a2-eabf82f4228f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749545f8f4b6269a70b747fee79dc8d419b62054f507b0d819b63aa68c44bb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-flswm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:31Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.572108 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b822af4b-b157-4b05-9af4-7798315f365f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d615b49ade5de43393d40344c1b71733acedb541841b3ec34d6dd293e62f96c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07c33fe9c40fb9b53e940940c3fe2b8b63a94b0f867aa804d215cb3ba90d01c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569ba2e70b947eea1e531ab7e8f1ac2e3441ade593dd48910407df766217d87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f2d8af11793121a84b4559833f410bd59a8bb122d88da0d3b55d7dcbbf57a9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:31Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.582239 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:31Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.594740 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bfcvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09be2dce24bba0d88a36f2d85e6280e6806f9b6cf59ec3950513e976c97429e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d12e
f6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d12ef6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bfcvh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:31Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.657399 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.657513 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.657564 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.657594 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.657615 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:31Z","lastTransitionTime":"2025-10-08T20:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.760369 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.760405 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.760413 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.760431 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.760442 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:31Z","lastTransitionTime":"2025-10-08T20:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.862918 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.862947 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.862957 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.862973 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.862983 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:31Z","lastTransitionTime":"2025-10-08T20:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.966643 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.966725 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.966742 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.967202 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:31 crc kubenswrapper[4669]: I1008 20:45:31.967261 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:31Z","lastTransitionTime":"2025-10-08T20:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.070667 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.070726 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.070744 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.070767 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.070781 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:32Z","lastTransitionTime":"2025-10-08T20:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.173908 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.173976 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.173995 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.174021 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.174039 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:32Z","lastTransitionTime":"2025-10-08T20:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.276109 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.276151 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.276161 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.276175 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.276184 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:32Z","lastTransitionTime":"2025-10-08T20:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.330734 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.330775 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.331194 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:45:32 crc kubenswrapper[4669]: E1008 20:45:32.331313 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:45:32 crc kubenswrapper[4669]: E1008 20:45:32.331458 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:45:32 crc kubenswrapper[4669]: E1008 20:45:32.331694 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.331712 4669 scope.go:117] "RemoveContainer" containerID="79887114a7a0f726ed3bcb0b6d5e32e90cabfe64a4d5d13f6c1c0e2d01ae9f08" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.378377 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.378428 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.378439 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.378464 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.378479 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:32Z","lastTransitionTime":"2025-10-08T20:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.481685 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.481726 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.481739 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.481760 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.481772 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:32Z","lastTransitionTime":"2025-10-08T20:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.585190 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.585246 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.585256 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.585275 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.585287 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:32Z","lastTransitionTime":"2025-10-08T20:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.660291 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gpzdw_cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7/ovnkube-controller/1.log" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.663185 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" event={"ID":"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7","Type":"ContainerStarted","Data":"6b2057662a5102fd118da847c6e54ac823bfe6b76443d347d2b60a7c8728d6d7"} Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.663931 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.679266 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:32Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.687642 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.687681 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.687690 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 
20:45:32.687709 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.687722 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:32Z","lastTransitionTime":"2025-10-08T20:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.700206 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c397b74921593a42fb7626e545778d80c506f0bbce7bc425b75c77a222c770e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:32Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.719071 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zcf2d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a016bee1-2c29-46bb-b3b8-841c4a65e162\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fa5b9befc8fb3a83cb6dd6097014bfe9fd0b905b4bf8fbdcccd4fdfb62ab410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flsl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zcf2d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:32Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.741307 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c9bcf2-9580-4534-8c7e-886bd4aff469\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d8b81cfea1e9e0c9b30427e8b8cb07b463c6ef45afb8379aa006d71bccd82a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bd09b1fcc78173d03292522a284e68e59f374def13fd6830f24a31e1138c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw2kf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:32Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.765249 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bl6pv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ac60c10-afa3-424e-9aa2-060e32f4a40f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91277ee733ac8aee89b1a7716b6dcebf57e7d24e5cab5615d88ac8ff90f6f5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":
\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbw65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e8bd9bc559c9623c06aa2f0324a6679f5d241e881db918904f3e1e97d56a20f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbw65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bl6pv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:32Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.781747 4669 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ml9vv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f90eed21-8bc2-4723-b6be-a672669a36fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bh59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bh59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ml9vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:32Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:32 crc 
kubenswrapper[4669]: I1008 20:45:32.794791 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.794850 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.794860 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.794880 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.794893 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:32Z","lastTransitionTime":"2025-10-08T20:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.802722 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d080d327-7e4d-41af-aa15-0ce849523815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8127834da98ef46a594a74cbfcc6ef779b8429046327546560b7b37085572c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f61c1793e6c95085b6964298f29b5f896451784046a6aee1c73bbda234a3bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd3bc937fc2e56c3d332e4d3822a2749d040c57cd94f6e3bcdcfd83c126bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33aff5ef2ae82f810d3b3e66effb80087fa92081419227e4fb66a6aa80468ff7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"W1008 20:44:56.871605 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 20:44:56.872089 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759956296 cert, and key in /tmp/serving-cert-3424828285/serving-signer.crt, /tmp/serving-cert-3424828285/serving-signer.key\\\\nI1008 20:44:57.365674 1 observer_polling.go:159] Starting file observer\\\\nW1008 20:45:02.381062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1008 20:45:02.381192 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 20:45:02.381876 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3424828285/tls.crt::/tmp/serving-cert-3424828285/tls.key\\\\\\\"\\\\nI1008 20:45:02.718633 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 20:45:02.726325 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 20:45:02.726358 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 20:45:02.726380 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 20:45:02.726384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 20:45:02.731456 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 20:45:02.731985 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 20:45:02.731867 1 genericapiserver.go:533] MuxAndDiscoveryComplete 
has all endpoints registered and discovery information is complete\\\\nF1008 20:45:02.733228 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d213380e32b3db218facfef313963d26689d2f0871d2a004a63380454fac8a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f
15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:32Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.821769 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c0e064d976a7c307fd13ec11ae76672cc1225b71a616f171626ee1f9a24531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:32Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.845645 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://408dd840918000b1689c3d828a51173deebf8d00fc97450975b35e5149d3cfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://334a09deac921308c4d6053bdcc2bbc096acc8ec24875190efb1c07b22d01c69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c03e0c827468d80fa326ee46ee88ad6adfe4236f4df9843324d2b247d0716087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92bc23ad705dcc8b8524159bc37254ce2306e7b502b914eaac7a6525fdd44f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed9a574189bcc7f84b93c5e821e944b0f94679084a30270d6634c7d19e67c470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2057662a5102fd118da847c6e54ac823bfe6b76443d347d2b60a7c8728d6d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79887114a7a0f726ed3bcb0b6d5e32e90cabfe64a4d5d13f6c1c0e2d01ae9f08\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T20:45:15Z\\\",\\\"message\\\":\\\"to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-10-08T20:45:15Z is after 2025-08-24T17:21:41Z]\\\\nI1008 20:45:15.384024 6116 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI1008 20:45:15.383454 6116 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-gpzdw after 0 failed attempt(s)\\\\nI1008 20:45:15.384045 6116 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1008 20:45:15.384049 6116 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-gpzdw\\\\nI1008 20:45:15.384024 6116 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] 
Timeout:\\\\u003cn\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/
\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpzdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:32Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.860214 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b822af4b-b157-4b05-9af4-7798315f365f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d615b49ade5de43393d40344c1b71733acedb541841b3ec34d6dd293e62f96c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07c33fe9c40fb9b53e940940c3fe2b8b63a94b0f867aa804d215cb3ba90d01c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569ba2e70b947eea1e531ab7e8f1ac2e3441ade593dd48910407df766217d87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f2d8af11793121a84b4559833f410bd59a8bb122d88da0d3b55d7dcbbf57a9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:32Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.876232 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:32Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.892608 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bfcvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09be2dce24bba0d88a36f2d85e6280e6806f9b6cf59ec3950513e976c97429e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d12e
f6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d12ef6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bfcvh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:32Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.897038 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.897070 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.897080 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.897094 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.897106 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:32Z","lastTransitionTime":"2025-10-08T20:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.904148 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-flswm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"609156f9-39b1-4330-83a2-eabf82f4228f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749545f8f4b6269a70b747fee79dc8d419b62054f507b0d819b63aa68c44bb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-flswm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:32Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.924389 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0319f7-8ee3-4392-a36a-419161391db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7f52f3d22574d0a01cdfd7b7a40caf1a6cf201dc719e35f40eae85a071286f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3064f5dde5317ed6c1dba4ecdcf4da81c2451262d83e3e2826c6ebbfe1487ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13697e6470d481451982948653db44d08baa70466d010442534eaa249e58bae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1
847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c88256a72c667695563af6b37d01d958621c1ca6cbdaf474364bd6c8128c4409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6efb0bccc51deff2303655e7a8d3a6261a8b3c9425f6d94120cd1acf27fd7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://92500
bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:32Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.936610 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:32Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.950438 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c2b44fd8fb3c01bbc8a1b2f5a3507af28b2aa79a3d6ab8e7de3945bbfd01e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a472679f03ab86aa0a31a2ff3affe48d8e289a76db949bcc6ea10446fd08fdd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:32Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:32 crc kubenswrapper[4669]: I1008 20:45:32.966488 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-klx9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2433400c-98f8-490f-a566-00a330a738fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863b0630ebde7534e93ebf2952dab729566760278539e87efa4412389803c5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-klx9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:32Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.000144 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:33 crc 
kubenswrapper[4669]: I1008 20:45:33.000236 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.000251 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.000273 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.000290 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:33Z","lastTransitionTime":"2025-10-08T20:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.103376 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.103424 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.103436 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.103455 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.103469 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:33Z","lastTransitionTime":"2025-10-08T20:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.206522 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.206589 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.206599 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.206630 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.206639 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:33Z","lastTransitionTime":"2025-10-08T20:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.309931 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.309972 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.309983 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.309998 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.310010 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:33Z","lastTransitionTime":"2025-10-08T20:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.330462 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:45:33 crc kubenswrapper[4669]: E1008 20:45:33.330653 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ml9vv" podUID="f90eed21-8bc2-4723-b6be-a672669a36fb" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.412478 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.412578 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.412603 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.412632 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.412652 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:33Z","lastTransitionTime":"2025-10-08T20:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.515399 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.515440 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.515452 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.515467 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.515479 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:33Z","lastTransitionTime":"2025-10-08T20:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.587449 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f90eed21-8bc2-4723-b6be-a672669a36fb-metrics-certs\") pod \"network-metrics-daemon-ml9vv\" (UID: \"f90eed21-8bc2-4723-b6be-a672669a36fb\") " pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:45:33 crc kubenswrapper[4669]: E1008 20:45:33.587652 4669 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 20:45:33 crc kubenswrapper[4669]: E1008 20:45:33.587747 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f90eed21-8bc2-4723-b6be-a672669a36fb-metrics-certs podName:f90eed21-8bc2-4723-b6be-a672669a36fb nodeName:}" failed. No retries permitted until 2025-10-08 20:45:49.587725929 +0000 UTC m=+69.280536622 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f90eed21-8bc2-4723-b6be-a672669a36fb-metrics-certs") pod "network-metrics-daemon-ml9vv" (UID: "f90eed21-8bc2-4723-b6be-a672669a36fb") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.618160 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.618207 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.618218 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.618237 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.618250 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:33Z","lastTransitionTime":"2025-10-08T20:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.669393 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gpzdw_cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7/ovnkube-controller/2.log" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.670466 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gpzdw_cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7/ovnkube-controller/1.log" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.674357 4669 generic.go:334] "Generic (PLEG): container finished" podID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerID="6b2057662a5102fd118da847c6e54ac823bfe6b76443d347d2b60a7c8728d6d7" exitCode=1 Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.674423 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" event={"ID":"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7","Type":"ContainerDied","Data":"6b2057662a5102fd118da847c6e54ac823bfe6b76443d347d2b60a7c8728d6d7"} Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.674488 4669 scope.go:117] "RemoveContainer" containerID="79887114a7a0f726ed3bcb0b6d5e32e90cabfe64a4d5d13f6c1c0e2d01ae9f08" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.675641 4669 scope.go:117] "RemoveContainer" containerID="6b2057662a5102fd118da847c6e54ac823bfe6b76443d347d2b60a7c8728d6d7" Oct 08 20:45:33 crc kubenswrapper[4669]: E1008 20:45:33.675938 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-gpzdw_openshift-ovn-kubernetes(cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.696725 4669 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/multus-additional-cni-plugins-bfcvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09be2dce24bba0d88a36f2d85e6280e6806f9b6cf59ec3950513e976c97429e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\
\\":[{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10
-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d12ef6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d12ef6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71
330c8bd87e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for 
pod \"openshift-multus\"/\"multus-additional-cni-plugins-bfcvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:33Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.712175 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-flswm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"609156f9-39b1-4330-83a2-eabf82f4228f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749545f8f4b6269a70b747fee79dc8d419b62054f507b0d819b63aa68c44bb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-flswm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:33Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.720799 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.720829 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.720839 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.720854 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.720865 4669 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:33Z","lastTransitionTime":"2025-10-08T20:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.729726 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b822af4b-b157-4b05-9af4-7798315f365f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d615b49ade5de43393d40344c1b71733acedb541841b3ec34d6dd293e62f96c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07c33fe9c40fb9b53e940940c3fe2b8b63a94b0f867aa804d215cb3ba90d01c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569ba2e70b947eea1e531ab7e8f1ac2e3441ade593dd48910407df766217d87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://8f2d8af11793121a84b4559833f410bd59a8bb122d88da0d3b55d7dcbbf57a9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:33Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.748199 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:33Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.767178 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c2b44fd8fb3c01bbc8a1b2f5a3507af28b2aa79a3d6ab8e7de3945bbfd01e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a472679f03ab86aa0a31a2ff3affe48d8e289a76db949bcc6ea10446fd08fdd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:33Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.786616 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-klx9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2433400c-98f8-490f-a566-00a330a738fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863b0630ebde7534e93ebf2952dab729566760278539e87efa4412389803c5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-klx9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:33Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.808272 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0319f7-8ee3-4392-a36a-419161391db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7f52f3d22574d0a01cdfd7b7a40caf1a6cf201dc719e35f40eae85a071286f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3064f5dde5317ed6c1dba4ecdcf4da81c2451262d83e3e2826c6ebbfe1487ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13697e6470d481451982948653db44d08baa70466d010442534eaa249e58bae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c88256a72c667695563af6b37d01d958621c1ca6cbdaf474364bd6c8128c4409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6efb0bccc51deff2303655e7a8d3a6261a8b3c9425f6d94120cd1acf27fd7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:33Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.823589 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:33Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.824005 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.824055 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.824111 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:33 crc 
kubenswrapper[4669]: I1008 20:45:33.824146 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.824167 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:33Z","lastTransitionTime":"2025-10-08T20:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.837254 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c9bcf2-9580-4534-8c7e-886bd4aff469\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d8b81cfea1e9e0c9b30427e8b8cb07b463c6ef45afb8379aa006d71bccd82a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bd09b1fcc78173d03292522a284e68e59f374def13fd6830f24a31e1138c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-hw2kf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:33Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.851333 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bl6pv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ac60c10-afa3-424e-9aa2-060e32f4a40f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91277ee733ac8aee89b1a7716b6dcebf57e7d24e5cab5615d88ac8ff90f6f5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"k
ube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbw65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e8bd9bc559c9623c06aa2f0324a6679f5d241e881db918904f3e1e97d56a20f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbw65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bl6pv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:33Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.867388 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ml9vv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f90eed21-8bc2-4723-b6be-a672669a36fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bh59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bh59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ml9vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:33Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:33 crc 
kubenswrapper[4669]: I1008 20:45:33.888011 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:33Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.904344 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c397b74921593a42fb7626e545778d80c506f0bbce7bc425b75c77a222c770e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T20:45:33Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.918312 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zcf2d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a016bee1-2c29-46bb-b3b8-841c4a65e162\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fa5b9befc8fb3a83cb6dd6097014bfe9fd0b905b4bf8fbdcccd4fdfb62ab410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-flsl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zcf2d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:33Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.926626 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.926703 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.926726 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.926757 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.926779 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:33Z","lastTransitionTime":"2025-10-08T20:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.940646 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d080d327-7e4d-41af-aa15-0ce849523815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8127834da98ef46a594a74cbfcc6ef779b8429046327546560b7b37085572c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f61c1793e6c95085b6964298f29b5f896451784046a6aee1c73bbda234a3bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd3bc937fc2e56c3d332e4d3822a2749d040c57cd94f6e3bcdcfd83c126bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33aff5ef2ae82f810d3b3e66effb80087fa92081419227e4fb66a6aa80468ff7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"W1008 20:44:56.871605 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 20:44:56.872089 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759956296 cert, and key in /tmp/serving-cert-3424828285/serving-signer.crt, /tmp/serving-cert-3424828285/serving-signer.key\\\\nI1008 20:44:57.365674 1 observer_polling.go:159] Starting file observer\\\\nW1008 20:45:02.381062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1008 20:45:02.381192 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 20:45:02.381876 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3424828285/tls.crt::/tmp/serving-cert-3424828285/tls.key\\\\\\\"\\\\nI1008 20:45:02.718633 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 20:45:02.726325 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 20:45:02.726358 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 20:45:02.726380 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 20:45:02.726384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 20:45:02.731456 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 20:45:02.731985 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 20:45:02.731867 1 genericapiserver.go:533] MuxAndDiscoveryComplete 
has all endpoints registered and discovery information is complete\\\\nF1008 20:45:02.733228 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d213380e32b3db218facfef313963d26689d2f0871d2a004a63380454fac8a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f
15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:33Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.956822 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c0e064d976a7c307fd13ec11ae76672cc1225b71a616f171626ee1f9a24531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:33Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:33 crc kubenswrapper[4669]: I1008 20:45:33.975000 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://408dd840918000b1689c3d828a51173deebf8d00fc97450975b35e5149d3cfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://334a09deac921308c4d6053bdcc2bbc096acc8ec24875190efb1c07b22d01c69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c03e0c827468d80fa326ee46ee88ad6adfe4236f4df9843324d2b247d0716087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92bc23ad705dcc8b8524159bc37254ce2306e7b502b914eaac7a6525fdd44f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed9a574189bcc7f84b93c5e821e944b0f94679084a30270d6634c7d19e67c470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2057662a5102fd118da847c6e54ac823bfe6b76443d347d2b60a7c8728d6d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79887114a7a0f726ed3bcb0b6d5e32e90cabfe64a4d5d13f6c1c0e2d01ae9f08\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T20:45:15Z\\\",\\\"message\\\":\\\"to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-10-08T20:45:15Z is after 2025-08-24T17:21:41Z]\\\\nI1008 20:45:15.384024 6116 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI1008 20:45:15.383454 6116 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-gpzdw after 0 failed attempt(s)\\\\nI1008 20:45:15.384045 6116 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1008 20:45:15.384049 6116 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-gpzdw\\\\nI1008 20:45:15.384024 6116 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cn\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b2057662a5102fd118da847c6e54ac823bfe6b76443d347d2b60a7c8728d6d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T20:45:33Z\\\",\\\"message\\\":\\\"nat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.176:1936: 10.217.4.176:443: 10.217.4.176:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {595f6e90-7cd8-4871-85ab-9519d3c9c3e5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1008 20:45:33.198483 6348 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI1008 
20:45:33.198486 6348 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-hw2kf\\\\nF1008 20:45:33.198495 6348 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet val\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\
"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpzdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:33Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.029031 4669 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.029070 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.029081 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.029097 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.029109 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:34Z","lastTransitionTime":"2025-10-08T20:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.092604 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:45:34 crc kubenswrapper[4669]: E1008 20:45:34.092946 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:46:06.092898279 +0000 UTC m=+85.785709072 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.131334 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.131394 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.131406 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.131427 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.131439 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:34Z","lastTransitionTime":"2025-10-08T20:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.194363 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.194451 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.194498 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.194568 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:45:34 crc kubenswrapper[4669]: E1008 20:45:34.194580 4669 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 20:45:34 crc kubenswrapper[4669]: E1008 20:45:34.194676 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 20:46:06.19465178 +0000 UTC m=+85.887462463 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 20:45:34 crc kubenswrapper[4669]: E1008 20:45:34.194675 4669 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 20:45:34 crc kubenswrapper[4669]: E1008 20:45:34.194781 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 20:46:06.194755103 +0000 UTC m=+85.887565816 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 20:45:34 crc kubenswrapper[4669]: E1008 20:45:34.194842 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 20:45:34 crc kubenswrapper[4669]: E1008 20:45:34.194881 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 20:45:34 crc kubenswrapper[4669]: E1008 20:45:34.194916 4669 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 20:45:34 crc kubenswrapper[4669]: E1008 20:45:34.194851 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 20:45:34 crc kubenswrapper[4669]: E1008 20:45:34.194975 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 20:45:34 crc kubenswrapper[4669]: E1008 20:45:34.194991 4669 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 20:45:34 crc kubenswrapper[4669]: E1008 20:45:34.195042 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-08 20:46:06.1950084 +0000 UTC m=+85.887819113 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 20:45:34 crc kubenswrapper[4669]: E1008 20:45:34.195068 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-08 20:46:06.195058982 +0000 UTC m=+85.887869665 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.234411 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.234466 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.234483 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.234505 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.234521 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:34Z","lastTransitionTime":"2025-10-08T20:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.330294 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.330442 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.330646 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:45:34 crc kubenswrapper[4669]: E1008 20:45:34.330669 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:45:34 crc kubenswrapper[4669]: E1008 20:45:34.330816 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:45:34 crc kubenswrapper[4669]: E1008 20:45:34.330916 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.337434 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.337505 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.337583 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.337620 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.337643 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:34Z","lastTransitionTime":"2025-10-08T20:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.441231 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.441315 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.441328 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.441350 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.441363 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:34Z","lastTransitionTime":"2025-10-08T20:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.544812 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.544862 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.544875 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.544898 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.544914 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:34Z","lastTransitionTime":"2025-10-08T20:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.647694 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.647739 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.647753 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.647772 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.647784 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:34Z","lastTransitionTime":"2025-10-08T20:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.679334 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gpzdw_cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7/ovnkube-controller/2.log" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.682903 4669 scope.go:117] "RemoveContainer" containerID="6b2057662a5102fd118da847c6e54ac823bfe6b76443d347d2b60a7c8728d6d7" Oct 08 20:45:34 crc kubenswrapper[4669]: E1008 20:45:34.683094 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-gpzdw_openshift-ovn-kubernetes(cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.694782 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-flswm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"609156f9-39b1-4330-83a2-eabf82f4228f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749545f8f4b6269a70b747fee79dc8d419b62054f507b0d819b63aa68c44bb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-flswm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:34Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.710916 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b822af4b-b157-4b05-9af4-7798315f365f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d615b49ade5de43393d40344c1b71733acedb541841b3ec34d6dd293e62f96c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07c33fe9c40fb9b53e940940c3fe2b8b63a94b0f867aa804d215cb3ba90d01c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569ba2e70b947eea1e531ab7e8f1ac2e3441ade593dd48910407df766217d87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f2d8af11793121a84b4559833f410bd59a8bb122d88da0d3b55d7dcbbf57a9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:34Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.729213 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:34Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.749841 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.749950 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.749966 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.749988 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.750001 4669 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:34Z","lastTransitionTime":"2025-10-08T20:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.752757 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bfcvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09be2dce24bba0d88a36f2d85e6280e6806f9b6cf59ec3950513e976c97429e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d12ef6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d12ef6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bfcvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:34Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.766750 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-klx9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2433400c-98f8-490f-a566-00a330a738fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863b0630ebde7534e93ebf2952dab729566760278539e87efa4412389803c5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-klx9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:34Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.789336 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0319f7-8ee3-4392-a36a-419161391db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7f52f3d22574d0a01cdfd7b7a40caf1a6cf201dc719e35f40eae85a071286f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3064f5dde5317ed6c1dba4ecdcf4da81c2451262d83e3e2826c6ebbfe1487ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13697e6470d481451982948653db44d08baa70466d010442534eaa249e58bae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c88256a72c667695563af6b37d01d958621c1ca6cbdaf474364bd6c8128c4409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6efb0bccc51deff2303655e7a8d3a6261a8b3c9425f6d94120cd1acf27fd7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:34Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.802874 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:34Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.805639 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.816157 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.821204 4669 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c2b44fd8fb3c01bbc8a1b2f5a3507af28b2aa79a3d6ab8e7de3945bbfd01e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a472679f03ab86aa0a31a2ff3affe48d8e289a76db949bcc6ea10446fd08fdd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:34Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.836831 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bl6pv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ac60c10-afa3-424e-9aa2-060e32f4a40f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91277ee733ac8aee89b1a7716b6dcebf57e7d24e5cab5615d88ac8ff90f6f5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbw65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e8bd9bc559c9623c06aa2f0324a6679f5d24
1e881db918904f3e1e97d56a20f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbw65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bl6pv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:34Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.849598 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ml9vv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f90eed21-8bc2-4723-b6be-a672669a36fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bh59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bh59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ml9vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:34Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:34 crc 
kubenswrapper[4669]: I1008 20:45:34.852108 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.852160 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.852171 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.852189 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.852202 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:34Z","lastTransitionTime":"2025-10-08T20:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.867780 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:34Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.880027 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c397b74921593a42fb7626e545778d80c506f0bbce7bc425b75c77a222c770e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T20:45:34Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.893355 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zcf2d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a016bee1-2c29-46bb-b3b8-841c4a65e162\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fa5b9befc8fb3a83cb6dd6097014bfe9fd0b905b4bf8fbdcccd4fdfb62ab410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-flsl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zcf2d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:34Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.907711 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c9bcf2-9580-4534-8c7e-886bd4aff469\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d8b81cfea1e9e0c9b30427e8b8cb07
b463c6ef45afb8379aa006d71bccd82a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bd09b1fcc78173d03292522a284e68e59f374def13fd6830f24a31e1138c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
5-10-08T20:45:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw2kf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:34Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.921192 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d080d327-7e4d-41af-aa15-0ce849523815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8127834da98ef46a594a74cbfcc6ef779b8429046327546560b7b37085572c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f61c1793e6c95085b6964298f29b5f896451784046a6aee1c73bbda234a3bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd3bc937fc2e56c3d332e4d3822a2749d040c57cd94f6e3bcdcfd83c126bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]
},{\\\"containerID\\\":\\\"cri-o://33aff5ef2ae82f810d3b3e66effb80087fa92081419227e4fb66a6aa80468ff7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"W1008 20:44:56.871605 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 20:44:56.872089 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759956296 cert, and key in /tmp/serving-cert-3424828285/serving-signer.crt, /tmp/serving-cert-3424828285/serving-signer.key\\\\nI1008 20:44:57.365674 1 observer_polling.go:159] Starting file observer\\\\nW1008 20:45:02.381062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1008 20:45:02.381192 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 20:45:02.381876 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3424828285/tls.crt::/tmp/serving-cert-3424828285/tls.key\\\\\\\"\\\\nI1008 20:45:02.718633 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 20:45:02.726325 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 20:45:02.726358 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 20:45:02.726380 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 
20:45:02.726384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 20:45:02.731456 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 20:45:02.731985 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 20:45:02.731867 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 20:45:02.733228 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d213380e32b3db218facfef313963d26689d2f0871d2a004a63380454fac8a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4
ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:34Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.941403 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c0e064d976a7c307fd13ec11ae76672cc1225b71a616f171626ee1f9a24531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:34Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.954340 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.954379 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.954391 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.954404 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.954414 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:34Z","lastTransitionTime":"2025-10-08T20:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:34 crc kubenswrapper[4669]: I1008 20:45:34.982313 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://408dd840918000b1689c3d828a51173deebf8d00fc97450975b35e5149d3cfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://334a09deac921308c4d6053bdcc2bbc096acc8ec24875190efb1c07b22d01c69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c03e0c827468d80fa326ee46ee88ad6adfe4236f4df9843324d2b247d0716087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92bc23ad705dcc8b8524159bc37254ce2306e7b502b914eaac7a6525fdd44f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed9a574189bcc7f84b93c5e821e944b0f94679084a30270d6634c7d19e67c470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2057662a5102fd118da847c6e54ac823bfe6b76443d347d2b60a7c8728d6d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b2057662a5102fd118da847c6e54ac823bfe6b76443d347d2b60a7c8728d6d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T20:45:33Z\\\",\\\"message\\\":\\\"nat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.176:1936: 10.217.4.176:443: 10.217.4.176:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {595f6e90-7cd8-4871-85ab-9519d3c9c3e5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1008 20:45:33.198483 6348 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI1008 20:45:33.198486 6348 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-hw2kf\\\\nF1008 20:45:33.198495 6348 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet val\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gpzdw_openshift-ovn-kubernetes(cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40a
c3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpzdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:34Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.001516 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-klx9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2433400c-98f8-490f-a566-00a330a738fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863b0630ebde7534e93ebf2952dab729566760278539e87efa4412389803c5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-klx9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:34Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.020866 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0319f7-8ee3-4392-a36a-419161391db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7f52f3d22574d0a01cdfd7b7a40caf1a6cf201dc719e35f40eae85a071286f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3064f5dde5317ed6c1dba4ecdcf4da81c2451262d83e3e2826c6ebbfe1487ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13697e6470d481451982948653db44d08baa70466d010442534eaa249e58bae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c88256a72c667695563af6b37d01d958621c1ca6cbdaf474364bd6c8128c4409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6efb0bccc51deff2303655e7a8d3a6261a8b3c9425f6d94120cd1acf27fd7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:35Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.033601 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:35Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.044367 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c2b44fd8fb3c01bbc8a1b2f5a3507af28b2aa79a3d6ab8e7de3945bbfd01e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a472679f03ab86aa0a31a2ff3affe48d8e289a76db949bcc6ea10446fd08fdd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:35Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.056207 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.056244 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.056262 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.056279 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.056290 4669 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:35Z","lastTransitionTime":"2025-10-08T20:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.058400 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bl6pv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ac60c10-afa3-424e-9aa2-060e32f4a40f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91277ee733ac8aee89b1a7716b6dcebf57e7d24e5cab5615d88ac8ff90f6f5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbw65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e8bd9bc559c9623c06aa2f0324a6679f5d241e881db918904f3e1e97d56a20f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbw65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bl6pv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:35Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.073008 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ml9vv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f90eed21-8bc2-4723-b6be-a672669a36fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bh59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bh59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ml9vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:35Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:35 crc 
kubenswrapper[4669]: I1008 20:45:35.089836 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:35Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.101771 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c397b74921593a42fb7626e545778d80c506f0bbce7bc425b75c77a222c770e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T20:45:35Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.112224 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zcf2d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a016bee1-2c29-46bb-b3b8-841c4a65e162\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fa5b9befc8fb3a83cb6dd6097014bfe9fd0b905b4bf8fbdcccd4fdfb62ab410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-flsl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zcf2d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:35Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.122579 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c9bcf2-9580-4534-8c7e-886bd4aff469\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d8b81cfea1e9e0c9b30427e8b8cb07
b463c6ef45afb8379aa006d71bccd82a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bd09b1fcc78173d03292522a284e68e59f374def13fd6830f24a31e1138c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
5-10-08T20:45:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw2kf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:35Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.135745 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d080d327-7e4d-41af-aa15-0ce849523815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8127834da98ef46a594a74cbfcc6ef779b8429046327546560b7b37085572c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f61c1793e6c95085b6964298f29b5f896451784046a6aee1c73bbda234a3bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd3bc937fc2e56c3d332e4d3822a2749d040c57cd94f6e3bcdcfd83c126bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]
},{\\\"containerID\\\":\\\"cri-o://33aff5ef2ae82f810d3b3e66effb80087fa92081419227e4fb66a6aa80468ff7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"W1008 20:44:56.871605 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 20:44:56.872089 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759956296 cert, and key in /tmp/serving-cert-3424828285/serving-signer.crt, /tmp/serving-cert-3424828285/serving-signer.key\\\\nI1008 20:44:57.365674 1 observer_polling.go:159] Starting file observer\\\\nW1008 20:45:02.381062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1008 20:45:02.381192 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 20:45:02.381876 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3424828285/tls.crt::/tmp/serving-cert-3424828285/tls.key\\\\\\\"\\\\nI1008 20:45:02.718633 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 20:45:02.726325 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 20:45:02.726358 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 20:45:02.726380 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 
20:45:02.726384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 20:45:02.731456 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 20:45:02.731985 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 20:45:02.731867 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 20:45:02.733228 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d213380e32b3db218facfef313963d26689d2f0871d2a004a63380454fac8a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4
ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:35Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.149036 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c0e064d976a7c307fd13ec11ae76672cc1225b71a616f171626ee1f9a24531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:35Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.158691 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.158722 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.158730 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.158745 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.158758 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:35Z","lastTransitionTime":"2025-10-08T20:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.165690 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://408dd840918000b1689c3d828a51173deebf8d00fc97450975b35e5149d3cfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://334a09deac921308c4d6053bdcc2bbc096acc8ec24875190efb1c07b22d01c69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c03e0c827468d80fa326ee46ee88ad6adfe4236f4df9843324d2b247d0716087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92bc23ad705dcc8b8524159bc37254ce2306e7b502b914eaac7a6525fdd44f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed9a574189bcc7f84b93c5e821e944b0f94679084a30270d6634c7d19e67c470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2057662a5102fd118da847c6e54ac823bfe6b76443d347d2b60a7c8728d6d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b2057662a5102fd118da847c6e54ac823bfe6b76443d347d2b60a7c8728d6d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T20:45:33Z\\\",\\\"message\\\":\\\"nat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.176:1936: 10.217.4.176:443: 10.217.4.176:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {595f6e90-7cd8-4871-85ab-9519d3c9c3e5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1008 20:45:33.198483 6348 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI1008 20:45:33.198486 6348 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-hw2kf\\\\nF1008 20:45:33.198495 6348 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet val\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gpzdw_openshift-ovn-kubernetes(cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40a
c3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpzdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:35Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.177578 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-flswm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"609156f9-39b1-4330-83a2-eabf82f4228f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749545f8f4b6269a70b747fee79dc8d419b62054f507b0d819b63aa68c44bb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-flswm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:35Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.192193 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b822af4b-b157-4b05-9af4-7798315f365f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d615b49ade5de43393d40344c1b71733acedb541841b3ec34d6dd293e62f96c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07c33fe9c40fb9b53e940940c3fe2b8b63a94b0f867aa804d215cb3ba90d01c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569ba2e70b947eea1e531ab7e8f1ac2e3441ade593dd48910407df766217d87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f2d8af11793121a84b4559833f410bd59a8bb122d88da0d3b55d7dcbbf57a9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:35Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.203107 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc1b4218-68c0-4c48-a495-f8539e06d444\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://524972a79ac73180ccc655f37054721fb478bf263c711e814c9b49cc4f1a76ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67e4a88a6084b96798f461c3427f491577d74e6da859263a8c59545395cf029a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3133419a0643dc2be6a13a30c87b23a13965c59841d991db7ec80d5e53ca2840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8abb50fc1491bed6db7f23e79900b0223d3741a87a9a5545c144252a077353b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://8abb50fc1491bed6db7f23e79900b0223d3741a87a9a5545c144252a077353b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:35Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.218743 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:35Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.241741 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bfcvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09be2dce24bba0d88a36f2d85e6280e6806f9b6cf59ec3950513e976c97429e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d12e
f6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d12ef6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bfcvh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:35Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.260662 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.260696 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.260706 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.260722 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.260732 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:35Z","lastTransitionTime":"2025-10-08T20:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.330267 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:45:35 crc kubenswrapper[4669]: E1008 20:45:35.330502 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ml9vv" podUID="f90eed21-8bc2-4723-b6be-a672669a36fb" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.362969 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.362998 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.363006 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.363020 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.363032 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:35Z","lastTransitionTime":"2025-10-08T20:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.466162 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.466207 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.466219 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.466237 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.466249 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:35Z","lastTransitionTime":"2025-10-08T20:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.568103 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.568163 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.568178 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.568199 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.568215 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:35Z","lastTransitionTime":"2025-10-08T20:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.671769 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.672086 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.672124 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.672153 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.672174 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:35Z","lastTransitionTime":"2025-10-08T20:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.776155 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.776231 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.776251 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.776277 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.776295 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:35Z","lastTransitionTime":"2025-10-08T20:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.901451 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.901500 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.901513 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.901558 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:35 crc kubenswrapper[4669]: I1008 20:45:35.901575 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:35Z","lastTransitionTime":"2025-10-08T20:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:36 crc kubenswrapper[4669]: I1008 20:45:36.004389 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:36 crc kubenswrapper[4669]: I1008 20:45:36.004423 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:36 crc kubenswrapper[4669]: I1008 20:45:36.004431 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:36 crc kubenswrapper[4669]: I1008 20:45:36.004448 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:36 crc kubenswrapper[4669]: I1008 20:45:36.004457 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:36Z","lastTransitionTime":"2025-10-08T20:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:36 crc kubenswrapper[4669]: I1008 20:45:36.111451 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:36 crc kubenswrapper[4669]: I1008 20:45:36.111854 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:36 crc kubenswrapper[4669]: I1008 20:45:36.111999 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:36 crc kubenswrapper[4669]: I1008 20:45:36.112161 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:36 crc kubenswrapper[4669]: I1008 20:45:36.112285 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:36Z","lastTransitionTime":"2025-10-08T20:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:36 crc kubenswrapper[4669]: I1008 20:45:36.215586 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:36 crc kubenswrapper[4669]: I1008 20:45:36.215659 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:36 crc kubenswrapper[4669]: I1008 20:45:36.215692 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:36 crc kubenswrapper[4669]: I1008 20:45:36.215728 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:36 crc kubenswrapper[4669]: I1008 20:45:36.215749 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:36Z","lastTransitionTime":"2025-10-08T20:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:36 crc kubenswrapper[4669]: I1008 20:45:36.317946 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:36 crc kubenswrapper[4669]: I1008 20:45:36.317976 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:36 crc kubenswrapper[4669]: I1008 20:45:36.317985 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:36 crc kubenswrapper[4669]: I1008 20:45:36.317998 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:36 crc kubenswrapper[4669]: I1008 20:45:36.318008 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:36Z","lastTransitionTime":"2025-10-08T20:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:36 crc kubenswrapper[4669]: I1008 20:45:36.330734 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:45:36 crc kubenswrapper[4669]: I1008 20:45:36.330744 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:45:36 crc kubenswrapper[4669]: I1008 20:45:36.330800 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:45:36 crc kubenswrapper[4669]: E1008 20:45:36.331089 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:45:36 crc kubenswrapper[4669]: E1008 20:45:36.330935 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:45:36 crc kubenswrapper[4669]: E1008 20:45:36.331214 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:45:36 crc kubenswrapper[4669]: I1008 20:45:36.420362 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:36 crc kubenswrapper[4669]: I1008 20:45:36.420419 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:36 crc kubenswrapper[4669]: I1008 20:45:36.420436 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:36 crc kubenswrapper[4669]: I1008 20:45:36.420458 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:36 crc kubenswrapper[4669]: I1008 20:45:36.420474 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:36Z","lastTransitionTime":"2025-10-08T20:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:36 crc kubenswrapper[4669]: I1008 20:45:36.523043 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:36 crc kubenswrapper[4669]: I1008 20:45:36.523088 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:36 crc kubenswrapper[4669]: I1008 20:45:36.523100 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:36 crc kubenswrapper[4669]: I1008 20:45:36.523116 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:36 crc kubenswrapper[4669]: I1008 20:45:36.523129 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:36Z","lastTransitionTime":"2025-10-08T20:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:36 crc kubenswrapper[4669]: I1008 20:45:36.626237 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:36 crc kubenswrapper[4669]: I1008 20:45:36.626281 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:36 crc kubenswrapper[4669]: I1008 20:45:36.626298 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:36 crc kubenswrapper[4669]: I1008 20:45:36.626321 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:36 crc kubenswrapper[4669]: I1008 20:45:36.626338 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:36Z","lastTransitionTime":"2025-10-08T20:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:36 crc kubenswrapper[4669]: I1008 20:45:36.728847 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:36 crc kubenswrapper[4669]: I1008 20:45:36.728872 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:36 crc kubenswrapper[4669]: I1008 20:45:36.728880 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:36 crc kubenswrapper[4669]: I1008 20:45:36.728892 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:36 crc kubenswrapper[4669]: I1008 20:45:36.728902 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:36Z","lastTransitionTime":"2025-10-08T20:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:36 crc kubenswrapper[4669]: I1008 20:45:36.831432 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:36 crc kubenswrapper[4669]: I1008 20:45:36.831492 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:36 crc kubenswrapper[4669]: I1008 20:45:36.831514 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:36 crc kubenswrapper[4669]: I1008 20:45:36.831582 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:36 crc kubenswrapper[4669]: I1008 20:45:36.831606 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:36Z","lastTransitionTime":"2025-10-08T20:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:36 crc kubenswrapper[4669]: I1008 20:45:36.934217 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:36 crc kubenswrapper[4669]: I1008 20:45:36.934270 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:36 crc kubenswrapper[4669]: I1008 20:45:36.934286 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:36 crc kubenswrapper[4669]: I1008 20:45:36.934310 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:36 crc kubenswrapper[4669]: I1008 20:45:36.934327 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:36Z","lastTransitionTime":"2025-10-08T20:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:37 crc kubenswrapper[4669]: I1008 20:45:37.037571 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:37 crc kubenswrapper[4669]: I1008 20:45:37.037641 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:37 crc kubenswrapper[4669]: I1008 20:45:37.037665 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:37 crc kubenswrapper[4669]: I1008 20:45:37.037694 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:37 crc kubenswrapper[4669]: I1008 20:45:37.037716 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:37Z","lastTransitionTime":"2025-10-08T20:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:37 crc kubenswrapper[4669]: I1008 20:45:37.140319 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:37 crc kubenswrapper[4669]: I1008 20:45:37.140368 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:37 crc kubenswrapper[4669]: I1008 20:45:37.140380 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:37 crc kubenswrapper[4669]: I1008 20:45:37.140396 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:37 crc kubenswrapper[4669]: I1008 20:45:37.140410 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:37Z","lastTransitionTime":"2025-10-08T20:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:37 crc kubenswrapper[4669]: I1008 20:45:37.243242 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:37 crc kubenswrapper[4669]: I1008 20:45:37.243294 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:37 crc kubenswrapper[4669]: I1008 20:45:37.243309 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:37 crc kubenswrapper[4669]: I1008 20:45:37.243332 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:37 crc kubenswrapper[4669]: I1008 20:45:37.243345 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:37Z","lastTransitionTime":"2025-10-08T20:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:37 crc kubenswrapper[4669]: I1008 20:45:37.330290 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:45:37 crc kubenswrapper[4669]: E1008 20:45:37.330563 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ml9vv" podUID="f90eed21-8bc2-4723-b6be-a672669a36fb" Oct 08 20:45:37 crc kubenswrapper[4669]: I1008 20:45:37.346043 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:37 crc kubenswrapper[4669]: I1008 20:45:37.346113 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:37 crc kubenswrapper[4669]: I1008 20:45:37.346137 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:37 crc kubenswrapper[4669]: I1008 20:45:37.346166 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:37 crc kubenswrapper[4669]: I1008 20:45:37.346190 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:37Z","lastTransitionTime":"2025-10-08T20:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:37 crc kubenswrapper[4669]: I1008 20:45:37.449153 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:37 crc kubenswrapper[4669]: I1008 20:45:37.449195 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:37 crc kubenswrapper[4669]: I1008 20:45:37.449207 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:37 crc kubenswrapper[4669]: I1008 20:45:37.449223 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:37 crc kubenswrapper[4669]: I1008 20:45:37.449233 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:37Z","lastTransitionTime":"2025-10-08T20:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:37 crc kubenswrapper[4669]: I1008 20:45:37.551728 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:37 crc kubenswrapper[4669]: I1008 20:45:37.551762 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:37 crc kubenswrapper[4669]: I1008 20:45:37.551770 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:37 crc kubenswrapper[4669]: I1008 20:45:37.551781 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:37 crc kubenswrapper[4669]: I1008 20:45:37.551790 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:37Z","lastTransitionTime":"2025-10-08T20:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:37 crc kubenswrapper[4669]: I1008 20:45:37.655259 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:37 crc kubenswrapper[4669]: I1008 20:45:37.655307 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:37 crc kubenswrapper[4669]: I1008 20:45:37.655318 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:37 crc kubenswrapper[4669]: I1008 20:45:37.655331 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:37 crc kubenswrapper[4669]: I1008 20:45:37.655339 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:37Z","lastTransitionTime":"2025-10-08T20:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:37 crc kubenswrapper[4669]: I1008 20:45:37.757944 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:37 crc kubenswrapper[4669]: I1008 20:45:37.757985 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:37 crc kubenswrapper[4669]: I1008 20:45:37.757996 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:37 crc kubenswrapper[4669]: I1008 20:45:37.758011 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:37 crc kubenswrapper[4669]: I1008 20:45:37.758022 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:37Z","lastTransitionTime":"2025-10-08T20:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:37 crc kubenswrapper[4669]: I1008 20:45:37.860271 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:37 crc kubenswrapper[4669]: I1008 20:45:37.860341 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:37 crc kubenswrapper[4669]: I1008 20:45:37.860366 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:37 crc kubenswrapper[4669]: I1008 20:45:37.860398 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:37 crc kubenswrapper[4669]: I1008 20:45:37.860421 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:37Z","lastTransitionTime":"2025-10-08T20:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:37 crc kubenswrapper[4669]: I1008 20:45:37.962789 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:37 crc kubenswrapper[4669]: I1008 20:45:37.962857 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:37 crc kubenswrapper[4669]: I1008 20:45:37.962873 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:37 crc kubenswrapper[4669]: I1008 20:45:37.962893 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:37 crc kubenswrapper[4669]: I1008 20:45:37.962908 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:37Z","lastTransitionTime":"2025-10-08T20:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:38 crc kubenswrapper[4669]: I1008 20:45:38.066142 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:38 crc kubenswrapper[4669]: I1008 20:45:38.066206 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:38 crc kubenswrapper[4669]: I1008 20:45:38.066229 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:38 crc kubenswrapper[4669]: I1008 20:45:38.066261 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:38 crc kubenswrapper[4669]: I1008 20:45:38.066282 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:38Z","lastTransitionTime":"2025-10-08T20:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:38 crc kubenswrapper[4669]: I1008 20:45:38.168495 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:38 crc kubenswrapper[4669]: I1008 20:45:38.168553 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:38 crc kubenswrapper[4669]: I1008 20:45:38.168568 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:38 crc kubenswrapper[4669]: I1008 20:45:38.168587 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:38 crc kubenswrapper[4669]: I1008 20:45:38.168600 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:38Z","lastTransitionTime":"2025-10-08T20:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:38 crc kubenswrapper[4669]: I1008 20:45:38.270926 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:38 crc kubenswrapper[4669]: I1008 20:45:38.271623 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:38 crc kubenswrapper[4669]: I1008 20:45:38.271644 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:38 crc kubenswrapper[4669]: I1008 20:45:38.271665 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:38 crc kubenswrapper[4669]: I1008 20:45:38.271680 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:38Z","lastTransitionTime":"2025-10-08T20:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:38 crc kubenswrapper[4669]: I1008 20:45:38.329782 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:45:38 crc kubenswrapper[4669]: E1008 20:45:38.329927 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:45:38 crc kubenswrapper[4669]: I1008 20:45:38.329950 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:45:38 crc kubenswrapper[4669]: I1008 20:45:38.329992 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:45:38 crc kubenswrapper[4669]: E1008 20:45:38.330049 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:45:38 crc kubenswrapper[4669]: E1008 20:45:38.330138 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:45:38 crc kubenswrapper[4669]: I1008 20:45:38.374401 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:38 crc kubenswrapper[4669]: I1008 20:45:38.374437 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:38 crc kubenswrapper[4669]: I1008 20:45:38.374447 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:38 crc kubenswrapper[4669]: I1008 20:45:38.374461 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:38 crc kubenswrapper[4669]: I1008 20:45:38.374498 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:38Z","lastTransitionTime":"2025-10-08T20:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:38 crc kubenswrapper[4669]: I1008 20:45:38.477076 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:38 crc kubenswrapper[4669]: I1008 20:45:38.477126 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:38 crc kubenswrapper[4669]: I1008 20:45:38.477135 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:38 crc kubenswrapper[4669]: I1008 20:45:38.477150 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:38 crc kubenswrapper[4669]: I1008 20:45:38.477159 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:38Z","lastTransitionTime":"2025-10-08T20:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:38 crc kubenswrapper[4669]: I1008 20:45:38.579820 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:38 crc kubenswrapper[4669]: I1008 20:45:38.580076 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:38 crc kubenswrapper[4669]: I1008 20:45:38.580139 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:38 crc kubenswrapper[4669]: I1008 20:45:38.580219 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:38 crc kubenswrapper[4669]: I1008 20:45:38.580300 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:38Z","lastTransitionTime":"2025-10-08T20:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:38 crc kubenswrapper[4669]: I1008 20:45:38.683459 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:38 crc kubenswrapper[4669]: I1008 20:45:38.683490 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:38 crc kubenswrapper[4669]: I1008 20:45:38.683499 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:38 crc kubenswrapper[4669]: I1008 20:45:38.683513 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:38 crc kubenswrapper[4669]: I1008 20:45:38.683522 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:38Z","lastTransitionTime":"2025-10-08T20:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:38 crc kubenswrapper[4669]: I1008 20:45:38.785974 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:38 crc kubenswrapper[4669]: I1008 20:45:38.786021 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:38 crc kubenswrapper[4669]: I1008 20:45:38.786038 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:38 crc kubenswrapper[4669]: I1008 20:45:38.786060 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:38 crc kubenswrapper[4669]: I1008 20:45:38.786073 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:38Z","lastTransitionTime":"2025-10-08T20:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:38 crc kubenswrapper[4669]: I1008 20:45:38.888026 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:38 crc kubenswrapper[4669]: I1008 20:45:38.888060 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:38 crc kubenswrapper[4669]: I1008 20:45:38.888068 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:38 crc kubenswrapper[4669]: I1008 20:45:38.888082 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:38 crc kubenswrapper[4669]: I1008 20:45:38.888091 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:38Z","lastTransitionTime":"2025-10-08T20:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:38 crc kubenswrapper[4669]: I1008 20:45:38.991303 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:38 crc kubenswrapper[4669]: I1008 20:45:38.991874 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:38 crc kubenswrapper[4669]: I1008 20:45:38.992052 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:38 crc kubenswrapper[4669]: I1008 20:45:38.992201 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:38 crc kubenswrapper[4669]: I1008 20:45:38.992339 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:38Z","lastTransitionTime":"2025-10-08T20:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.094960 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.095246 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.095331 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.095438 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.095549 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:39Z","lastTransitionTime":"2025-10-08T20:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.198341 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.198406 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.198430 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.198685 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.198714 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:39Z","lastTransitionTime":"2025-10-08T20:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.301618 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.301681 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.301698 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.301724 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.301740 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:39Z","lastTransitionTime":"2025-10-08T20:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.330401 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:45:39 crc kubenswrapper[4669]: E1008 20:45:39.330654 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ml9vv" podUID="f90eed21-8bc2-4723-b6be-a672669a36fb" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.404826 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.404901 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.404939 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.404973 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.404996 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:39Z","lastTransitionTime":"2025-10-08T20:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.508584 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.508639 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.508657 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.508680 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.508698 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:39Z","lastTransitionTime":"2025-10-08T20:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.612162 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.612210 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.612220 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.612235 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.612246 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:39Z","lastTransitionTime":"2025-10-08T20:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.714869 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.714918 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.714929 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.714952 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.714962 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:39Z","lastTransitionTime":"2025-10-08T20:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.781779 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.782135 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.782282 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.782395 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.782480 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:39Z","lastTransitionTime":"2025-10-08T20:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:39 crc kubenswrapper[4669]: E1008 20:45:39.796038 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf950064-edbb-4bec-8a75-ab8d963fcdb3\\\",\\\"systemUUID\\\":\\\"527fa759-e25f-4fb3-8304-f30dbff0c847\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:39Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.800138 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.800283 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.800350 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.800424 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.800486 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:39Z","lastTransitionTime":"2025-10-08T20:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:39 crc kubenswrapper[4669]: E1008 20:45:39.811632 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf950064-edbb-4bec-8a75-ab8d963fcdb3\\\",\\\"systemUUID\\\":\\\"527fa759-e25f-4fb3-8304-f30dbff0c847\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:39Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.815138 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.815162 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.815170 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.815182 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.815192 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:39Z","lastTransitionTime":"2025-10-08T20:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:39 crc kubenswrapper[4669]: E1008 20:45:39.827824 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf950064-edbb-4bec-8a75-ab8d963fcdb3\\\",\\\"systemUUID\\\":\\\"527fa759-e25f-4fb3-8304-f30dbff0c847\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:39Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.831328 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.831481 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.831608 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.831708 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.831785 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:39Z","lastTransitionTime":"2025-10-08T20:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:39 crc kubenswrapper[4669]: E1008 20:45:39.843584 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf950064-edbb-4bec-8a75-ab8d963fcdb3\\\",\\\"systemUUID\\\":\\\"527fa759-e25f-4fb3-8304-f30dbff0c847\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:39Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.846887 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.847032 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.847120 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.847227 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.847364 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:39Z","lastTransitionTime":"2025-10-08T20:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:39 crc kubenswrapper[4669]: E1008 20:45:39.858481 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf950064-edbb-4bec-8a75-ab8d963fcdb3\\\",\\\"systemUUID\\\":\\\"527fa759-e25f-4fb3-8304-f30dbff0c847\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:39Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:39 crc kubenswrapper[4669]: E1008 20:45:39.858619 4669 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.859967 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.860092 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.860165 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.860232 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:39 crc kubenswrapper[4669]: I1008 20:45:39.860293 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:39Z","lastTransitionTime":"2025-10-08T20:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:40 crc kubenswrapper[4669]: I1008 20:45:40.272084 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:40 crc kubenswrapper[4669]: I1008 20:45:40.272141 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:40 crc kubenswrapper[4669]: I1008 20:45:40.272155 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:40 crc kubenswrapper[4669]: I1008 20:45:40.272173 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:40 crc kubenswrapper[4669]: I1008 20:45:40.272188 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:40Z","lastTransitionTime":"2025-10-08T20:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:40 crc kubenswrapper[4669]: I1008 20:45:40.330070 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:45:40 crc kubenswrapper[4669]: I1008 20:45:40.330144 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:45:40 crc kubenswrapper[4669]: I1008 20:45:40.330148 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:45:40 crc kubenswrapper[4669]: E1008 20:45:40.330255 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:45:40 crc kubenswrapper[4669]: E1008 20:45:40.330403 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:45:40 crc kubenswrapper[4669]: E1008 20:45:40.330490 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:45:40 crc kubenswrapper[4669]: I1008 20:45:40.375066 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:40 crc kubenswrapper[4669]: I1008 20:45:40.375114 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:40 crc kubenswrapper[4669]: I1008 20:45:40.375124 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:40 crc kubenswrapper[4669]: I1008 20:45:40.375138 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:40 crc kubenswrapper[4669]: I1008 20:45:40.375147 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:40Z","lastTransitionTime":"2025-10-08T20:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:41 crc kubenswrapper[4669]: I1008 20:45:41.301636 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:41 crc kubenswrapper[4669]: I1008 20:45:41.301707 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:41 crc kubenswrapper[4669]: I1008 20:45:41.301730 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:41 crc kubenswrapper[4669]: I1008 20:45:41.301757 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:41 crc kubenswrapper[4669]: I1008 20:45:41.301778 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:41Z","lastTransitionTime":"2025-10-08T20:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:41 crc kubenswrapper[4669]: I1008 20:45:41.330467 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:45:41 crc kubenswrapper[4669]: E1008 20:45:41.330623 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ml9vv" podUID="f90eed21-8bc2-4723-b6be-a672669a36fb" Oct 08 20:45:41 crc kubenswrapper[4669]: I1008 20:45:41.347518 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ml9vv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f90eed21-8bc2-4723-b6be-a672669a36fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bh59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bh59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ml9vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:41Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:41 crc 
kubenswrapper[4669]: I1008 20:45:41.366510 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:41Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:41 crc kubenswrapper[4669]: I1008 20:45:41.382764 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c397b74921593a42fb7626e545778d80c506f0bbce7bc425b75c77a222c770e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T20:45:41Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:41 crc kubenswrapper[4669]: I1008 20:45:41.397398 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zcf2d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a016bee1-2c29-46bb-b3b8-841c4a65e162\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fa5b9befc8fb3a83cb6dd6097014bfe9fd0b905b4bf8fbdcccd4fdfb62ab410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-flsl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zcf2d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:41Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:41 crc kubenswrapper[4669]: I1008 20:45:41.404752 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:41 crc kubenswrapper[4669]: I1008 20:45:41.404806 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:41 crc kubenswrapper[4669]: I1008 20:45:41.404821 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:41 crc kubenswrapper[4669]: I1008 20:45:41.404838 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:41 crc kubenswrapper[4669]: I1008 20:45:41.404851 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:41Z","lastTransitionTime":"2025-10-08T20:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:41 crc kubenswrapper[4669]: I1008 20:45:41.413310 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c9bcf2-9580-4534-8c7e-886bd4aff469\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d8b81cfea1e9e0c9b30427e8b8cb07b463c6ef45afb8379aa006d71bccd82a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bd09b1fcc78173d03292522a284e68e59f374def13fd6830f24a31e1138c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw2kf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:41Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:41 crc kubenswrapper[4669]: I1008 20:45:41.430962 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bl6pv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ac60c10-afa3-424e-9aa2-060e32f4a40f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91277ee733ac8aee89b1a7716b6dcebf57e7d24e5cab5615d88ac8ff90f6f5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbw65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e8bd9bc559c9623c06aa2f0324a6679f5d24
1e881db918904f3e1e97d56a20f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbw65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bl6pv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:41Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:41 crc kubenswrapper[4669]: I1008 20:45:41.449783 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d080d327-7e4d-41af-aa15-0ce849523815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8127834da98ef46a594a74cbfcc6ef779b8429046327546560b7b37085572c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f61c1793e6c95085b6964298f29b5f896451784046a6aee1c73bbda234a3bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd3bc937fc2e56c3d332e4d3822a2749d040c57cd94f6e3bcdcfd83c126bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33aff5ef2ae82f810d3b3e66effb80087fa92081419227e4fb66a6aa80468ff7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T20:45:02Z\\\"
,\\\"message\\\":\\\"W1008 20:44:56.871605 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 20:44:56.872089 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759956296 cert, and key in /tmp/serving-cert-3424828285/serving-signer.crt, /tmp/serving-cert-3424828285/serving-signer.key\\\\nI1008 20:44:57.365674 1 observer_polling.go:159] Starting file observer\\\\nW1008 20:45:02.381062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1008 20:45:02.381192 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 20:45:02.381876 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3424828285/tls.crt::/tmp/serving-cert-3424828285/tls.key\\\\\\\"\\\\nI1008 20:45:02.718633 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 20:45:02.726325 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 20:45:02.726358 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 20:45:02.726380 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 20:45:02.726384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 20:45:02.731456 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 20:45:02.731985 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 20:45:02.731867 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 20:45:02.733228 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d213380e32b3db218facfef313963d26689d2f0871d2a004a63380454fac8a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:41Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:41 crc kubenswrapper[4669]: I1008 20:45:41.464283 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c0e064d976a7c307fd13ec11ae76672cc1225b71a616f171626ee1f9a24531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:41Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:41 crc kubenswrapper[4669]: I1008 20:45:41.485780 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://408dd840918000b1689c3d828a51173deebf8d00fc97450975b35e5149d3cfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://334a09deac921308c4d6053bdcc2bbc096acc8ec24875190efb1c07b22d01c69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c03e0c827468d80fa326ee46ee88ad6adfe4236f4df9843324d2b247d0716087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd
47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92bc23ad705dcc8b8524159bc37254ce2306e7b502b914eaac7a6525fdd44f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed9a574189bcc7f84b93c5e821e944b0f94679084a30270d6634c7d19e67c470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2057662a5102fd118da847c6e54ac823bfe6b76443d347d2b60a7c8728d6d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b2057662a5102fd118da847c6e54ac823bfe6b76443d347d2b60a7c8728d6d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T20:45:33Z\\\",\\\"message\\\":\\\"nat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} 
vips:{GoMap:map[10.217.4.176:1936: 10.217.4.176:443: 10.217.4.176:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {595f6e90-7cd8-4871-85ab-9519d3c9c3e5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1008 20:45:33.198483 6348 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI1008 20:45:33.198486 6348 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-hw2kf\\\\nF1008 20:45:33.198495 6348 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet val\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gpzdw_openshift-ovn-kubernetes(cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40a
c3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpzdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:41Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:41 crc kubenswrapper[4669]: I1008 20:45:41.499001 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b822af4b-b157-4b05-9af4-7798315f365f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d615b49ade5de43393d40344c1b71733acedb541841b3ec34d6dd293e62f96c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07c33fe9c40fb9b53e940940c3fe2b8b63a94b0f867aa804d215cb3ba90d01c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569ba2e70b947eea1e531ab7e8f1ac2e3441ade593dd48910407df766217d87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f2d8af11793121a84b4559833f410bd59a8bb122d88da0d3b55d7dcbbf57a9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:41Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:41 crc kubenswrapper[4669]: I1008 20:45:41.506641 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:41 crc kubenswrapper[4669]: I1008 20:45:41.506678 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:41 crc kubenswrapper[4669]: I1008 20:45:41.506690 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:41 crc kubenswrapper[4669]: I1008 20:45:41.506730 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:41 crc kubenswrapper[4669]: I1008 20:45:41.506741 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:41Z","lastTransitionTime":"2025-10-08T20:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:41 crc kubenswrapper[4669]: I1008 20:45:41.512598 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc1b4218-68c0-4c48-a495-f8539e06d444\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://524972a79ac73180ccc655f37054721fb478bf263c711e814c9b49cc4f1a76ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}
,{\\\"containerID\\\":\\\"cri-o://67e4a88a6084b96798f461c3427f491577d74e6da859263a8c59545395cf029a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3133419a0643dc2be6a13a30c87b23a13965c59841d991db7ec80d5e53ca2840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8abb50fc1491bed6db7f23e79900b0223d3741a87a9a5545c144252a077353b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\
",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8abb50fc1491bed6db7f23e79900b0223d3741a87a9a5545c144252a077353b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:41Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:41 crc kubenswrapper[4669]: I1008 20:45:41.524012 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:41Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:41 crc kubenswrapper[4669]: I1008 20:45:41.541250 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bfcvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09be2dce24bba0d88a36f2d85e6280e6806f9b6cf59ec3950513e976c97429e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d12e
f6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d12ef6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bfcvh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:41Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:41 crc kubenswrapper[4669]: I1008 20:45:41.553487 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-flswm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"609156f9-39b1-4330-83a2-eabf82f4228f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749545f8f4b6269a70b747fee79dc8d419b62054f507b0d819b63aa68c44bb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-flswm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:41Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:41 crc kubenswrapper[4669]: I1008 20:45:41.571987 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0319f7-8ee3-4392-a36a-419161391db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7f52f3d22574d0a01cdfd7b7a40caf1a6cf201dc719e35f40eae85a071286f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3064f5dde5317ed6c1dba4ecdcf4da81c2451262d83e3e2826c6ebbfe1487ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13697e6470d481451982948653db44d08baa70466d010442534eaa249e58bae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c88256a72c667695563af6b37d01d958621c1ca6cbdaf474364bd6c8128c4409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6efb0bccc51deff2303655e7a8d3a6261a8b3c9425f6d94120cd1acf27fd7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:41Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:41 crc kubenswrapper[4669]: I1008 20:45:41.584205 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:41Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:41 crc kubenswrapper[4669]: I1008 20:45:41.597355 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c2b44fd8fb3c01bbc8a1b2f5a3507af28b2aa79a3d6ab8e7de3945bbfd01e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a472679f03ab86aa0a31a2ff3affe48d8e289a76db949bcc6ea10446fd08fdd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:41Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:41 crc kubenswrapper[4669]: I1008 20:45:41.608467 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:41 crc kubenswrapper[4669]: I1008 20:45:41.608518 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:41 crc kubenswrapper[4669]: I1008 20:45:41.608557 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:41 crc kubenswrapper[4669]: I1008 20:45:41.608579 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:41 crc kubenswrapper[4669]: I1008 20:45:41.608597 4669 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:41Z","lastTransitionTime":"2025-10-08T20:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:41 crc kubenswrapper[4669]: I1008 20:45:41.609465 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-klx9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2433400c-98f8-490f-a566-00a330a738fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863b0630ebde7534e93ebf2952dab729566760278539e87efa4412389803c5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:
45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-klx9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:41Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:41 crc kubenswrapper[4669]: I1008 20:45:41.710365 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:41 crc kubenswrapper[4669]: I1008 20:45:41.710412 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:41 crc kubenswrapper[4669]: I1008 20:45:41.710422 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:41 crc kubenswrapper[4669]: I1008 20:45:41.710438 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:41 crc kubenswrapper[4669]: I1008 20:45:41.710452 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:41Z","lastTransitionTime":"2025-10-08T20:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:41 crc kubenswrapper[4669]: I1008 20:45:41.814066 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:41 crc kubenswrapper[4669]: I1008 20:45:41.814108 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:41 crc kubenswrapper[4669]: I1008 20:45:41.814117 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:41 crc kubenswrapper[4669]: I1008 20:45:41.814153 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:41 crc kubenswrapper[4669]: I1008 20:45:41.814164 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:41Z","lastTransitionTime":"2025-10-08T20:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:41 crc kubenswrapper[4669]: I1008 20:45:41.916648 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:41 crc kubenswrapper[4669]: I1008 20:45:41.916920 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:41 crc kubenswrapper[4669]: I1008 20:45:41.916992 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:41 crc kubenswrapper[4669]: I1008 20:45:41.917069 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:41 crc kubenswrapper[4669]: I1008 20:45:41.917139 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:41Z","lastTransitionTime":"2025-10-08T20:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:42 crc kubenswrapper[4669]: I1008 20:45:42.020157 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:42 crc kubenswrapper[4669]: I1008 20:45:42.020374 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:42 crc kubenswrapper[4669]: I1008 20:45:42.020504 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:42 crc kubenswrapper[4669]: I1008 20:45:42.020624 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:42 crc kubenswrapper[4669]: I1008 20:45:42.020691 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:42Z","lastTransitionTime":"2025-10-08T20:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:42 crc kubenswrapper[4669]: I1008 20:45:42.124138 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:42 crc kubenswrapper[4669]: I1008 20:45:42.124208 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:42 crc kubenswrapper[4669]: I1008 20:45:42.124235 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:42 crc kubenswrapper[4669]: I1008 20:45:42.124265 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:42 crc kubenswrapper[4669]: I1008 20:45:42.124287 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:42Z","lastTransitionTime":"2025-10-08T20:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:42 crc kubenswrapper[4669]: I1008 20:45:42.227058 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:42 crc kubenswrapper[4669]: I1008 20:45:42.227426 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:42 crc kubenswrapper[4669]: I1008 20:45:42.227600 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:42 crc kubenswrapper[4669]: I1008 20:45:42.227753 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:42 crc kubenswrapper[4669]: I1008 20:45:42.227900 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:42Z","lastTransitionTime":"2025-10-08T20:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:42 crc kubenswrapper[4669]: I1008 20:45:42.329722 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:45:42 crc kubenswrapper[4669]: I1008 20:45:42.329769 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:45:42 crc kubenswrapper[4669]: I1008 20:45:42.329814 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:45:42 crc kubenswrapper[4669]: E1008 20:45:42.329847 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:45:42 crc kubenswrapper[4669]: E1008 20:45:42.329954 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:45:42 crc kubenswrapper[4669]: E1008 20:45:42.330092 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:45:42 crc kubenswrapper[4669]: I1008 20:45:42.331071 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:42 crc kubenswrapper[4669]: I1008 20:45:42.331128 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:42 crc kubenswrapper[4669]: I1008 20:45:42.331148 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:42 crc kubenswrapper[4669]: I1008 20:45:42.331173 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:42 crc kubenswrapper[4669]: I1008 20:45:42.331190 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:42Z","lastTransitionTime":"2025-10-08T20:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:42 crc kubenswrapper[4669]: I1008 20:45:42.434066 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:42 crc kubenswrapper[4669]: I1008 20:45:42.434119 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:42 crc kubenswrapper[4669]: I1008 20:45:42.434135 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:42 crc kubenswrapper[4669]: I1008 20:45:42.434157 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:42 crc kubenswrapper[4669]: I1008 20:45:42.434174 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:42Z","lastTransitionTime":"2025-10-08T20:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:42 crc kubenswrapper[4669]: I1008 20:45:42.537197 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:42 crc kubenswrapper[4669]: I1008 20:45:42.537248 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:42 crc kubenswrapper[4669]: I1008 20:45:42.537262 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:42 crc kubenswrapper[4669]: I1008 20:45:42.537291 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:42 crc kubenswrapper[4669]: I1008 20:45:42.537308 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:42Z","lastTransitionTime":"2025-10-08T20:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:42 crc kubenswrapper[4669]: I1008 20:45:42.640960 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:42 crc kubenswrapper[4669]: I1008 20:45:42.641289 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:42 crc kubenswrapper[4669]: I1008 20:45:42.641416 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:42 crc kubenswrapper[4669]: I1008 20:45:42.642002 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:42 crc kubenswrapper[4669]: I1008 20:45:42.642679 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:42Z","lastTransitionTime":"2025-10-08T20:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:42 crc kubenswrapper[4669]: I1008 20:45:42.745946 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:42 crc kubenswrapper[4669]: I1008 20:45:42.746016 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:42 crc kubenswrapper[4669]: I1008 20:45:42.746033 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:42 crc kubenswrapper[4669]: I1008 20:45:42.746061 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:42 crc kubenswrapper[4669]: I1008 20:45:42.746077 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:42Z","lastTransitionTime":"2025-10-08T20:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:42 crc kubenswrapper[4669]: I1008 20:45:42.849398 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:42 crc kubenswrapper[4669]: I1008 20:45:42.849472 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:42 crc kubenswrapper[4669]: I1008 20:45:42.849505 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:42 crc kubenswrapper[4669]: I1008 20:45:42.849566 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:42 crc kubenswrapper[4669]: I1008 20:45:42.849585 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:42Z","lastTransitionTime":"2025-10-08T20:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:42 crc kubenswrapper[4669]: I1008 20:45:42.952558 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:42 crc kubenswrapper[4669]: I1008 20:45:42.952850 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:42 crc kubenswrapper[4669]: I1008 20:45:42.952995 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:42 crc kubenswrapper[4669]: I1008 20:45:42.953099 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:42 crc kubenswrapper[4669]: I1008 20:45:42.953197 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:42Z","lastTransitionTime":"2025-10-08T20:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:43 crc kubenswrapper[4669]: I1008 20:45:43.056816 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:43 crc kubenswrapper[4669]: I1008 20:45:43.056883 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:43 crc kubenswrapper[4669]: I1008 20:45:43.056904 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:43 crc kubenswrapper[4669]: I1008 20:45:43.056935 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:43 crc kubenswrapper[4669]: I1008 20:45:43.056996 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:43Z","lastTransitionTime":"2025-10-08T20:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:43 crc kubenswrapper[4669]: I1008 20:45:43.159551 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:43 crc kubenswrapper[4669]: I1008 20:45:43.159580 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:43 crc kubenswrapper[4669]: I1008 20:45:43.159588 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:43 crc kubenswrapper[4669]: I1008 20:45:43.159602 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:43 crc kubenswrapper[4669]: I1008 20:45:43.159611 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:43Z","lastTransitionTime":"2025-10-08T20:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:43 crc kubenswrapper[4669]: I1008 20:45:43.262957 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:43 crc kubenswrapper[4669]: I1008 20:45:43.262994 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:43 crc kubenswrapper[4669]: I1008 20:45:43.263006 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:43 crc kubenswrapper[4669]: I1008 20:45:43.263018 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:43 crc kubenswrapper[4669]: I1008 20:45:43.263028 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:43Z","lastTransitionTime":"2025-10-08T20:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:43 crc kubenswrapper[4669]: I1008 20:45:43.330572 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:45:43 crc kubenswrapper[4669]: E1008 20:45:43.330690 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ml9vv" podUID="f90eed21-8bc2-4723-b6be-a672669a36fb" Oct 08 20:45:43 crc kubenswrapper[4669]: I1008 20:45:43.365189 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:43 crc kubenswrapper[4669]: I1008 20:45:43.365244 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:43 crc kubenswrapper[4669]: I1008 20:45:43.365257 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:43 crc kubenswrapper[4669]: I1008 20:45:43.365274 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:43 crc kubenswrapper[4669]: I1008 20:45:43.365289 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:43Z","lastTransitionTime":"2025-10-08T20:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:43 crc kubenswrapper[4669]: I1008 20:45:43.467828 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:43 crc kubenswrapper[4669]: I1008 20:45:43.467916 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:43 crc kubenswrapper[4669]: I1008 20:45:43.467933 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:43 crc kubenswrapper[4669]: I1008 20:45:43.467956 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:43 crc kubenswrapper[4669]: I1008 20:45:43.467973 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:43Z","lastTransitionTime":"2025-10-08T20:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:43 crc kubenswrapper[4669]: I1008 20:45:43.570264 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:43 crc kubenswrapper[4669]: I1008 20:45:43.570307 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:43 crc kubenswrapper[4669]: I1008 20:45:43.570320 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:43 crc kubenswrapper[4669]: I1008 20:45:43.570338 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:43 crc kubenswrapper[4669]: I1008 20:45:43.570353 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:43Z","lastTransitionTime":"2025-10-08T20:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:43 crc kubenswrapper[4669]: I1008 20:45:43.673006 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:43 crc kubenswrapper[4669]: I1008 20:45:43.673050 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:43 crc kubenswrapper[4669]: I1008 20:45:43.673062 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:43 crc kubenswrapper[4669]: I1008 20:45:43.673080 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:43 crc kubenswrapper[4669]: I1008 20:45:43.673092 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:43Z","lastTransitionTime":"2025-10-08T20:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:43 crc kubenswrapper[4669]: I1008 20:45:43.775425 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:43 crc kubenswrapper[4669]: I1008 20:45:43.775480 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:43 crc kubenswrapper[4669]: I1008 20:45:43.775489 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:43 crc kubenswrapper[4669]: I1008 20:45:43.775504 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:43 crc kubenswrapper[4669]: I1008 20:45:43.775515 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:43Z","lastTransitionTime":"2025-10-08T20:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:43 crc kubenswrapper[4669]: I1008 20:45:43.878938 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:43 crc kubenswrapper[4669]: I1008 20:45:43.879009 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:43 crc kubenswrapper[4669]: I1008 20:45:43.879023 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:43 crc kubenswrapper[4669]: I1008 20:45:43.879070 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:43 crc kubenswrapper[4669]: I1008 20:45:43.879088 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:43Z","lastTransitionTime":"2025-10-08T20:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:43 crc kubenswrapper[4669]: I1008 20:45:43.981802 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:43 crc kubenswrapper[4669]: I1008 20:45:43.981843 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:43 crc kubenswrapper[4669]: I1008 20:45:43.981857 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:43 crc kubenswrapper[4669]: I1008 20:45:43.981876 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:43 crc kubenswrapper[4669]: I1008 20:45:43.981887 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:43Z","lastTransitionTime":"2025-10-08T20:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:44 crc kubenswrapper[4669]: I1008 20:45:44.084164 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:44 crc kubenswrapper[4669]: I1008 20:45:44.084210 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:44 crc kubenswrapper[4669]: I1008 20:45:44.084219 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:44 crc kubenswrapper[4669]: I1008 20:45:44.084235 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:44 crc kubenswrapper[4669]: I1008 20:45:44.084244 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:44Z","lastTransitionTime":"2025-10-08T20:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:44 crc kubenswrapper[4669]: I1008 20:45:44.187193 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:44 crc kubenswrapper[4669]: I1008 20:45:44.187250 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:44 crc kubenswrapper[4669]: I1008 20:45:44.187263 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:44 crc kubenswrapper[4669]: I1008 20:45:44.187281 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:44 crc kubenswrapper[4669]: I1008 20:45:44.187294 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:44Z","lastTransitionTime":"2025-10-08T20:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:44 crc kubenswrapper[4669]: I1008 20:45:44.289424 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:44 crc kubenswrapper[4669]: I1008 20:45:44.289454 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:44 crc kubenswrapper[4669]: I1008 20:45:44.289463 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:44 crc kubenswrapper[4669]: I1008 20:45:44.289476 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:44 crc kubenswrapper[4669]: I1008 20:45:44.289484 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:44Z","lastTransitionTime":"2025-10-08T20:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:44 crc kubenswrapper[4669]: I1008 20:45:44.329930 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:45:44 crc kubenswrapper[4669]: I1008 20:45:44.330003 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:45:44 crc kubenswrapper[4669]: E1008 20:45:44.330081 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:45:44 crc kubenswrapper[4669]: I1008 20:45:44.330220 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:45:44 crc kubenswrapper[4669]: E1008 20:45:44.330270 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:45:44 crc kubenswrapper[4669]: E1008 20:45:44.330388 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:45:44 crc kubenswrapper[4669]: I1008 20:45:44.392882 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:44 crc kubenswrapper[4669]: I1008 20:45:44.393292 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:44 crc kubenswrapper[4669]: I1008 20:45:44.393309 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:44 crc kubenswrapper[4669]: I1008 20:45:44.393332 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:44 crc kubenswrapper[4669]: I1008 20:45:44.393352 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:44Z","lastTransitionTime":"2025-10-08T20:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:44 crc kubenswrapper[4669]: I1008 20:45:44.497334 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:44 crc kubenswrapper[4669]: I1008 20:45:44.497388 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:44 crc kubenswrapper[4669]: I1008 20:45:44.497398 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:44 crc kubenswrapper[4669]: I1008 20:45:44.497416 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:44 crc kubenswrapper[4669]: I1008 20:45:44.497429 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:44Z","lastTransitionTime":"2025-10-08T20:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:44 crc kubenswrapper[4669]: I1008 20:45:44.602501 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:44 crc kubenswrapper[4669]: I1008 20:45:44.602599 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:44 crc kubenswrapper[4669]: I1008 20:45:44.602617 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:44 crc kubenswrapper[4669]: I1008 20:45:44.602648 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:44 crc kubenswrapper[4669]: I1008 20:45:44.602664 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:44Z","lastTransitionTime":"2025-10-08T20:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:44 crc kubenswrapper[4669]: I1008 20:45:44.705427 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:44 crc kubenswrapper[4669]: I1008 20:45:44.705473 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:44 crc kubenswrapper[4669]: I1008 20:45:44.705488 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:44 crc kubenswrapper[4669]: I1008 20:45:44.705508 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:44 crc kubenswrapper[4669]: I1008 20:45:44.705522 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:44Z","lastTransitionTime":"2025-10-08T20:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:44 crc kubenswrapper[4669]: I1008 20:45:44.808053 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:44 crc kubenswrapper[4669]: I1008 20:45:44.808122 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:44 crc kubenswrapper[4669]: I1008 20:45:44.808152 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:44 crc kubenswrapper[4669]: I1008 20:45:44.808185 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:44 crc kubenswrapper[4669]: I1008 20:45:44.808208 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:44Z","lastTransitionTime":"2025-10-08T20:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:44 crc kubenswrapper[4669]: I1008 20:45:44.911139 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:44 crc kubenswrapper[4669]: I1008 20:45:44.911603 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:44 crc kubenswrapper[4669]: I1008 20:45:44.911873 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:44 crc kubenswrapper[4669]: I1008 20:45:44.912108 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:44 crc kubenswrapper[4669]: I1008 20:45:44.912341 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:44Z","lastTransitionTime":"2025-10-08T20:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:45 crc kubenswrapper[4669]: I1008 20:45:45.015473 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:45 crc kubenswrapper[4669]: I1008 20:45:45.015557 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:45 crc kubenswrapper[4669]: I1008 20:45:45.015573 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:45 crc kubenswrapper[4669]: I1008 20:45:45.015592 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:45 crc kubenswrapper[4669]: I1008 20:45:45.015604 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:45Z","lastTransitionTime":"2025-10-08T20:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:45 crc kubenswrapper[4669]: I1008 20:45:45.118828 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:45 crc kubenswrapper[4669]: I1008 20:45:45.118892 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:45 crc kubenswrapper[4669]: I1008 20:45:45.118907 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:45 crc kubenswrapper[4669]: I1008 20:45:45.118927 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:45 crc kubenswrapper[4669]: I1008 20:45:45.118940 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:45Z","lastTransitionTime":"2025-10-08T20:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:45 crc kubenswrapper[4669]: I1008 20:45:45.226552 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:45 crc kubenswrapper[4669]: I1008 20:45:45.226626 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:45 crc kubenswrapper[4669]: I1008 20:45:45.226639 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:45 crc kubenswrapper[4669]: I1008 20:45:45.226662 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:45 crc kubenswrapper[4669]: I1008 20:45:45.226676 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:45Z","lastTransitionTime":"2025-10-08T20:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:45 crc kubenswrapper[4669]: I1008 20:45:45.329547 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:45 crc kubenswrapper[4669]: I1008 20:45:45.329596 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:45 crc kubenswrapper[4669]: I1008 20:45:45.329611 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:45 crc kubenswrapper[4669]: I1008 20:45:45.329628 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:45 crc kubenswrapper[4669]: I1008 20:45:45.329643 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:45Z","lastTransitionTime":"2025-10-08T20:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:45 crc kubenswrapper[4669]: I1008 20:45:45.329772 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:45:45 crc kubenswrapper[4669]: E1008 20:45:45.329906 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ml9vv" podUID="f90eed21-8bc2-4723-b6be-a672669a36fb" Oct 08 20:45:45 crc kubenswrapper[4669]: I1008 20:45:45.434872 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:45 crc kubenswrapper[4669]: I1008 20:45:45.434924 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:45 crc kubenswrapper[4669]: I1008 20:45:45.434935 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:45 crc kubenswrapper[4669]: I1008 20:45:45.434954 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:45 crc kubenswrapper[4669]: I1008 20:45:45.434966 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:45Z","lastTransitionTime":"2025-10-08T20:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:45 crc kubenswrapper[4669]: I1008 20:45:45.537106 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:45 crc kubenswrapper[4669]: I1008 20:45:45.537152 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:45 crc kubenswrapper[4669]: I1008 20:45:45.537163 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:45 crc kubenswrapper[4669]: I1008 20:45:45.537178 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:45 crc kubenswrapper[4669]: I1008 20:45:45.537190 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:45Z","lastTransitionTime":"2025-10-08T20:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:45 crc kubenswrapper[4669]: I1008 20:45:45.639705 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:45 crc kubenswrapper[4669]: I1008 20:45:45.639790 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:45 crc kubenswrapper[4669]: I1008 20:45:45.639809 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:45 crc kubenswrapper[4669]: I1008 20:45:45.639834 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:45 crc kubenswrapper[4669]: I1008 20:45:45.639851 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:45Z","lastTransitionTime":"2025-10-08T20:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:45 crc kubenswrapper[4669]: I1008 20:45:45.742836 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:45 crc kubenswrapper[4669]: I1008 20:45:45.742867 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:45 crc kubenswrapper[4669]: I1008 20:45:45.742877 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:45 crc kubenswrapper[4669]: I1008 20:45:45.742892 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:45 crc kubenswrapper[4669]: I1008 20:45:45.742901 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:45Z","lastTransitionTime":"2025-10-08T20:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:45 crc kubenswrapper[4669]: I1008 20:45:45.845296 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:45 crc kubenswrapper[4669]: I1008 20:45:45.845325 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:45 crc kubenswrapper[4669]: I1008 20:45:45.845335 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:45 crc kubenswrapper[4669]: I1008 20:45:45.845350 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:45 crc kubenswrapper[4669]: I1008 20:45:45.845361 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:45Z","lastTransitionTime":"2025-10-08T20:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:45 crc kubenswrapper[4669]: I1008 20:45:45.948098 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:45 crc kubenswrapper[4669]: I1008 20:45:45.948161 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:45 crc kubenswrapper[4669]: I1008 20:45:45.948182 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:45 crc kubenswrapper[4669]: I1008 20:45:45.948212 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:45 crc kubenswrapper[4669]: I1008 20:45:45.948236 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:45Z","lastTransitionTime":"2025-10-08T20:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:46 crc kubenswrapper[4669]: I1008 20:45:46.050851 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:46 crc kubenswrapper[4669]: I1008 20:45:46.051098 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:46 crc kubenswrapper[4669]: I1008 20:45:46.051211 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:46 crc kubenswrapper[4669]: I1008 20:45:46.051305 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:46 crc kubenswrapper[4669]: I1008 20:45:46.051386 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:46Z","lastTransitionTime":"2025-10-08T20:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:46 crc kubenswrapper[4669]: I1008 20:45:46.153851 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:46 crc kubenswrapper[4669]: I1008 20:45:46.153899 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:46 crc kubenswrapper[4669]: I1008 20:45:46.153915 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:46 crc kubenswrapper[4669]: I1008 20:45:46.153933 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:46 crc kubenswrapper[4669]: I1008 20:45:46.153942 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:46Z","lastTransitionTime":"2025-10-08T20:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:46 crc kubenswrapper[4669]: I1008 20:45:46.256185 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:46 crc kubenswrapper[4669]: I1008 20:45:46.256453 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:46 crc kubenswrapper[4669]: I1008 20:45:46.256568 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:46 crc kubenswrapper[4669]: I1008 20:45:46.256653 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:46 crc kubenswrapper[4669]: I1008 20:45:46.256725 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:46Z","lastTransitionTime":"2025-10-08T20:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:46 crc kubenswrapper[4669]: I1008 20:45:46.329971 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:45:46 crc kubenswrapper[4669]: I1008 20:45:46.330087 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:45:46 crc kubenswrapper[4669]: I1008 20:45:46.330356 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:45:46 crc kubenswrapper[4669]: E1008 20:45:46.330602 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:45:46 crc kubenswrapper[4669]: E1008 20:45:46.330787 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:45:46 crc kubenswrapper[4669]: E1008 20:45:46.331008 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:45:46 crc kubenswrapper[4669]: I1008 20:45:46.359517 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:46 crc kubenswrapper[4669]: I1008 20:45:46.359835 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:46 crc kubenswrapper[4669]: I1008 20:45:46.359923 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:46 crc kubenswrapper[4669]: I1008 20:45:46.360022 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:46 crc kubenswrapper[4669]: I1008 20:45:46.360113 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:46Z","lastTransitionTime":"2025-10-08T20:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:46 crc kubenswrapper[4669]: I1008 20:45:46.462395 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:46 crc kubenswrapper[4669]: I1008 20:45:46.462455 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:46 crc kubenswrapper[4669]: I1008 20:45:46.462474 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:46 crc kubenswrapper[4669]: I1008 20:45:46.462497 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:46 crc kubenswrapper[4669]: I1008 20:45:46.462516 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:46Z","lastTransitionTime":"2025-10-08T20:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:46 crc kubenswrapper[4669]: I1008 20:45:46.565196 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:46 crc kubenswrapper[4669]: I1008 20:45:46.565257 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:46 crc kubenswrapper[4669]: I1008 20:45:46.565273 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:46 crc kubenswrapper[4669]: I1008 20:45:46.565293 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:46 crc kubenswrapper[4669]: I1008 20:45:46.565308 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:46Z","lastTransitionTime":"2025-10-08T20:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:46 crc kubenswrapper[4669]: I1008 20:45:46.667852 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:46 crc kubenswrapper[4669]: I1008 20:45:46.667899 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:46 crc kubenswrapper[4669]: I1008 20:45:46.667912 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:46 crc kubenswrapper[4669]: I1008 20:45:46.667931 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:46 crc kubenswrapper[4669]: I1008 20:45:46.667944 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:46Z","lastTransitionTime":"2025-10-08T20:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:46 crc kubenswrapper[4669]: I1008 20:45:46.770454 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:46 crc kubenswrapper[4669]: I1008 20:45:46.770499 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:46 crc kubenswrapper[4669]: I1008 20:45:46.770514 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:46 crc kubenswrapper[4669]: I1008 20:45:46.770564 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:46 crc kubenswrapper[4669]: I1008 20:45:46.770584 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:46Z","lastTransitionTime":"2025-10-08T20:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:46 crc kubenswrapper[4669]: I1008 20:45:46.873678 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:46 crc kubenswrapper[4669]: I1008 20:45:46.873739 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:46 crc kubenswrapper[4669]: I1008 20:45:46.873758 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:46 crc kubenswrapper[4669]: I1008 20:45:46.873783 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:46 crc kubenswrapper[4669]: I1008 20:45:46.873801 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:46Z","lastTransitionTime":"2025-10-08T20:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:46 crc kubenswrapper[4669]: I1008 20:45:46.976351 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:46 crc kubenswrapper[4669]: I1008 20:45:46.976417 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:46 crc kubenswrapper[4669]: I1008 20:45:46.976435 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:46 crc kubenswrapper[4669]: I1008 20:45:46.976460 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:46 crc kubenswrapper[4669]: I1008 20:45:46.976477 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:46Z","lastTransitionTime":"2025-10-08T20:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:47 crc kubenswrapper[4669]: I1008 20:45:47.078566 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:47 crc kubenswrapper[4669]: I1008 20:45:47.078651 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:47 crc kubenswrapper[4669]: I1008 20:45:47.078670 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:47 crc kubenswrapper[4669]: I1008 20:45:47.078695 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:47 crc kubenswrapper[4669]: I1008 20:45:47.078715 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:47Z","lastTransitionTime":"2025-10-08T20:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:47 crc kubenswrapper[4669]: I1008 20:45:47.181422 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:47 crc kubenswrapper[4669]: I1008 20:45:47.181477 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:47 crc kubenswrapper[4669]: I1008 20:45:47.181496 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:47 crc kubenswrapper[4669]: I1008 20:45:47.181521 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:47 crc kubenswrapper[4669]: I1008 20:45:47.181573 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:47Z","lastTransitionTime":"2025-10-08T20:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:47 crc kubenswrapper[4669]: I1008 20:45:47.284462 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:47 crc kubenswrapper[4669]: I1008 20:45:47.284512 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:47 crc kubenswrapper[4669]: I1008 20:45:47.284561 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:47 crc kubenswrapper[4669]: I1008 20:45:47.284586 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:47 crc kubenswrapper[4669]: I1008 20:45:47.284604 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:47Z","lastTransitionTime":"2025-10-08T20:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:47 crc kubenswrapper[4669]: I1008 20:45:47.331770 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:45:47 crc kubenswrapper[4669]: E1008 20:45:47.331883 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ml9vv" podUID="f90eed21-8bc2-4723-b6be-a672669a36fb" Oct 08 20:45:47 crc kubenswrapper[4669]: I1008 20:45:47.387058 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:47 crc kubenswrapper[4669]: I1008 20:45:47.387087 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:47 crc kubenswrapper[4669]: I1008 20:45:47.387098 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:47 crc kubenswrapper[4669]: I1008 20:45:47.387112 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:47 crc kubenswrapper[4669]: I1008 20:45:47.387122 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:47Z","lastTransitionTime":"2025-10-08T20:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:47 crc kubenswrapper[4669]: I1008 20:45:47.489745 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:47 crc kubenswrapper[4669]: I1008 20:45:47.489790 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:47 crc kubenswrapper[4669]: I1008 20:45:47.489829 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:47 crc kubenswrapper[4669]: I1008 20:45:47.489850 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:47 crc kubenswrapper[4669]: I1008 20:45:47.489869 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:47Z","lastTransitionTime":"2025-10-08T20:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:47 crc kubenswrapper[4669]: I1008 20:45:47.592496 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:47 crc kubenswrapper[4669]: I1008 20:45:47.592560 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:47 crc kubenswrapper[4669]: I1008 20:45:47.592569 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:47 crc kubenswrapper[4669]: I1008 20:45:47.592585 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:47 crc kubenswrapper[4669]: I1008 20:45:47.592594 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:47Z","lastTransitionTime":"2025-10-08T20:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:47 crc kubenswrapper[4669]: I1008 20:45:47.695064 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:47 crc kubenswrapper[4669]: I1008 20:45:47.695106 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:47 crc kubenswrapper[4669]: I1008 20:45:47.695119 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:47 crc kubenswrapper[4669]: I1008 20:45:47.695137 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:47 crc kubenswrapper[4669]: I1008 20:45:47.695150 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:47Z","lastTransitionTime":"2025-10-08T20:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:47 crc kubenswrapper[4669]: I1008 20:45:47.799130 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:47 crc kubenswrapper[4669]: I1008 20:45:47.799172 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:47 crc kubenswrapper[4669]: I1008 20:45:47.799182 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:47 crc kubenswrapper[4669]: I1008 20:45:47.799196 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:47 crc kubenswrapper[4669]: I1008 20:45:47.799206 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:47Z","lastTransitionTime":"2025-10-08T20:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:47 crc kubenswrapper[4669]: I1008 20:45:47.901566 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:47 crc kubenswrapper[4669]: I1008 20:45:47.901634 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:47 crc kubenswrapper[4669]: I1008 20:45:47.901647 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:47 crc kubenswrapper[4669]: I1008 20:45:47.901664 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:47 crc kubenswrapper[4669]: I1008 20:45:47.901677 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:47Z","lastTransitionTime":"2025-10-08T20:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:48 crc kubenswrapper[4669]: I1008 20:45:48.004838 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:48 crc kubenswrapper[4669]: I1008 20:45:48.005154 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:48 crc kubenswrapper[4669]: I1008 20:45:48.005323 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:48 crc kubenswrapper[4669]: I1008 20:45:48.005490 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:48 crc kubenswrapper[4669]: I1008 20:45:48.005684 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:48Z","lastTransitionTime":"2025-10-08T20:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:48 crc kubenswrapper[4669]: I1008 20:45:48.108697 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:48 crc kubenswrapper[4669]: I1008 20:45:48.108782 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:48 crc kubenswrapper[4669]: I1008 20:45:48.108809 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:48 crc kubenswrapper[4669]: I1008 20:45:48.108838 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:48 crc kubenswrapper[4669]: I1008 20:45:48.108858 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:48Z","lastTransitionTime":"2025-10-08T20:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:48 crc kubenswrapper[4669]: I1008 20:45:48.211304 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:48 crc kubenswrapper[4669]: I1008 20:45:48.211339 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:48 crc kubenswrapper[4669]: I1008 20:45:48.211351 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:48 crc kubenswrapper[4669]: I1008 20:45:48.211370 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:48 crc kubenswrapper[4669]: I1008 20:45:48.211382 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:48Z","lastTransitionTime":"2025-10-08T20:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:48 crc kubenswrapper[4669]: I1008 20:45:48.314428 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:48 crc kubenswrapper[4669]: I1008 20:45:48.314492 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:48 crc kubenswrapper[4669]: I1008 20:45:48.314509 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:48 crc kubenswrapper[4669]: I1008 20:45:48.314563 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:48 crc kubenswrapper[4669]: I1008 20:45:48.314582 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:48Z","lastTransitionTime":"2025-10-08T20:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:48 crc kubenswrapper[4669]: I1008 20:45:48.330601 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:45:48 crc kubenswrapper[4669]: E1008 20:45:48.330827 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:45:48 crc kubenswrapper[4669]: I1008 20:45:48.330685 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:45:48 crc kubenswrapper[4669]: E1008 20:45:48.331091 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:45:48 crc kubenswrapper[4669]: I1008 20:45:48.330633 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:45:48 crc kubenswrapper[4669]: E1008 20:45:48.331341 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:45:48 crc kubenswrapper[4669]: I1008 20:45:48.418346 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:48 crc kubenswrapper[4669]: I1008 20:45:48.418411 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:48 crc kubenswrapper[4669]: I1008 20:45:48.418426 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:48 crc kubenswrapper[4669]: I1008 20:45:48.418443 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:48 crc kubenswrapper[4669]: I1008 20:45:48.418456 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:48Z","lastTransitionTime":"2025-10-08T20:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:48 crc kubenswrapper[4669]: I1008 20:45:48.521204 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:48 crc kubenswrapper[4669]: I1008 20:45:48.521247 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:48 crc kubenswrapper[4669]: I1008 20:45:48.521256 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:48 crc kubenswrapper[4669]: I1008 20:45:48.521285 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:48 crc kubenswrapper[4669]: I1008 20:45:48.521295 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:48Z","lastTransitionTime":"2025-10-08T20:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:48 crc kubenswrapper[4669]: I1008 20:45:48.623077 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:48 crc kubenswrapper[4669]: I1008 20:45:48.623115 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:48 crc kubenswrapper[4669]: I1008 20:45:48.623126 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:48 crc kubenswrapper[4669]: I1008 20:45:48.623143 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:48 crc kubenswrapper[4669]: I1008 20:45:48.623156 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:48Z","lastTransitionTime":"2025-10-08T20:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:48 crc kubenswrapper[4669]: I1008 20:45:48.725869 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:48 crc kubenswrapper[4669]: I1008 20:45:48.725934 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:48 crc kubenswrapper[4669]: I1008 20:45:48.725957 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:48 crc kubenswrapper[4669]: I1008 20:45:48.725985 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:48 crc kubenswrapper[4669]: I1008 20:45:48.726007 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:48Z","lastTransitionTime":"2025-10-08T20:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:48 crc kubenswrapper[4669]: I1008 20:45:48.828357 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:48 crc kubenswrapper[4669]: I1008 20:45:48.828429 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:48 crc kubenswrapper[4669]: I1008 20:45:48.828447 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:48 crc kubenswrapper[4669]: I1008 20:45:48.828472 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:48 crc kubenswrapper[4669]: I1008 20:45:48.828489 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:48Z","lastTransitionTime":"2025-10-08T20:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:48 crc kubenswrapper[4669]: I1008 20:45:48.931083 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:48 crc kubenswrapper[4669]: I1008 20:45:48.931143 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:48 crc kubenswrapper[4669]: I1008 20:45:48.931163 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:48 crc kubenswrapper[4669]: I1008 20:45:48.931188 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:48 crc kubenswrapper[4669]: I1008 20:45:48.931205 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:48Z","lastTransitionTime":"2025-10-08T20:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.034155 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.034212 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.034230 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.034253 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.034270 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:49Z","lastTransitionTime":"2025-10-08T20:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.137577 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.137625 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.137644 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.137668 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.137686 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:49Z","lastTransitionTime":"2025-10-08T20:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.239874 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.239918 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.239935 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.239956 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.239971 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:49Z","lastTransitionTime":"2025-10-08T20:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.330230 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:45:49 crc kubenswrapper[4669]: E1008 20:45:49.330893 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ml9vv" podUID="f90eed21-8bc2-4723-b6be-a672669a36fb" Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.331423 4669 scope.go:117] "RemoveContainer" containerID="6b2057662a5102fd118da847c6e54ac823bfe6b76443d347d2b60a7c8728d6d7" Oct 08 20:45:49 crc kubenswrapper[4669]: E1008 20:45:49.331875 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-gpzdw_openshift-ovn-kubernetes(cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.341381 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.341419 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.341430 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.341444 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.341455 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:49Z","lastTransitionTime":"2025-10-08T20:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.444486 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.444522 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.444545 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.444558 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.444569 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:49Z","lastTransitionTime":"2025-10-08T20:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.546566 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.546594 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.546602 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.546615 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.546628 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:49Z","lastTransitionTime":"2025-10-08T20:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.650172 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.650222 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.650239 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.650263 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.650279 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:49Z","lastTransitionTime":"2025-10-08T20:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.658556 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f90eed21-8bc2-4723-b6be-a672669a36fb-metrics-certs\") pod \"network-metrics-daemon-ml9vv\" (UID: \"f90eed21-8bc2-4723-b6be-a672669a36fb\") " pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:45:49 crc kubenswrapper[4669]: E1008 20:45:49.658674 4669 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 20:45:49 crc kubenswrapper[4669]: E1008 20:45:49.658738 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f90eed21-8bc2-4723-b6be-a672669a36fb-metrics-certs podName:f90eed21-8bc2-4723-b6be-a672669a36fb nodeName:}" failed. No retries permitted until 2025-10-08 20:46:21.658718537 +0000 UTC m=+101.351529210 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f90eed21-8bc2-4723-b6be-a672669a36fb-metrics-certs") pod "network-metrics-daemon-ml9vv" (UID: "f90eed21-8bc2-4723-b6be-a672669a36fb") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.752472 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.752549 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.752559 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.752575 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.752588 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:49Z","lastTransitionTime":"2025-10-08T20:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.855162 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.855196 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.855205 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.855218 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.855229 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:49Z","lastTransitionTime":"2025-10-08T20:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.956173 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.956223 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.956239 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.956256 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.956270 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:49Z","lastTransitionTime":"2025-10-08T20:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:49 crc kubenswrapper[4669]: E1008 20:45:49.970449 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf950064-edbb-4bec-8a75-ab8d963fcdb3\\\",\\\"systemUUID\\\":\\\"527fa759-e25f-4fb3-8304-f30dbff0c847\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:49Z is after 2025-08-24T17:21:41Z"
Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.974304 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.974339 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.974350 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.974365 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.974379 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:49Z","lastTransitionTime":"2025-10-08T20:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
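Every status-patch retry in this stretch fails for the same reason, stated at the tail of the error: the serving certificate of the node.network-node-identity.openshift.io webhook on 127.0.0.1:9743 has a notAfter of 2025-08-24T17:21:41Z, while the node clock reads 2025-10-08T20:45:49Z. A minimal sketch of the comparison the TLS verifier is making, using only the two timestamps quoted verbatim in the log (the day count is derived here, not logged):

```python
from datetime import datetime, timezone

# Both timestamps are taken verbatim from the kubelet error message.
not_after = datetime(2025, 8, 24, 17, 21, 41, tzinfo=timezone.utc)  # cert notAfter
current = datetime(2025, 10, 8, 20, 45, 49, tzinfo=timezone.utc)    # node clock

# x509 validation fails whenever the current time is past notAfter.
expired_for = current - not_after
print(current > not_after)   # → True: "certificate has expired"
print(expired_for.days)      # → 45 (cert lapsed about a month and a half earlier)
```

Because the webhook rejects every patch, the kubelet keeps retrying the identical payload, which is why the same multi-kilobyte status JSON recurs below with only the log timestamps advancing.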
Has your network provider started?"} Oct 08 20:45:49 crc kubenswrapper[4669]: E1008 20:45:49.993164 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf950064-edbb-4bec-8a75-ab8d963fcdb3\\\",\\\"systemUUID\\\":\\\"527fa759-e25f-4fb3-8304-f30dbff0c847\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:49Z is after 2025-08-24T17:21:41Z"
Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.997295 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.997324 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.997336 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.997350 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 20:45:49 crc kubenswrapper[4669]: I1008 20:45:49.997360 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:49Z","lastTransitionTime":"2025-10-08T20:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:50 crc kubenswrapper[4669]: E1008 20:45:50.014287 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf950064-edbb-4bec-8a75-ab8d963fcdb3\\\",\\\"systemUUID\\\":\\\"527fa759-e25f-4fb3-8304-f30dbff0c847\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:50Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.018112 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.018140 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.018159 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.018178 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.018189 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:50Z","lastTransitionTime":"2025-10-08T20:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:50 crc kubenswrapper[4669]: E1008 20:45:50.031750 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf950064-edbb-4bec-8a75-ab8d963fcdb3\\\",\\\"systemUUID\\\":\\\"527fa759-e25f-4fb3-8304-f30dbff0c847\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:50Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.035193 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.035222 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.035230 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.035245 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.035255 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:50Z","lastTransitionTime":"2025-10-08T20:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:50 crc kubenswrapper[4669]: E1008 20:45:50.052311 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:45:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf950064-edbb-4bec-8a75-ab8d963fcdb3\\\",\\\"systemUUID\\\":\\\"527fa759-e25f-4fb3-8304-f30dbff0c847\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:50Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:50 crc kubenswrapper[4669]: E1008 20:45:50.052562 4669 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.054283 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.054326 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.054342 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.054364 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.054381 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:50Z","lastTransitionTime":"2025-10-08T20:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.156827 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.156872 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.156884 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.156901 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.156920 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:50Z","lastTransitionTime":"2025-10-08T20:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.259494 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.259557 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.259569 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.259587 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.259619 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:50Z","lastTransitionTime":"2025-10-08T20:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.330247 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.330335 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:45:50 crc kubenswrapper[4669]: E1008 20:45:50.330417 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:45:50 crc kubenswrapper[4669]: E1008 20:45:50.330575 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.330939 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:45:50 crc kubenswrapper[4669]: E1008 20:45:50.331179 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.362612 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.362957 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.363093 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.363181 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.363256 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:50Z","lastTransitionTime":"2025-10-08T20:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.466775 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.466828 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.466838 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.466853 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.466865 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:50Z","lastTransitionTime":"2025-10-08T20:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.569105 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.569138 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.569147 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.569160 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.569169 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:50Z","lastTransitionTime":"2025-10-08T20:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.672653 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.672728 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.672743 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.672761 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.672774 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:50Z","lastTransitionTime":"2025-10-08T20:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.775636 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.775681 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.775697 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.775719 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.775737 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:50Z","lastTransitionTime":"2025-10-08T20:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.878643 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.878675 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.878688 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.878702 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.878714 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:50Z","lastTransitionTime":"2025-10-08T20:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.980851 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.980891 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.980902 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.980924 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:50 crc kubenswrapper[4669]: I1008 20:45:50.980936 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:50Z","lastTransitionTime":"2025-10-08T20:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.084310 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.084351 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.084362 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.084382 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.084392 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:51Z","lastTransitionTime":"2025-10-08T20:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.187471 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.187511 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.187520 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.187552 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.187562 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:51Z","lastTransitionTime":"2025-10-08T20:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.289618 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.289687 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.289702 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.289718 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.289729 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:51Z","lastTransitionTime":"2025-10-08T20:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.330412 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:45:51 crc kubenswrapper[4669]: E1008 20:45:51.330605 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ml9vv" podUID="f90eed21-8bc2-4723-b6be-a672669a36fb" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.344066 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zcf2d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a016bee1-2c29-46bb-b3b8-841c4a65e162\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fa5b9befc8fb3a83cb6dd6097014bfe9fd0b905b4bf8fbdcccd4fdfb62ab410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/service
account\\\",\\\"name\\\":\\\"kube-api-access-flsl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zcf2d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:51Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.357657 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c9bcf2-9580-4534-8c7e-886bd4aff469\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d8b81cfea1e9e0c9b30427e8b8
cb07b463c6ef45afb8379aa006d71bccd82a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bd09b1fcc78173d03292522a284e68e59f374def13fd6830f24a31e1138c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\
"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw2kf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:51Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.368468 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bl6pv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ac60c10-afa3-424e-9aa2-060e32f4a40f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91277ee733ac8aee89b1a7716b6dcebf57e7d24e5cab5615d88ac8ff90f6f5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbw65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e8bd9bc559c9623c06aa2f0324a6679f5d241e881db918904f3e1e97d56a20f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbw65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bl6pv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:51Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.380910 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ml9vv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f90eed21-8bc2-4723-b6be-a672669a36fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bh59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bh59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ml9vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:51Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:51 crc 
kubenswrapper[4669]: I1008 20:45:51.391961 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.392003 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.392016 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.392033 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.392047 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:51Z","lastTransitionTime":"2025-10-08T20:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.394274 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:51Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.405200 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c397b74921593a42fb7626e545778d80c506f0bbce7bc425b75c77a222c770e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T20:45:51Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.426430 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://408dd840918000b1689c3d828a51173deebf8d00fc97450975b35e5149d3cfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://334a09deac921308c4d6053bdcc2bbc096acc8ec24875190efb1c07b22d01c69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c03e0c827468d80fa326ee46ee88ad6adfe4236f4df9843324d2b247d0716087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92bc23ad705dcc8b8524159bc37254ce2306e7b502b914eaac7a6525fdd44f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed9a574189bcc7f84b93c5e821e944b0f94679084a30270d6634c7d19e67c470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2057662a5102fd118da847c6e54ac823bfe6b76443d347d2b60a7c8728d6d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b2057662a5102fd118da847c6e54ac823bfe6b76443d347d2b60a7c8728d6d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T20:45:33Z\\\",\\\"message\\\":\\\"nat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.176:1936: 10.217.4.176:443: 10.217.4.176:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {595f6e90-7cd8-4871-85ab-9519d3c9c3e5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1008 20:45:33.198483 6348 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI1008 20:45:33.198486 6348 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-hw2kf\\\\nF1008 20:45:33.198495 6348 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet val\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gpzdw_openshift-ovn-kubernetes(cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40a
c3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpzdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:51Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.438012 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d080d327-7e4d-41af-aa15-0ce849523815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8127834da98ef46a594a74cbfcc6ef779b8429046327546560b7b37085572c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f61c1793e6c95085b6964298f29b5f896451784046a6aee1c73bbda234a3bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd3bc937fc2e56c3d332e4d3822a2749d040c57cd94f6e3bcdcfd83c126bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33aff5ef2ae82f810d3b3e66effb80087fa92081419227e4fb66a6aa80468ff7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T20:45:02Z\\\"
,\\\"message\\\":\\\"W1008 20:44:56.871605 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 20:44:56.872089 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759956296 cert, and key in /tmp/serving-cert-3424828285/serving-signer.crt, /tmp/serving-cert-3424828285/serving-signer.key\\\\nI1008 20:44:57.365674 1 observer_polling.go:159] Starting file observer\\\\nW1008 20:45:02.381062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1008 20:45:02.381192 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 20:45:02.381876 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3424828285/tls.crt::/tmp/serving-cert-3424828285/tls.key\\\\\\\"\\\\nI1008 20:45:02.718633 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 20:45:02.726325 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 20:45:02.726358 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 20:45:02.726380 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 20:45:02.726384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 20:45:02.731456 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 20:45:02.731985 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 20:45:02.731867 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 20:45:02.733228 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d213380e32b3db218facfef313963d26689d2f0871d2a004a63380454fac8a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:51Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.448172 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c0e064d976a7c307fd13ec11ae76672cc1225b71a616f171626ee1f9a24531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:51Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.458542 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:51Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.470122 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bfcvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09be2dce24bba0d88a36f2d85e6280e6806f9b6cf59ec3950513e976c97429e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d12e
f6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d12ef6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bfcvh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:51Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.478118 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-flswm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"609156f9-39b1-4330-83a2-eabf82f4228f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749545f8f4b6269a70b747fee79dc8d419b62054f507b0d819b63aa68c44bb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-flswm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:51Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.487899 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b822af4b-b157-4b05-9af4-7798315f365f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d615b49ade5de43393d40344c1b71733acedb541841b3ec34d6dd293e62f96c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07c33fe9c40fb9b53e940940c3fe2b8b63a94b0f867aa804d215cb3ba90d01c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569ba2e70b947eea1e531ab7e8f1ac2e3441ade593dd48910407df766217d87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f2d8af11793121a84b4559833f410bd59a8bb122d88da0d3b55d7dcbbf57a9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:51Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.494033 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.494071 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.494083 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.494123 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.494134 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:51Z","lastTransitionTime":"2025-10-08T20:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.501337 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc1b4218-68c0-4c48-a495-f8539e06d444\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://524972a79ac73180ccc655f37054721fb478bf263c711e814c9b49cc4f1a76ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}
,{\\\"containerID\\\":\\\"cri-o://67e4a88a6084b96798f461c3427f491577d74e6da859263a8c59545395cf029a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3133419a0643dc2be6a13a30c87b23a13965c59841d991db7ec80d5e53ca2840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8abb50fc1491bed6db7f23e79900b0223d3741a87a9a5545c144252a077353b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\
",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8abb50fc1491bed6db7f23e79900b0223d3741a87a9a5545c144252a077353b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:51Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.513352 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:51Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.525884 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c2b44fd8fb3c01bbc8a1b2f5a3507af28b2aa79a3d6ab8e7de3945bbfd01e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a472679f03ab86aa0a31a2ff3affe48d8e289a76db949bcc6ea10446fd08fdd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:51Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.543681 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-klx9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2433400c-98f8-490f-a566-00a330a738fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863b0630ebde7534e93ebf2952dab729566760278539e87efa4412389803c5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-klx9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:51Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.561737 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0319f7-8ee3-4392-a36a-419161391db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7f52f3d22574d0a01cdfd7b7a40caf1a6cf201dc719e35f40eae85a071286f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3064f5dde5317ed6c1dba4ecdcf4da81c2451262d83e3e2826c6ebbfe1487ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13697e6470d481451982948653db44d08baa70466d010442534eaa249e58bae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c88256a72c667695563af6b37d01d958621c1ca6cbdaf474364bd6c8128c4409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6efb0bccc51deff2303655e7a8d3a6261a8b3c9425f6d94120cd1acf27fd7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:51Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.596160 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.596214 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.596229 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.596250 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.596265 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:51Z","lastTransitionTime":"2025-10-08T20:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.698805 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.698866 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.698884 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.698908 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.698925 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:51Z","lastTransitionTime":"2025-10-08T20:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.742173 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-klx9r_2433400c-98f8-490f-a566-00a330a738fe/kube-multus/0.log" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.742224 4669 generic.go:334] "Generic (PLEG): container finished" podID="2433400c-98f8-490f-a566-00a330a738fe" containerID="863b0630ebde7534e93ebf2952dab729566760278539e87efa4412389803c5ee" exitCode=1 Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.742257 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-klx9r" event={"ID":"2433400c-98f8-490f-a566-00a330a738fe","Type":"ContainerDied","Data":"863b0630ebde7534e93ebf2952dab729566760278539e87efa4412389803c5ee"} Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.742700 4669 scope.go:117] "RemoveContainer" containerID="863b0630ebde7534e93ebf2952dab729566760278539e87efa4412389803c5ee" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.757394 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b822af4b-b157-4b05-9af4-7798315f365f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d615b49ade5de43393d40344c1b71733acedb541841b3ec34d6dd293e62f96c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07c33fe9c40fb9b53e940940c3fe2b8b63a94b0f867aa804d215cb3ba90d01c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569ba2e70b947eea1e531ab7e8f1ac2e3441ade593dd48910407df766217d87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f2d8af11793121a84b4559833f410bd59a8bb122d88da0d3b55d7dcbbf57a9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:51Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.771062 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc1b4218-68c0-4c48-a495-f8539e06d444\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://524972a79ac73180ccc655f37054721fb478bf263c711e814c9b49cc4f1a76ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67e4a88a6084b96798f461c3427f491577d74e6da859263a8c59545395cf029a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3133419a0643dc2be6a13a30c87b23a13965c59841d991db7ec80d5e53ca2840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8abb50fc1491bed6db7f23e79900b0223d3741a87a9a5545c144252a077353b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://8abb50fc1491bed6db7f23e79900b0223d3741a87a9a5545c144252a077353b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:51Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.790415 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:51Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.801470 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.801509 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.801537 4669 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.801553 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.801586 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:51Z","lastTransitionTime":"2025-10-08T20:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.811033 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bfcvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09be2dce24bba0d88a36
f2d85e6280e6806f9b6cf59ec3950513e976c97429e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64
d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d12ef6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d12ef6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bfcvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:51Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.821014 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-flswm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"609156f9-39b1-4330-83a2-eabf82f4228f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749545f8f4b6269a70b747fee79dc8d419b62054f507b0d819b63aa68c44bb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-flswm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:51Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.838830 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0319f7-8ee3-4392-a36a-419161391db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7f52f3d22574d0a01cdfd7b7a40caf1a6cf201dc719e35f40eae85a071286f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3064f5dde5317ed6c1dba4ecdcf4da81c2451262d83e3e2826c6ebbfe1487ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13697e6470d481451982948653db44d08baa70466d010442534eaa249e58bae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20
:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c88256a72c667695563af6b37d01d958621c1ca6cbdaf474364bd6c8128c4409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6efb0bccc51deff2303655e7a8d3a6261a8b3c9425f6d94120cd1acf27fd7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:51Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.854263 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:51Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.869900 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c2b44fd8fb3c01bbc8a1b2f5a3507af28b2aa79a3d6ab8e7de3945bbfd01e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a472679f03ab86aa0a31a2ff3affe48d8e289a76db949bcc6ea10446fd08fdd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:51Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.883638 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-klx9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2433400c-98f8-490f-a566-00a330a738fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863b0630ebde7534e93ebf2952dab729566760278539e87efa4412389803c5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://863b0630ebde7534e93ebf2952dab729566760278539e87efa4412389803c5ee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T20:45:51Z\\\",\\\"message\\\":\\\"2025-10-08T20:45:06+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a3972dbd-537f-4d3b-a2ea-f565277a9edf\\\\n2025-10-08T20:45:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a3972dbd-537f-4d3b-a2ea-f565277a9edf to /host/opt/cni/bin/\\\\n2025-10-08T20:45:06Z [verbose] multus-daemon started\\\\n2025-10-08T20:45:06Z [verbose] Readiness Indicator file check\\\\n2025-10-08T20:45:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-klx9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:51Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.897824 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:51Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.904252 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.904300 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.904314 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 
20:45:51.904332 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.904346 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:51Z","lastTransitionTime":"2025-10-08T20:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.910738 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c397b74921593a42fb7626e545778d80c506f0bbce7bc425b75c77a222c770e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:51Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.921252 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zcf2d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a016bee1-2c29-46bb-b3b8-841c4a65e162\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fa5b9befc8fb3a83cb6dd6097014bfe9fd0b905b4bf8fbdcccd4fdfb62ab410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flsl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zcf2d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:51Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.931432 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c9bcf2-9580-4534-8c7e-886bd4aff469\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d8b81cfea1e9e0c9b30427e8b8cb07b463c6ef45afb8379aa006d71bccd82a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bd09b1fcc78173d03292522a284e68e59f374def13fd6830f24a31e1138c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw2kf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:51Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.943486 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bl6pv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ac60c10-afa3-424e-9aa2-060e32f4a40f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91277ee733ac8aee89b1a7716b6dcebf57e7d24e5cab5615d88ac8ff90f6f5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":
\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbw65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e8bd9bc559c9623c06aa2f0324a6679f5d241e881db918904f3e1e97d56a20f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbw65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bl6pv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:51Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.956592 4669 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ml9vv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f90eed21-8bc2-4723-b6be-a672669a36fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bh59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bh59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ml9vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:51Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:51 crc 
kubenswrapper[4669]: I1008 20:45:51.967602 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d080d327-7e4d-41af-aa15-0ce849523815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8127834da98ef46a594a74cbfcc6ef779b8429046327546560b7b37085572c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f61c1793e6c95
085b6964298f29b5f896451784046a6aee1c73bbda234a3bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd3bc937fc2e56c3d332e4d3822a2749d040c57cd94f6e3bcdcfd83c126bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33aff5ef2ae82f810d3b3e66effb80087fa92081419227e4fb66a6aa80468ff7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"W1008 20:44:56.871605 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 20:44:56.872089 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759956296 cert, and key in /tmp/serving-cert-3424828285/serving-signer.crt, /tmp/serving-cert-3424828285/serving-signer.key\\\\nI1008 20:44:57.365674 1 observer_polling.go:159] Starting file observer\\\\nW1008 20:45:02.381062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1008 20:45:02.381192 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 20:45:02.381876 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3424828285/tls.crt::/tmp/serving-cert-3424828285/tls.key\\\\\\\"\\\\nI1008 20:45:02.718633 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 20:45:02.726325 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 20:45:02.726358 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 20:45:02.726380 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 20:45:02.726384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 20:45:02.731456 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 20:45:02.731985 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 20:45:02.731867 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is 
complete\\\\nF1008 20:45:02.733228 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d213380e32b3db218facfef313963d26689d2f0871d2a004a63380454fac8a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"exitCode\\
\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:51Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.978048 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c0e064d976a7c307fd13ec11ae76672cc1225b71a616f171626ee1f9a24531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:51Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:51 crc kubenswrapper[4669]: I1008 20:45:51.994690 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://408dd840918000b1689c3d828a51173deebf8d00fc97450975b35e5149d3cfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://334a09deac921308c4d6053bdcc2bbc096acc8ec24875190efb1c07b22d01c69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c03e0c827468d80fa326ee46ee88ad6adfe4236f4df9843324d2b247d0716087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92bc23ad705dcc8b8524159bc37254ce2306e7b502b914eaac7a6525fdd44f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed9a574189bcc7f84b93c5e821e944b0f94679084a30270d6634c7d19e67c470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2057662a5102fd118da847c6e54ac823bfe6b76443d347d2b60a7c8728d6d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b2057662a5102fd118da847c6e54ac823bfe6b76443d347d2b60a7c8728d6d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T20:45:33Z\\\",\\\"message\\\":\\\"nat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.176:1936: 10.217.4.176:443: 10.217.4.176:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {595f6e90-7cd8-4871-85ab-9519d3c9c3e5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1008 20:45:33.198483 6348 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI1008 20:45:33.198486 6348 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-hw2kf\\\\nF1008 20:45:33.198495 6348 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet val\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gpzdw_openshift-ovn-kubernetes(cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40a
c3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpzdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:51Z is after 2025-08-24T17:21:41Z"
Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.006207 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.006231 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.006239 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.006275 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.006285 4669 setters.go:603] "Node became not ready" node="crc"
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:52Z","lastTransitionTime":"2025-10-08T20:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.109032 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.109069 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.109079 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.109093 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.109103 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:52Z","lastTransitionTime":"2025-10-08T20:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.210910 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.210949 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.210960 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.210975 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.210987 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:52Z","lastTransitionTime":"2025-10-08T20:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.312817 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.312859 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.312871 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.312888 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.312898 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:52Z","lastTransitionTime":"2025-10-08T20:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.330359 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.330398 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.330444 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 08 20:45:52 crc kubenswrapper[4669]: E1008 20:45:52.330465 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 08 20:45:52 crc kubenswrapper[4669]: E1008 20:45:52.330567 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 08 20:45:52 crc kubenswrapper[4669]: E1008 20:45:52.330692 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.421757 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.422082 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.422096 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.422112 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.422125 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:52Z","lastTransitionTime":"2025-10-08T20:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.525000 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.525059 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.525070 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.525095 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.525109 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:52Z","lastTransitionTime":"2025-10-08T20:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.627934 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.627976 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.627988 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.628006 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.628017 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:52Z","lastTransitionTime":"2025-10-08T20:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.730408 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.730465 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.730517 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.730557 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.730571 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:52Z","lastTransitionTime":"2025-10-08T20:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.746338 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-klx9r_2433400c-98f8-490f-a566-00a330a738fe/kube-multus/0.log"
Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.746410 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-klx9r" event={"ID":"2433400c-98f8-490f-a566-00a330a738fe","Type":"ContainerStarted","Data":"2b65ccfb3651377dd7136e083c72c94dbdef0e945e796bf851e7ba8e53aafd12"}
Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.765386 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://408dd840918000b1689c3d828a51173deebf8d00fc97450975b35e5149d3cfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://334a09deac921308c4d6053bdcc2bbc096acc8ec24875190efb1c07b22d01c69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c03e0c827468d80fa326ee46ee88ad6adfe4236f4df9843324d2b247d0716087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92bc23ad705dcc8b8524159bc37254ce2306e7b502b914eaac7a6525fdd44f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed9a574189bcc7f84b93c5e821e944b0f94679084a30270d6634c7d19e67c470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2057662a5102fd118da847c6e54ac823bfe6b76443d347d2b60a7c8728d6d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b2057662a5102fd118da847c6e54ac823bfe6b76443d347d2b60a7c8728d6d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T20:45:33Z\\\",\\\"message\\\":\\\"nat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.176:1936: 10.217.4.176:443: 10.217.4.176:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {595f6e90-7cd8-4871-85ab-9519d3c9c3e5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1008 20:45:33.198483 6348 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI1008 20:45:33.198486 6348 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-hw2kf\\\\nF1008 20:45:33.198495 6348 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet val\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gpzdw_openshift-ovn-kubernetes(cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40a
c3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpzdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:52Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.780984 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d080d327-7e4d-41af-aa15-0ce849523815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8127834da98ef46a594a74cbfcc6ef779b8429046327546560b7b37085572c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f61c1793e6c95085b6964298f29b5f896451784046a6aee1c73bbda234a3bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd3bc937fc2e56c3d332e4d3822a2749d040c57cd94f6e3bcdcfd83c126bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33aff5ef2ae82f810d3b3e66effb80087fa92081419227e4fb66a6aa80468ff7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T20:45:02Z\\\"
,\\\"message\\\":\\\"W1008 20:44:56.871605 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 20:44:56.872089 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759956296 cert, and key in /tmp/serving-cert-3424828285/serving-signer.crt, /tmp/serving-cert-3424828285/serving-signer.key\\\\nI1008 20:44:57.365674 1 observer_polling.go:159] Starting file observer\\\\nW1008 20:45:02.381062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1008 20:45:02.381192 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 20:45:02.381876 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3424828285/tls.crt::/tmp/serving-cert-3424828285/tls.key\\\\\\\"\\\\nI1008 20:45:02.718633 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 20:45:02.726325 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 20:45:02.726358 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 20:45:02.726380 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 20:45:02.726384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 20:45:02.731456 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 20:45:02.731985 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 20:45:02.731867 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 20:45:02.733228 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d213380e32b3db218facfef313963d26689d2f0871d2a004a63380454fac8a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:52Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.795728 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c0e064d976a7c307fd13ec11ae76672cc1225b71a616f171626ee1f9a24531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:52Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.815595 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:52Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.829607 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bfcvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09be2dce24bba0d88a36f2d85e6280e6806f9b6cf59ec3950513e976c97429e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d12e
f6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d12ef6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bfcvh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:52Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.833759 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.833840 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.833857 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.833878 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.833891 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:52Z","lastTransitionTime":"2025-10-08T20:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.841451 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-flswm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"609156f9-39b1-4330-83a2-eabf82f4228f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749545f8f4b6269a70b747fee79dc8d419b62054f507b0d819b63aa68c44bb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-flswm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:52Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.856365 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b822af4b-b157-4b05-9af4-7798315f365f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d615b49ade5de43393d40344c1b71733acedb541841b3ec34d6dd293e62f96c\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07c33fe9c40fb9b53e940940c3fe2b8b63a94b0f867aa804d215cb3ba90d01c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569ba2e70b947eea1e531ab7e8f1ac2e3441ade593dd48910407df766217d87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-sy
ncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f2d8af11793121a84b4559833f410bd59a8bb122d88da0d3b55d7dcbbf57a9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:52Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.868152 4669 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc1b4218-68c0-4c48-a495-f8539e06d444\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://524972a79ac73180ccc655f37054721fb478bf263c711e814c9b49cc4f1a76ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67e4a88a6084b96798f461c3427f491577d74e6da859263a8c59545395cf029a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6
b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3133419a0643dc2be6a13a30c87b23a13965c59841d991db7ec80d5e53ca2840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8abb50fc1491bed6db7f23e79900b0223d3741a87a9a5545c144252a077353b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-
host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8abb50fc1491bed6db7f23e79900b0223d3741a87a9a5545c144252a077353b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:52Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.881054 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:52Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.894888 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c2b44fd8fb3c01bbc8a1b2f5a3507af28b2aa79a3d6ab8e7de3945bbfd01e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a472679f03ab86aa0a31a2ff3affe48d8e289a76db949bcc6ea10446fd08fdd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:52Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.906555 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-klx9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2433400c-98f8-490f-a566-00a330a738fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b65ccfb3651377dd7136e083c72c94dbdef0e945e796bf851e7ba8e53aafd12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://863b0630ebde7534e93ebf2952dab729566760278539e87efa4412389803c5ee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T20:45:51Z\\\",\\\"message\\\":\\\"2025-10-08T20:45:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a3972dbd-537f-4d3b-a2ea-f565277a9edf\\\\n2025-10-08T20:45:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a3972dbd-537f-4d3b-a2ea-f565277a9edf to /host/opt/cni/bin/\\\\n2025-10-08T20:45:06Z [verbose] multus-daemon started\\\\n2025-10-08T20:45:06Z [verbose] 
Readiness Indicator file check\\\\n2025-10-08T20:45:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-klx9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:52Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.926795 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0319f7-8ee3-4392-a36a-419161391db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7f52f3d22574d0a01cdfd7b7a40caf1a6cf201dc719e35f40eae85a071286f0\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3064f5dde5317ed6c1dba4ecdcf4da81c2451262d83e3e2826c6ebbfe1487ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13697e6470d481451982948653db44d08baa70466d010442534eaa249e58bae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a
0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c88256a72c667695563af6b37d01d958621c1ca6cbdaf474364bd6c8128c4409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6efb0bccc51deff2303655e7a8d3a6261a8b3c9425f6d94120cd1acf27fd7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resour
ce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}}},{\\\"containerID\\\":
\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:52Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.936206 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.936265 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 
20:45:52.936277 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.936315 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.936330 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:52Z","lastTransitionTime":"2025-10-08T20:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.938933 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zcf2d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a016bee1-2c29-46bb-b3b8-841c4a65e162\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fa5b9befc8fb3a83c
b6dd6097014bfe9fd0b905b4bf8fbdcccd4fdfb62ab410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flsl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zcf2d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:52Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.949482 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c9bcf2-9580-4534-8c7e-886bd4aff469\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d8b81cfea1e9e0c9b30427e8b8cb07b463c6ef45afb8379aa006d71bccd82a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bd09b1fcc78173d03292522a284e68e59f374
def13fd6830f24a31e1138c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw2kf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:52Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.961052 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bl6pv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ac60c10-afa3-424e-9aa2-060e32f4a40f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91277ee733ac8aee89b1a7716b6dcebf57e7d24e5cab5615d88ac8ff90f6f5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbw65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e8bd9bc559c9623c06aa2f0324a6679f5d24
1e881db918904f3e1e97d56a20f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbw65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bl6pv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:52Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.971264 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ml9vv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f90eed21-8bc2-4723-b6be-a672669a36fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bh59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bh59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ml9vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:52Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:52 crc 
kubenswrapper[4669]: I1008 20:45:52.983484 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:45:52Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:52 crc kubenswrapper[4669]: I1008 20:45:52.994892 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c397b74921593a42fb7626e545778d80c506f0bbce7bc425b75c77a222c770e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T20:45:52Z is after 2025-08-24T17:21:41Z" Oct 08 20:45:53 crc kubenswrapper[4669]: I1008 20:45:53.038889 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:53 crc kubenswrapper[4669]: I1008 20:45:53.038928 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:53 crc kubenswrapper[4669]: I1008 20:45:53.038936 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:53 crc kubenswrapper[4669]: I1008 20:45:53.038950 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:53 crc kubenswrapper[4669]: I1008 20:45:53.038960 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:53Z","lastTransitionTime":"2025-10-08T20:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:53 crc kubenswrapper[4669]: I1008 20:45:53.141001 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:53 crc kubenswrapper[4669]: I1008 20:45:53.141060 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:53 crc kubenswrapper[4669]: I1008 20:45:53.141069 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:53 crc kubenswrapper[4669]: I1008 20:45:53.141083 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:53 crc kubenswrapper[4669]: I1008 20:45:53.141092 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:53Z","lastTransitionTime":"2025-10-08T20:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:53 crc kubenswrapper[4669]: I1008 20:45:53.243488 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:53 crc kubenswrapper[4669]: I1008 20:45:53.243563 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:53 crc kubenswrapper[4669]: I1008 20:45:53.243576 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:53 crc kubenswrapper[4669]: I1008 20:45:53.243597 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:53 crc kubenswrapper[4669]: I1008 20:45:53.243609 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:53Z","lastTransitionTime":"2025-10-08T20:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:53 crc kubenswrapper[4669]: I1008 20:45:53.330871 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:45:53 crc kubenswrapper[4669]: E1008 20:45:53.331116 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ml9vv" podUID="f90eed21-8bc2-4723-b6be-a672669a36fb" Oct 08 20:45:53 crc kubenswrapper[4669]: I1008 20:45:53.345562 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:53 crc kubenswrapper[4669]: I1008 20:45:53.345608 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:53 crc kubenswrapper[4669]: I1008 20:45:53.345621 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:53 crc kubenswrapper[4669]: I1008 20:45:53.345635 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:53 crc kubenswrapper[4669]: I1008 20:45:53.345646 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:53Z","lastTransitionTime":"2025-10-08T20:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:53 crc kubenswrapper[4669]: I1008 20:45:53.448490 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:53 crc kubenswrapper[4669]: I1008 20:45:53.448568 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:53 crc kubenswrapper[4669]: I1008 20:45:53.448579 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:53 crc kubenswrapper[4669]: I1008 20:45:53.448596 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:53 crc kubenswrapper[4669]: I1008 20:45:53.448612 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:53Z","lastTransitionTime":"2025-10-08T20:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:53 crc kubenswrapper[4669]: I1008 20:45:53.551103 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:53 crc kubenswrapper[4669]: I1008 20:45:53.551131 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:53 crc kubenswrapper[4669]: I1008 20:45:53.551139 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:53 crc kubenswrapper[4669]: I1008 20:45:53.551152 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:53 crc kubenswrapper[4669]: I1008 20:45:53.551161 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:53Z","lastTransitionTime":"2025-10-08T20:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:53 crc kubenswrapper[4669]: I1008 20:45:53.653967 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:53 crc kubenswrapper[4669]: I1008 20:45:53.654014 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:53 crc kubenswrapper[4669]: I1008 20:45:53.654027 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:53 crc kubenswrapper[4669]: I1008 20:45:53.654046 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:53 crc kubenswrapper[4669]: I1008 20:45:53.654059 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:53Z","lastTransitionTime":"2025-10-08T20:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:53 crc kubenswrapper[4669]: I1008 20:45:53.755792 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:53 crc kubenswrapper[4669]: I1008 20:45:53.755860 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:53 crc kubenswrapper[4669]: I1008 20:45:53.755877 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:53 crc kubenswrapper[4669]: I1008 20:45:53.755900 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:53 crc kubenswrapper[4669]: I1008 20:45:53.755918 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:53Z","lastTransitionTime":"2025-10-08T20:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:53 crc kubenswrapper[4669]: I1008 20:45:53.858488 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:53 crc kubenswrapper[4669]: I1008 20:45:53.858566 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:53 crc kubenswrapper[4669]: I1008 20:45:53.858578 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:53 crc kubenswrapper[4669]: I1008 20:45:53.858595 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:53 crc kubenswrapper[4669]: I1008 20:45:53.858609 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:53Z","lastTransitionTime":"2025-10-08T20:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:53 crc kubenswrapper[4669]: I1008 20:45:53.961552 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:53 crc kubenswrapper[4669]: I1008 20:45:53.961593 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:53 crc kubenswrapper[4669]: I1008 20:45:53.961603 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:53 crc kubenswrapper[4669]: I1008 20:45:53.961620 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:53 crc kubenswrapper[4669]: I1008 20:45:53.961632 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:53Z","lastTransitionTime":"2025-10-08T20:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:54 crc kubenswrapper[4669]: I1008 20:45:54.064835 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:54 crc kubenswrapper[4669]: I1008 20:45:54.064877 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:54 crc kubenswrapper[4669]: I1008 20:45:54.064885 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:54 crc kubenswrapper[4669]: I1008 20:45:54.064899 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:54 crc kubenswrapper[4669]: I1008 20:45:54.064909 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:54Z","lastTransitionTime":"2025-10-08T20:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:54 crc kubenswrapper[4669]: I1008 20:45:54.167257 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:54 crc kubenswrapper[4669]: I1008 20:45:54.167288 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:54 crc kubenswrapper[4669]: I1008 20:45:54.167299 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:54 crc kubenswrapper[4669]: I1008 20:45:54.167313 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:54 crc kubenswrapper[4669]: I1008 20:45:54.167324 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:54Z","lastTransitionTime":"2025-10-08T20:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:54 crc kubenswrapper[4669]: I1008 20:45:54.270177 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:54 crc kubenswrapper[4669]: I1008 20:45:54.270227 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:54 crc kubenswrapper[4669]: I1008 20:45:54.270236 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:54 crc kubenswrapper[4669]: I1008 20:45:54.270251 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:54 crc kubenswrapper[4669]: I1008 20:45:54.270260 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:54Z","lastTransitionTime":"2025-10-08T20:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:54 crc kubenswrapper[4669]: I1008 20:45:54.330009 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:45:54 crc kubenswrapper[4669]: I1008 20:45:54.330058 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:45:54 crc kubenswrapper[4669]: I1008 20:45:54.330125 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:45:54 crc kubenswrapper[4669]: E1008 20:45:54.330127 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:45:54 crc kubenswrapper[4669]: E1008 20:45:54.330337 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:45:54 crc kubenswrapper[4669]: E1008 20:45:54.330433 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:45:54 crc kubenswrapper[4669]: I1008 20:45:54.372408 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:54 crc kubenswrapper[4669]: I1008 20:45:54.372441 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:54 crc kubenswrapper[4669]: I1008 20:45:54.372450 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:54 crc kubenswrapper[4669]: I1008 20:45:54.372464 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:54 crc kubenswrapper[4669]: I1008 20:45:54.372475 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:54Z","lastTransitionTime":"2025-10-08T20:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:54 crc kubenswrapper[4669]: I1008 20:45:54.474680 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:54 crc kubenswrapper[4669]: I1008 20:45:54.474704 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:54 crc kubenswrapper[4669]: I1008 20:45:54.474713 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:54 crc kubenswrapper[4669]: I1008 20:45:54.474726 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:54 crc kubenswrapper[4669]: I1008 20:45:54.474735 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:54Z","lastTransitionTime":"2025-10-08T20:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:54 crc kubenswrapper[4669]: I1008 20:45:54.576823 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:54 crc kubenswrapper[4669]: I1008 20:45:54.576864 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:54 crc kubenswrapper[4669]: I1008 20:45:54.576875 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:54 crc kubenswrapper[4669]: I1008 20:45:54.576891 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:54 crc kubenswrapper[4669]: I1008 20:45:54.576903 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:54Z","lastTransitionTime":"2025-10-08T20:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:54 crc kubenswrapper[4669]: I1008 20:45:54.679571 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:54 crc kubenswrapper[4669]: I1008 20:45:54.679602 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:54 crc kubenswrapper[4669]: I1008 20:45:54.679613 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:54 crc kubenswrapper[4669]: I1008 20:45:54.679629 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:54 crc kubenswrapper[4669]: I1008 20:45:54.679641 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:54Z","lastTransitionTime":"2025-10-08T20:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:54 crc kubenswrapper[4669]: I1008 20:45:54.782003 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:54 crc kubenswrapper[4669]: I1008 20:45:54.782240 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:54 crc kubenswrapper[4669]: I1008 20:45:54.782405 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:54 crc kubenswrapper[4669]: I1008 20:45:54.782632 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:54 crc kubenswrapper[4669]: I1008 20:45:54.782800 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:54Z","lastTransitionTime":"2025-10-08T20:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:54 crc kubenswrapper[4669]: I1008 20:45:54.885550 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:54 crc kubenswrapper[4669]: I1008 20:45:54.885604 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:54 crc kubenswrapper[4669]: I1008 20:45:54.885616 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:54 crc kubenswrapper[4669]: I1008 20:45:54.885633 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:54 crc kubenswrapper[4669]: I1008 20:45:54.885646 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:54Z","lastTransitionTime":"2025-10-08T20:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:54 crc kubenswrapper[4669]: I1008 20:45:54.988710 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:54 crc kubenswrapper[4669]: I1008 20:45:54.988765 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:54 crc kubenswrapper[4669]: I1008 20:45:54.988775 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:54 crc kubenswrapper[4669]: I1008 20:45:54.988790 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:54 crc kubenswrapper[4669]: I1008 20:45:54.988801 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:54Z","lastTransitionTime":"2025-10-08T20:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:55 crc kubenswrapper[4669]: I1008 20:45:55.091895 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:55 crc kubenswrapper[4669]: I1008 20:45:55.091935 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:55 crc kubenswrapper[4669]: I1008 20:45:55.091944 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:55 crc kubenswrapper[4669]: I1008 20:45:55.091960 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:55 crc kubenswrapper[4669]: I1008 20:45:55.091969 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:55Z","lastTransitionTime":"2025-10-08T20:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:55 crc kubenswrapper[4669]: I1008 20:45:55.195488 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:55 crc kubenswrapper[4669]: I1008 20:45:55.195569 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:55 crc kubenswrapper[4669]: I1008 20:45:55.195586 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:55 crc kubenswrapper[4669]: I1008 20:45:55.195609 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:55 crc kubenswrapper[4669]: I1008 20:45:55.195627 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:55Z","lastTransitionTime":"2025-10-08T20:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:55 crc kubenswrapper[4669]: I1008 20:45:55.298328 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:55 crc kubenswrapper[4669]: I1008 20:45:55.298377 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:55 crc kubenswrapper[4669]: I1008 20:45:55.298390 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:55 crc kubenswrapper[4669]: I1008 20:45:55.298409 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:55 crc kubenswrapper[4669]: I1008 20:45:55.298422 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:55Z","lastTransitionTime":"2025-10-08T20:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:55 crc kubenswrapper[4669]: I1008 20:45:55.330095 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:45:55 crc kubenswrapper[4669]: E1008 20:45:55.330349 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ml9vv" podUID="f90eed21-8bc2-4723-b6be-a672669a36fb" Oct 08 20:45:55 crc kubenswrapper[4669]: I1008 20:45:55.401287 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:55 crc kubenswrapper[4669]: I1008 20:45:55.401336 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:55 crc kubenswrapper[4669]: I1008 20:45:55.401351 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:55 crc kubenswrapper[4669]: I1008 20:45:55.401371 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:55 crc kubenswrapper[4669]: I1008 20:45:55.401385 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:55Z","lastTransitionTime":"2025-10-08T20:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:55 crc kubenswrapper[4669]: I1008 20:45:55.504018 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:55 crc kubenswrapper[4669]: I1008 20:45:55.504067 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:55 crc kubenswrapper[4669]: I1008 20:45:55.504078 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:55 crc kubenswrapper[4669]: I1008 20:45:55.504097 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:55 crc kubenswrapper[4669]: I1008 20:45:55.504107 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:55Z","lastTransitionTime":"2025-10-08T20:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:55 crc kubenswrapper[4669]: I1008 20:45:55.606798 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:55 crc kubenswrapper[4669]: I1008 20:45:55.606841 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:55 crc kubenswrapper[4669]: I1008 20:45:55.606853 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:55 crc kubenswrapper[4669]: I1008 20:45:55.606869 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:55 crc kubenswrapper[4669]: I1008 20:45:55.606881 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:55Z","lastTransitionTime":"2025-10-08T20:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:55 crc kubenswrapper[4669]: I1008 20:45:55.710199 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:55 crc kubenswrapper[4669]: I1008 20:45:55.710245 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:55 crc kubenswrapper[4669]: I1008 20:45:55.710261 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:55 crc kubenswrapper[4669]: I1008 20:45:55.710285 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:55 crc kubenswrapper[4669]: I1008 20:45:55.710300 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:55Z","lastTransitionTime":"2025-10-08T20:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:55 crc kubenswrapper[4669]: I1008 20:45:55.812785 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:55 crc kubenswrapper[4669]: I1008 20:45:55.812831 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:55 crc kubenswrapper[4669]: I1008 20:45:55.812840 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:55 crc kubenswrapper[4669]: I1008 20:45:55.812860 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:55 crc kubenswrapper[4669]: I1008 20:45:55.812872 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:55Z","lastTransitionTime":"2025-10-08T20:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:55 crc kubenswrapper[4669]: I1008 20:45:55.915758 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:55 crc kubenswrapper[4669]: I1008 20:45:55.915805 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:55 crc kubenswrapper[4669]: I1008 20:45:55.915817 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:55 crc kubenswrapper[4669]: I1008 20:45:55.915835 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:55 crc kubenswrapper[4669]: I1008 20:45:55.915848 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:55Z","lastTransitionTime":"2025-10-08T20:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:56 crc kubenswrapper[4669]: I1008 20:45:56.018232 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:56 crc kubenswrapper[4669]: I1008 20:45:56.018265 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:56 crc kubenswrapper[4669]: I1008 20:45:56.018273 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:56 crc kubenswrapper[4669]: I1008 20:45:56.018286 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:56 crc kubenswrapper[4669]: I1008 20:45:56.018295 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:56Z","lastTransitionTime":"2025-10-08T20:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:56 crc kubenswrapper[4669]: I1008 20:45:56.120198 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:56 crc kubenswrapper[4669]: I1008 20:45:56.120233 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:56 crc kubenswrapper[4669]: I1008 20:45:56.120249 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:56 crc kubenswrapper[4669]: I1008 20:45:56.120264 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:56 crc kubenswrapper[4669]: I1008 20:45:56.120275 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:56Z","lastTransitionTime":"2025-10-08T20:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:56 crc kubenswrapper[4669]: I1008 20:45:56.223684 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:56 crc kubenswrapper[4669]: I1008 20:45:56.223721 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:56 crc kubenswrapper[4669]: I1008 20:45:56.223732 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:56 crc kubenswrapper[4669]: I1008 20:45:56.223747 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:56 crc kubenswrapper[4669]: I1008 20:45:56.223759 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:56Z","lastTransitionTime":"2025-10-08T20:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:56 crc kubenswrapper[4669]: I1008 20:45:56.326142 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:56 crc kubenswrapper[4669]: I1008 20:45:56.326194 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:56 crc kubenswrapper[4669]: I1008 20:45:56.326205 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:56 crc kubenswrapper[4669]: I1008 20:45:56.326222 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:56 crc kubenswrapper[4669]: I1008 20:45:56.326235 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:56Z","lastTransitionTime":"2025-10-08T20:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:56 crc kubenswrapper[4669]: I1008 20:45:56.330407 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:45:56 crc kubenswrapper[4669]: E1008 20:45:56.330568 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:45:56 crc kubenswrapper[4669]: I1008 20:45:56.330678 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:45:56 crc kubenswrapper[4669]: I1008 20:45:56.330732 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:45:56 crc kubenswrapper[4669]: E1008 20:45:56.331038 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:45:56 crc kubenswrapper[4669]: E1008 20:45:56.331160 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:45:56 crc kubenswrapper[4669]: I1008 20:45:56.428993 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:56 crc kubenswrapper[4669]: I1008 20:45:56.429066 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:56 crc kubenswrapper[4669]: I1008 20:45:56.429090 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:56 crc kubenswrapper[4669]: I1008 20:45:56.429120 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:56 crc kubenswrapper[4669]: I1008 20:45:56.429141 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:56Z","lastTransitionTime":"2025-10-08T20:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:56 crc kubenswrapper[4669]: I1008 20:45:56.532885 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:56 crc kubenswrapper[4669]: I1008 20:45:56.532945 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:56 crc kubenswrapper[4669]: I1008 20:45:56.532957 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:56 crc kubenswrapper[4669]: I1008 20:45:56.532974 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:56 crc kubenswrapper[4669]: I1008 20:45:56.532983 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:56Z","lastTransitionTime":"2025-10-08T20:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:56 crc kubenswrapper[4669]: I1008 20:45:56.635623 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:56 crc kubenswrapper[4669]: I1008 20:45:56.635667 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:56 crc kubenswrapper[4669]: I1008 20:45:56.635685 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:56 crc kubenswrapper[4669]: I1008 20:45:56.635706 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:56 crc kubenswrapper[4669]: I1008 20:45:56.635722 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:56Z","lastTransitionTime":"2025-10-08T20:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:56 crc kubenswrapper[4669]: I1008 20:45:56.738066 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:56 crc kubenswrapper[4669]: I1008 20:45:56.738099 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:56 crc kubenswrapper[4669]: I1008 20:45:56.738107 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:56 crc kubenswrapper[4669]: I1008 20:45:56.738119 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:56 crc kubenswrapper[4669]: I1008 20:45:56.738129 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:56Z","lastTransitionTime":"2025-10-08T20:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:56 crc kubenswrapper[4669]: I1008 20:45:56.841098 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:56 crc kubenswrapper[4669]: I1008 20:45:56.841145 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:56 crc kubenswrapper[4669]: I1008 20:45:56.841155 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:56 crc kubenswrapper[4669]: I1008 20:45:56.841171 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:56 crc kubenswrapper[4669]: I1008 20:45:56.841183 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:56Z","lastTransitionTime":"2025-10-08T20:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:56 crc kubenswrapper[4669]: I1008 20:45:56.943904 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:56 crc kubenswrapper[4669]: I1008 20:45:56.943973 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:56 crc kubenswrapper[4669]: I1008 20:45:56.943989 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:56 crc kubenswrapper[4669]: I1008 20:45:56.944012 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:56 crc kubenswrapper[4669]: I1008 20:45:56.944024 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:56Z","lastTransitionTime":"2025-10-08T20:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:57 crc kubenswrapper[4669]: I1008 20:45:57.046068 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:57 crc kubenswrapper[4669]: I1008 20:45:57.046102 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:57 crc kubenswrapper[4669]: I1008 20:45:57.046110 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:57 crc kubenswrapper[4669]: I1008 20:45:57.046122 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:57 crc kubenswrapper[4669]: I1008 20:45:57.046131 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:57Z","lastTransitionTime":"2025-10-08T20:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:57 crc kubenswrapper[4669]: I1008 20:45:57.149978 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:57 crc kubenswrapper[4669]: I1008 20:45:57.150042 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:57 crc kubenswrapper[4669]: I1008 20:45:57.150067 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:57 crc kubenswrapper[4669]: I1008 20:45:57.150096 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:57 crc kubenswrapper[4669]: I1008 20:45:57.150120 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:57Z","lastTransitionTime":"2025-10-08T20:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:57 crc kubenswrapper[4669]: I1008 20:45:57.253792 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:57 crc kubenswrapper[4669]: I1008 20:45:57.253856 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:57 crc kubenswrapper[4669]: I1008 20:45:57.253875 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:57 crc kubenswrapper[4669]: I1008 20:45:57.253898 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:57 crc kubenswrapper[4669]: I1008 20:45:57.253915 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:57Z","lastTransitionTime":"2025-10-08T20:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:57 crc kubenswrapper[4669]: I1008 20:45:57.331031 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:45:57 crc kubenswrapper[4669]: E1008 20:45:57.331295 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ml9vv" podUID="f90eed21-8bc2-4723-b6be-a672669a36fb" Oct 08 20:45:57 crc kubenswrapper[4669]: I1008 20:45:57.355686 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:57 crc kubenswrapper[4669]: I1008 20:45:57.355739 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:57 crc kubenswrapper[4669]: I1008 20:45:57.355751 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:57 crc kubenswrapper[4669]: I1008 20:45:57.355768 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:57 crc kubenswrapper[4669]: I1008 20:45:57.355782 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:57Z","lastTransitionTime":"2025-10-08T20:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:57 crc kubenswrapper[4669]: I1008 20:45:57.458522 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:57 crc kubenswrapper[4669]: I1008 20:45:57.458612 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:57 crc kubenswrapper[4669]: I1008 20:45:57.458630 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:57 crc kubenswrapper[4669]: I1008 20:45:57.458655 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:57 crc kubenswrapper[4669]: I1008 20:45:57.458673 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:57Z","lastTransitionTime":"2025-10-08T20:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:57 crc kubenswrapper[4669]: I1008 20:45:57.561677 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:57 crc kubenswrapper[4669]: I1008 20:45:57.561714 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:57 crc kubenswrapper[4669]: I1008 20:45:57.561722 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:57 crc kubenswrapper[4669]: I1008 20:45:57.561735 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:57 crc kubenswrapper[4669]: I1008 20:45:57.561743 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:57Z","lastTransitionTime":"2025-10-08T20:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:57 crc kubenswrapper[4669]: I1008 20:45:57.664006 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:57 crc kubenswrapper[4669]: I1008 20:45:57.664055 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:57 crc kubenswrapper[4669]: I1008 20:45:57.664067 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:57 crc kubenswrapper[4669]: I1008 20:45:57.664085 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:57 crc kubenswrapper[4669]: I1008 20:45:57.664099 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:57Z","lastTransitionTime":"2025-10-08T20:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:57 crc kubenswrapper[4669]: I1008 20:45:57.766414 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:57 crc kubenswrapper[4669]: I1008 20:45:57.766448 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:57 crc kubenswrapper[4669]: I1008 20:45:57.766456 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:57 crc kubenswrapper[4669]: I1008 20:45:57.766470 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:57 crc kubenswrapper[4669]: I1008 20:45:57.766479 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:57Z","lastTransitionTime":"2025-10-08T20:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:57 crc kubenswrapper[4669]: I1008 20:45:57.868983 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:57 crc kubenswrapper[4669]: I1008 20:45:57.869058 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:57 crc kubenswrapper[4669]: I1008 20:45:57.869080 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:57 crc kubenswrapper[4669]: I1008 20:45:57.869103 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:57 crc kubenswrapper[4669]: I1008 20:45:57.869120 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:57Z","lastTransitionTime":"2025-10-08T20:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:57 crc kubenswrapper[4669]: I1008 20:45:57.971951 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:57 crc kubenswrapper[4669]: I1008 20:45:57.971997 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:57 crc kubenswrapper[4669]: I1008 20:45:57.972012 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:57 crc kubenswrapper[4669]: I1008 20:45:57.972033 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:57 crc kubenswrapper[4669]: I1008 20:45:57.972049 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:57Z","lastTransitionTime":"2025-10-08T20:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:58 crc kubenswrapper[4669]: I1008 20:45:58.074874 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:58 crc kubenswrapper[4669]: I1008 20:45:58.074926 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:58 crc kubenswrapper[4669]: I1008 20:45:58.074942 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:58 crc kubenswrapper[4669]: I1008 20:45:58.074962 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:58 crc kubenswrapper[4669]: I1008 20:45:58.074978 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:58Z","lastTransitionTime":"2025-10-08T20:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:58 crc kubenswrapper[4669]: I1008 20:45:58.177426 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:58 crc kubenswrapper[4669]: I1008 20:45:58.177467 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:58 crc kubenswrapper[4669]: I1008 20:45:58.177479 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:58 crc kubenswrapper[4669]: I1008 20:45:58.177494 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:58 crc kubenswrapper[4669]: I1008 20:45:58.177507 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:58Z","lastTransitionTime":"2025-10-08T20:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:58 crc kubenswrapper[4669]: I1008 20:45:58.279943 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:58 crc kubenswrapper[4669]: I1008 20:45:58.280022 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:58 crc kubenswrapper[4669]: I1008 20:45:58.280045 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:58 crc kubenswrapper[4669]: I1008 20:45:58.280076 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:58 crc kubenswrapper[4669]: I1008 20:45:58.280094 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:58Z","lastTransitionTime":"2025-10-08T20:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:58 crc kubenswrapper[4669]: I1008 20:45:58.330092 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:45:58 crc kubenswrapper[4669]: I1008 20:45:58.330181 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:45:58 crc kubenswrapper[4669]: E1008 20:45:58.330216 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:45:58 crc kubenswrapper[4669]: I1008 20:45:58.330194 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:45:58 crc kubenswrapper[4669]: E1008 20:45:58.330315 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:45:58 crc kubenswrapper[4669]: E1008 20:45:58.330480 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:45:58 crc kubenswrapper[4669]: I1008 20:45:58.382436 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:58 crc kubenswrapper[4669]: I1008 20:45:58.382483 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:58 crc kubenswrapper[4669]: I1008 20:45:58.382492 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:58 crc kubenswrapper[4669]: I1008 20:45:58.382507 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:58 crc kubenswrapper[4669]: I1008 20:45:58.382545 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:58Z","lastTransitionTime":"2025-10-08T20:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:58 crc kubenswrapper[4669]: I1008 20:45:58.484519 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:58 crc kubenswrapper[4669]: I1008 20:45:58.484571 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:58 crc kubenswrapper[4669]: I1008 20:45:58.484582 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:58 crc kubenswrapper[4669]: I1008 20:45:58.484595 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:58 crc kubenswrapper[4669]: I1008 20:45:58.484607 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:58Z","lastTransitionTime":"2025-10-08T20:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:58 crc kubenswrapper[4669]: I1008 20:45:58.587950 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:58 crc kubenswrapper[4669]: I1008 20:45:58.588015 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:58 crc kubenswrapper[4669]: I1008 20:45:58.588038 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:58 crc kubenswrapper[4669]: I1008 20:45:58.588067 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:58 crc kubenswrapper[4669]: I1008 20:45:58.588090 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:58Z","lastTransitionTime":"2025-10-08T20:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:58 crc kubenswrapper[4669]: I1008 20:45:58.691259 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:58 crc kubenswrapper[4669]: I1008 20:45:58.691294 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:58 crc kubenswrapper[4669]: I1008 20:45:58.691302 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:58 crc kubenswrapper[4669]: I1008 20:45:58.691316 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:58 crc kubenswrapper[4669]: I1008 20:45:58.691325 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:58Z","lastTransitionTime":"2025-10-08T20:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:58 crc kubenswrapper[4669]: I1008 20:45:58.793860 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:58 crc kubenswrapper[4669]: I1008 20:45:58.793920 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:58 crc kubenswrapper[4669]: I1008 20:45:58.793940 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:58 crc kubenswrapper[4669]: I1008 20:45:58.793965 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:58 crc kubenswrapper[4669]: I1008 20:45:58.793986 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:58Z","lastTransitionTime":"2025-10-08T20:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:58 crc kubenswrapper[4669]: I1008 20:45:58.897168 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:58 crc kubenswrapper[4669]: I1008 20:45:58.897231 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:58 crc kubenswrapper[4669]: I1008 20:45:58.897248 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:58 crc kubenswrapper[4669]: I1008 20:45:58.897270 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:58 crc kubenswrapper[4669]: I1008 20:45:58.897322 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:58Z","lastTransitionTime":"2025-10-08T20:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:59 crc kubenswrapper[4669]: I1008 20:45:59.000397 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:59 crc kubenswrapper[4669]: I1008 20:45:59.000454 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:59 crc kubenswrapper[4669]: I1008 20:45:59.000470 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:59 crc kubenswrapper[4669]: I1008 20:45:59.000500 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:59 crc kubenswrapper[4669]: I1008 20:45:59.000516 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:59Z","lastTransitionTime":"2025-10-08T20:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:59 crc kubenswrapper[4669]: I1008 20:45:59.103241 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:59 crc kubenswrapper[4669]: I1008 20:45:59.103310 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:59 crc kubenswrapper[4669]: I1008 20:45:59.103333 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:59 crc kubenswrapper[4669]: I1008 20:45:59.103361 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:59 crc kubenswrapper[4669]: I1008 20:45:59.103382 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:59Z","lastTransitionTime":"2025-10-08T20:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:59 crc kubenswrapper[4669]: I1008 20:45:59.206421 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:59 crc kubenswrapper[4669]: I1008 20:45:59.206475 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:59 crc kubenswrapper[4669]: I1008 20:45:59.206493 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:59 crc kubenswrapper[4669]: I1008 20:45:59.206518 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:59 crc kubenswrapper[4669]: I1008 20:45:59.206595 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:59Z","lastTransitionTime":"2025-10-08T20:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:59 crc kubenswrapper[4669]: I1008 20:45:59.310091 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:59 crc kubenswrapper[4669]: I1008 20:45:59.310421 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:59 crc kubenswrapper[4669]: I1008 20:45:59.310650 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:59 crc kubenswrapper[4669]: I1008 20:45:59.310821 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:59 crc kubenswrapper[4669]: I1008 20:45:59.310973 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:59Z","lastTransitionTime":"2025-10-08T20:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:45:59 crc kubenswrapper[4669]: I1008 20:45:59.330803 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:45:59 crc kubenswrapper[4669]: E1008 20:45:59.331159 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ml9vv" podUID="f90eed21-8bc2-4723-b6be-a672669a36fb" Oct 08 20:45:59 crc kubenswrapper[4669]: I1008 20:45:59.414162 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:59 crc kubenswrapper[4669]: I1008 20:45:59.414200 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:59 crc kubenswrapper[4669]: I1008 20:45:59.414210 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:59 crc kubenswrapper[4669]: I1008 20:45:59.414223 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:59 crc kubenswrapper[4669]: I1008 20:45:59.414234 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:59Z","lastTransitionTime":"2025-10-08T20:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:59 crc kubenswrapper[4669]: I1008 20:45:59.517588 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:59 crc kubenswrapper[4669]: I1008 20:45:59.517681 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:59 crc kubenswrapper[4669]: I1008 20:45:59.517712 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:59 crc kubenswrapper[4669]: I1008 20:45:59.517741 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:59 crc kubenswrapper[4669]: I1008 20:45:59.517884 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:59Z","lastTransitionTime":"2025-10-08T20:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:59 crc kubenswrapper[4669]: I1008 20:45:59.621497 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:59 crc kubenswrapper[4669]: I1008 20:45:59.621636 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:59 crc kubenswrapper[4669]: I1008 20:45:59.621697 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:59 crc kubenswrapper[4669]: I1008 20:45:59.621733 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:59 crc kubenswrapper[4669]: I1008 20:45:59.621758 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:59Z","lastTransitionTime":"2025-10-08T20:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:59 crc kubenswrapper[4669]: I1008 20:45:59.725470 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:59 crc kubenswrapper[4669]: I1008 20:45:59.725598 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:59 crc kubenswrapper[4669]: I1008 20:45:59.725948 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:59 crc kubenswrapper[4669]: I1008 20:45:59.726293 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:59 crc kubenswrapper[4669]: I1008 20:45:59.726361 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:59Z","lastTransitionTime":"2025-10-08T20:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:59 crc kubenswrapper[4669]: I1008 20:45:59.829369 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:59 crc kubenswrapper[4669]: I1008 20:45:59.829415 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:59 crc kubenswrapper[4669]: I1008 20:45:59.829425 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:59 crc kubenswrapper[4669]: I1008 20:45:59.829443 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:59 crc kubenswrapper[4669]: I1008 20:45:59.829455 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:59Z","lastTransitionTime":"2025-10-08T20:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:45:59 crc kubenswrapper[4669]: I1008 20:45:59.931903 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:45:59 crc kubenswrapper[4669]: I1008 20:45:59.931956 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:45:59 crc kubenswrapper[4669]: I1008 20:45:59.931969 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:45:59 crc kubenswrapper[4669]: I1008 20:45:59.931987 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:45:59 crc kubenswrapper[4669]: I1008 20:45:59.931999 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:45:59Z","lastTransitionTime":"2025-10-08T20:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.034991 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.035050 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.035068 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.035091 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.035121 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:00Z","lastTransitionTime":"2025-10-08T20:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.138457 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.138576 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.138614 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.138646 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.138667 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:00Z","lastTransitionTime":"2025-10-08T20:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.242414 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.242520 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.242577 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.242607 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.242631 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:00Z","lastTransitionTime":"2025-10-08T20:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.278198 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.278249 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.278262 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.278279 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.278292 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:00Z","lastTransitionTime":"2025-10-08T20:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:00 crc kubenswrapper[4669]: E1008 20:46:00.294481 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:46:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:46:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:46:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:46:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:46:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:46:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:46:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:46:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf950064-edbb-4bec-8a75-ab8d963fcdb3\\\",\\\"systemUUID\\\":\\\"527fa759-e25f-4fb3-8304-f30dbff0c847\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:00Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.299316 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.299359 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.299374 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.299391 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.299405 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:00Z","lastTransitionTime":"2025-10-08T20:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:00 crc kubenswrapper[4669]: E1008 20:46:00.314433 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:46:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:46:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:46:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:46:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:46:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:46:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:46:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:46:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf950064-edbb-4bec-8a75-ab8d963fcdb3\\\",\\\"systemUUID\\\":\\\"527fa759-e25f-4fb3-8304-f30dbff0c847\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:00Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.318561 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.318600 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.318637 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.318657 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.318669 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:00Z","lastTransitionTime":"2025-10-08T20:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.330439 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.330439 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:46:00 crc kubenswrapper[4669]: E1008 20:46:00.330594 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:46:00 crc kubenswrapper[4669]: E1008 20:46:00.330690 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.330453 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:46:00 crc kubenswrapper[4669]: E1008 20:46:00.330773 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:46:00 crc kubenswrapper[4669]: E1008 20:46:00.333201 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:46:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:46:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:46:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:46:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:46:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:46:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:46:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:46:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf950064-edbb-4bec-8a75-ab8d963fcdb3\\\",\\\"systemUUID\\\":\\\"527fa759-e25f-4fb3-8304-f30dbff0c847\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:00Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.337929 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.337982 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.338001 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.338023 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.338074 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:00Z","lastTransitionTime":"2025-10-08T20:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:00 crc kubenswrapper[4669]: E1008 20:46:00.352665 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:46:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:46:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:46:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:46:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:46:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:46:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:46:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:46:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf950064-edbb-4bec-8a75-ab8d963fcdb3\\\",\\\"systemUUID\\\":\\\"527fa759-e25f-4fb3-8304-f30dbff0c847\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:00Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.356904 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.356938 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.356949 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.356966 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.356979 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:00Z","lastTransitionTime":"2025-10-08T20:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:00 crc kubenswrapper[4669]: E1008 20:46:00.371762 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:46:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:46:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:46:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:46:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:46:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:46:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T20:46:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T20:46:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf950064-edbb-4bec-8a75-ab8d963fcdb3\\\",\\\"systemUUID\\\":\\\"527fa759-e25f-4fb3-8304-f30dbff0c847\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:00Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:00 crc kubenswrapper[4669]: E1008 20:46:00.371929 4669 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.374099 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.374299 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.374314 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.374338 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.374357 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:00Z","lastTransitionTime":"2025-10-08T20:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.477349 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.477424 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.477448 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.477477 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.477501 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:00Z","lastTransitionTime":"2025-10-08T20:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.580223 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.580320 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.580392 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.580420 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.581300 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:00Z","lastTransitionTime":"2025-10-08T20:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.684294 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.684355 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.684375 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.684397 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.684415 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:00Z","lastTransitionTime":"2025-10-08T20:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.786468 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.786510 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.786521 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.786572 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.786585 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:00Z","lastTransitionTime":"2025-10-08T20:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.889458 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.889525 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.889581 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.889606 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.889636 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:00Z","lastTransitionTime":"2025-10-08T20:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.991653 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.991690 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.991699 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.991713 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:00 crc kubenswrapper[4669]: I1008 20:46:00.991724 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:00Z","lastTransitionTime":"2025-10-08T20:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.093371 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.093446 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.093462 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.093484 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.093501 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:01Z","lastTransitionTime":"2025-10-08T20:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.196627 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.196684 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.196695 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.196711 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.196722 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:01Z","lastTransitionTime":"2025-10-08T20:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.299115 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.299176 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.299188 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.299207 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.299221 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:01Z","lastTransitionTime":"2025-10-08T20:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.329869 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:46:01 crc kubenswrapper[4669]: E1008 20:46:01.330022 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ml9vv" podUID="f90eed21-8bc2-4723-b6be-a672669a36fb" Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.353866 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://408dd840918000b1689c3d828a51173deebf8d00fc97450975b35e5149d3cfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://334a09deac921308c4d6053bdcc2bbc096acc8ec24875190efb1c07b22d01c69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c03e0c827468d80fa326ee46ee88ad6adfe4236f4df9843324d2b247d0716087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92bc23ad705dcc8b8524159bc37254ce2306e7b502b914eaac7a6525fdd44f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed9a574189bcc7f84b93c5e821e944b0f94679084a30270d6634c7d19e67c470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2057662a5102fd118da847c6e54ac823bfe6b76443d347d2b60a7c8728d6d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b2057662a5102fd118da847c6e54ac823bfe6b76443d347d2b60a7c8728d6d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T20:45:33Z\\\",\\\"message\\\":\\\"nat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.176:1936: 10.217.4.176:443: 10.217.4.176:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {595f6e90-7cd8-4871-85ab-9519d3c9c3e5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1008 20:45:33.198483 6348 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI1008 20:45:33.198486 6348 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-hw2kf\\\\nF1008 20:45:33.198495 6348 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet val\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gpzdw_openshift-ovn-kubernetes(cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40a
c3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpzdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:01Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.369325 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d080d327-7e4d-41af-aa15-0ce849523815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8127834da98ef46a594a74cbfcc6ef779b8429046327546560b7b37085572c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f61c1793e6c95085b6964298f29b5f896451784046a6aee1c73bbda234a3bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd3bc937fc2e56c3d332e4d3822a2749d040c57cd94f6e3bcdcfd83c126bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33aff5ef2ae82f810d3b3e66effb80087fa92081419227e4fb66a6aa80468ff7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T20:45:02Z\\\"
,\\\"message\\\":\\\"W1008 20:44:56.871605 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 20:44:56.872089 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759956296 cert, and key in /tmp/serving-cert-3424828285/serving-signer.crt, /tmp/serving-cert-3424828285/serving-signer.key\\\\nI1008 20:44:57.365674 1 observer_polling.go:159] Starting file observer\\\\nW1008 20:45:02.381062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1008 20:45:02.381192 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 20:45:02.381876 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3424828285/tls.crt::/tmp/serving-cert-3424828285/tls.key\\\\\\\"\\\\nI1008 20:45:02.718633 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 20:45:02.726325 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 20:45:02.726358 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 20:45:02.726380 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 20:45:02.726384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 20:45:02.731456 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 20:45:02.731985 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 20:45:02.731867 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 20:45:02.733228 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d213380e32b3db218facfef313963d26689d2f0871d2a004a63380454fac8a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:01Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.383843 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c0e064d976a7c307fd13ec11ae76672cc1225b71a616f171626ee1f9a24531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:01Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.398304 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:01Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.402140 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.402216 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.402239 4669 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.402269 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.402293 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:01Z","lastTransitionTime":"2025-10-08T20:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.421121 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bfcvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09be2dce24bba0d88a36
f2d85e6280e6806f9b6cf59ec3950513e976c97429e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64
d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d12ef6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d12ef6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bfcvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:01Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.439162 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-flswm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"609156f9-39b1-4330-83a2-eabf82f4228f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749545f8f4b6269a70b747fee79dc8d419b62054f507b0d819b63aa68c44bb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-flswm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:01Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.458760 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b822af4b-b157-4b05-9af4-7798315f365f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d615b49ade5de43393d40344c1b71733acedb541841b3ec34d6dd293e62f96c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07c33fe9c40fb9b53e940940c3fe2b8b63a94b0f867aa804d215cb3ba90d01c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569ba2e70b947eea1e531ab7e8f1ac2e3441ade593dd48910407df766217d87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f2d8af11793121a84b4559833f410bd59a8bb122d88da0d3b55d7dcbbf57a9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:01Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.473837 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc1b4218-68c0-4c48-a495-f8539e06d444\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://524972a79ac73180ccc655f37054721fb478bf263c711e814c9b49cc4f1a76ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67e4a88a6084b96798f461c3427f491577d74e6da859263a8c59545395cf029a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3133419a0643dc2be6a13a30c87b23a13965c59841d991db7ec80d5e53ca2840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8abb50fc1491bed6db7f23e79900b0223d3741a87a9a5545c144252a077353b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://8abb50fc1491bed6db7f23e79900b0223d3741a87a9a5545c144252a077353b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:01Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.490306 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:01Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.504958 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.504999 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.505008 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:01 crc 
kubenswrapper[4669]: I1008 20:46:01.505022 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.505032 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:01Z","lastTransitionTime":"2025-10-08T20:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.506193 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c2b44fd8fb3c01bbc8a1b2f5a3507af28b2aa79a3d6ab8e7de3945bbfd01e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a472679f03ab86aa0a31a2ff3affe48d8e289a76db949bcc6ea10446fd08fdd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:01Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 
20:46:01.520015 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-klx9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2433400c-98f8-490f-a566-00a330a738fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b65ccfb3651377dd7136e083c72c94dbdef0e945e796bf851e7ba8e53aafd12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://863b0630ebde7534e93ebf2952dab729566760278539e87efa4412389803c5ee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T20:45:51Z\\\",\\\"message\\\":\\\"2025-10-08T20:45:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a3972dbd-537f-4d3b-a2ea-f565277a9edf\\\\n2025-10-08T20:45:06+00:00 [cnibincopy] Successfully moved files in 
/host/opt/cni/bin/upgrade_a3972dbd-537f-4d3b-a2ea-f565277a9edf to /host/opt/cni/bin/\\\\n2025-10-08T20:45:06Z [verbose] multus-daemon started\\\\n2025-10-08T20:45:06Z [verbose] Readiness Indicator file check\\\\n2025-10-08T20:45:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\
\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-klx9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:01Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.538143 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0319f7-8ee3-4392-a36a-419161391db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7f52f3d22574d0a01cdfd7b7a40caf1a6cf201dc719e35f40eae85a071286f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3064f5dde5317ed6c1dba4ecdcf4da81c2451262d83e3e2826c6ebbfe1487ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13697e6470d481451982948653db44d08baa70466d010442534eaa249e58bae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c88256a72c667695563af6b37d01d958621c1ca6cbdaf474364bd6c8128c4409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6efb0bccc51deff2303655e7a8d3a6261a8b3c9425f6d94120cd1acf27fd7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:01Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.546954 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zcf2d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a016bee1-2c29-46bb-b3b8-841c4a65e162\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fa5b9befc8fb3a83cb6dd6097014bfe9fd0b905b4bf8fbdcccd4fdfb62ab410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flsl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zcf2d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:01Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.556627 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c9bcf2-9580-4534-8c7e-886bd4aff469\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d8b81cfea1e9e0c9b30427e8b8cb07b463c6ef45afb8379aa006d71bccd82a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bd09b1fcc78173d03292522a284e68e59f374
def13fd6830f24a31e1138c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw2kf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:01Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.569423 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bl6pv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ac60c10-afa3-424e-9aa2-060e32f4a40f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91277ee733ac8aee89b1a7716b6dcebf57e7d24e5cab5615d88ac8ff90f6f5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbw65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e8bd9bc559c9623c06aa2f0324a6679f5d24
1e881db918904f3e1e97d56a20f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbw65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bl6pv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:01Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.579073 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ml9vv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f90eed21-8bc2-4723-b6be-a672669a36fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bh59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bh59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ml9vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:01Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:01 crc 
kubenswrapper[4669]: I1008 20:46:01.590461 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:01Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.601096 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c397b74921593a42fb7626e545778d80c506f0bbce7bc425b75c77a222c770e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T20:46:01Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.607697 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.607730 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.607739 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.607751 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.607761 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:01Z","lastTransitionTime":"2025-10-08T20:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.709992 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.710038 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.710049 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.710065 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.710076 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:01Z","lastTransitionTime":"2025-10-08T20:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.812478 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.812524 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.812590 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.812608 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.812618 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:01Z","lastTransitionTime":"2025-10-08T20:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.914995 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.915046 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.915063 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.915086 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:01 crc kubenswrapper[4669]: I1008 20:46:01.915103 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:01Z","lastTransitionTime":"2025-10-08T20:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:02 crc kubenswrapper[4669]: I1008 20:46:02.018316 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:02 crc kubenswrapper[4669]: I1008 20:46:02.018355 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:02 crc kubenswrapper[4669]: I1008 20:46:02.018366 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:02 crc kubenswrapper[4669]: I1008 20:46:02.018381 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:02 crc kubenswrapper[4669]: I1008 20:46:02.018391 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:02Z","lastTransitionTime":"2025-10-08T20:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:02 crc kubenswrapper[4669]: I1008 20:46:02.121183 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:02 crc kubenswrapper[4669]: I1008 20:46:02.121229 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:02 crc kubenswrapper[4669]: I1008 20:46:02.121239 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:02 crc kubenswrapper[4669]: I1008 20:46:02.121254 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:02 crc kubenswrapper[4669]: I1008 20:46:02.121265 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:02Z","lastTransitionTime":"2025-10-08T20:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:02 crc kubenswrapper[4669]: I1008 20:46:02.223791 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:02 crc kubenswrapper[4669]: I1008 20:46:02.223880 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:02 crc kubenswrapper[4669]: I1008 20:46:02.223897 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:02 crc kubenswrapper[4669]: I1008 20:46:02.223914 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:02 crc kubenswrapper[4669]: I1008 20:46:02.223925 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:02Z","lastTransitionTime":"2025-10-08T20:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:02 crc kubenswrapper[4669]: I1008 20:46:02.326931 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:02 crc kubenswrapper[4669]: I1008 20:46:02.326982 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:02 crc kubenswrapper[4669]: I1008 20:46:02.326995 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:02 crc kubenswrapper[4669]: I1008 20:46:02.327007 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:02 crc kubenswrapper[4669]: I1008 20:46:02.327016 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:02Z","lastTransitionTime":"2025-10-08T20:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:46:02 crc kubenswrapper[4669]: I1008 20:46:02.330495 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:46:02 crc kubenswrapper[4669]: E1008 20:46:02.330665 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:46:02 crc kubenswrapper[4669]: I1008 20:46:02.330737 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:46:02 crc kubenswrapper[4669]: I1008 20:46:02.330739 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:46:02 crc kubenswrapper[4669]: E1008 20:46:02.330945 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:46:02 crc kubenswrapper[4669]: E1008 20:46:02.331063 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:46:02 crc kubenswrapper[4669]: I1008 20:46:02.429399 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:02 crc kubenswrapper[4669]: I1008 20:46:02.429442 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:02 crc kubenswrapper[4669]: I1008 20:46:02.429452 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:02 crc kubenswrapper[4669]: I1008 20:46:02.429468 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:02 crc kubenswrapper[4669]: I1008 20:46:02.429478 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:02Z","lastTransitionTime":"2025-10-08T20:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:02 crc kubenswrapper[4669]: I1008 20:46:02.531761 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:02 crc kubenswrapper[4669]: I1008 20:46:02.531864 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:02 crc kubenswrapper[4669]: I1008 20:46:02.531882 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:02 crc kubenswrapper[4669]: I1008 20:46:02.531907 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:02 crc kubenswrapper[4669]: I1008 20:46:02.531926 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:02Z","lastTransitionTime":"2025-10-08T20:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:02 crc kubenswrapper[4669]: I1008 20:46:02.634691 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:02 crc kubenswrapper[4669]: I1008 20:46:02.634754 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:02 crc kubenswrapper[4669]: I1008 20:46:02.634771 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:02 crc kubenswrapper[4669]: I1008 20:46:02.634795 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:02 crc kubenswrapper[4669]: I1008 20:46:02.634811 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:02Z","lastTransitionTime":"2025-10-08T20:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:02 crc kubenswrapper[4669]: I1008 20:46:02.737737 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:02 crc kubenswrapper[4669]: I1008 20:46:02.737776 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:02 crc kubenswrapper[4669]: I1008 20:46:02.737789 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:02 crc kubenswrapper[4669]: I1008 20:46:02.737808 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:02 crc kubenswrapper[4669]: I1008 20:46:02.737821 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:02Z","lastTransitionTime":"2025-10-08T20:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:02 crc kubenswrapper[4669]: I1008 20:46:02.840599 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:02 crc kubenswrapper[4669]: I1008 20:46:02.840648 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:02 crc kubenswrapper[4669]: I1008 20:46:02.840660 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:02 crc kubenswrapper[4669]: I1008 20:46:02.840678 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:02 crc kubenswrapper[4669]: I1008 20:46:02.840691 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:02Z","lastTransitionTime":"2025-10-08T20:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:02 crc kubenswrapper[4669]: I1008 20:46:02.943310 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:02 crc kubenswrapper[4669]: I1008 20:46:02.943355 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:02 crc kubenswrapper[4669]: I1008 20:46:02.943366 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:02 crc kubenswrapper[4669]: I1008 20:46:02.943383 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:02 crc kubenswrapper[4669]: I1008 20:46:02.943395 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:02Z","lastTransitionTime":"2025-10-08T20:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:03 crc kubenswrapper[4669]: I1008 20:46:03.045743 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:03 crc kubenswrapper[4669]: I1008 20:46:03.045799 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:03 crc kubenswrapper[4669]: I1008 20:46:03.045815 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:03 crc kubenswrapper[4669]: I1008 20:46:03.045850 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:03 crc kubenswrapper[4669]: I1008 20:46:03.045886 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:03Z","lastTransitionTime":"2025-10-08T20:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:03 crc kubenswrapper[4669]: I1008 20:46:03.147996 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:03 crc kubenswrapper[4669]: I1008 20:46:03.148041 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:03 crc kubenswrapper[4669]: I1008 20:46:03.148052 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:03 crc kubenswrapper[4669]: I1008 20:46:03.148067 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:03 crc kubenswrapper[4669]: I1008 20:46:03.148080 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:03Z","lastTransitionTime":"2025-10-08T20:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:03 crc kubenswrapper[4669]: I1008 20:46:03.251521 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:03 crc kubenswrapper[4669]: I1008 20:46:03.251634 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:03 crc kubenswrapper[4669]: I1008 20:46:03.251648 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:03 crc kubenswrapper[4669]: I1008 20:46:03.251666 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:03 crc kubenswrapper[4669]: I1008 20:46:03.251679 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:03Z","lastTransitionTime":"2025-10-08T20:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:46:03 crc kubenswrapper[4669]: I1008 20:46:03.330351 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:46:03 crc kubenswrapper[4669]: E1008 20:46:03.330594 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ml9vv" podUID="f90eed21-8bc2-4723-b6be-a672669a36fb" Oct 08 20:46:03 crc kubenswrapper[4669]: I1008 20:46:03.358983 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:03 crc kubenswrapper[4669]: I1008 20:46:03.359042 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:03 crc kubenswrapper[4669]: I1008 20:46:03.359058 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:03 crc kubenswrapper[4669]: I1008 20:46:03.359076 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:03 crc kubenswrapper[4669]: I1008 20:46:03.359087 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:03Z","lastTransitionTime":"2025-10-08T20:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:03 crc kubenswrapper[4669]: I1008 20:46:03.461887 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:03 crc kubenswrapper[4669]: I1008 20:46:03.461935 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:03 crc kubenswrapper[4669]: I1008 20:46:03.461949 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:03 crc kubenswrapper[4669]: I1008 20:46:03.461971 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:03 crc kubenswrapper[4669]: I1008 20:46:03.461987 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:03Z","lastTransitionTime":"2025-10-08T20:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:03 crc kubenswrapper[4669]: I1008 20:46:03.564836 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:03 crc kubenswrapper[4669]: I1008 20:46:03.564893 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:03 crc kubenswrapper[4669]: I1008 20:46:03.564910 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:03 crc kubenswrapper[4669]: I1008 20:46:03.564941 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:03 crc kubenswrapper[4669]: I1008 20:46:03.564974 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:03Z","lastTransitionTime":"2025-10-08T20:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:03 crc kubenswrapper[4669]: I1008 20:46:03.667467 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:03 crc kubenswrapper[4669]: I1008 20:46:03.667514 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:03 crc kubenswrapper[4669]: I1008 20:46:03.667532 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:03 crc kubenswrapper[4669]: I1008 20:46:03.667583 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:03 crc kubenswrapper[4669]: I1008 20:46:03.667601 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:03Z","lastTransitionTime":"2025-10-08T20:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:03 crc kubenswrapper[4669]: I1008 20:46:03.771090 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:03 crc kubenswrapper[4669]: I1008 20:46:03.771121 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:03 crc kubenswrapper[4669]: I1008 20:46:03.771137 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:03 crc kubenswrapper[4669]: I1008 20:46:03.771151 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:03 crc kubenswrapper[4669]: I1008 20:46:03.771161 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:03Z","lastTransitionTime":"2025-10-08T20:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:03 crc kubenswrapper[4669]: I1008 20:46:03.873398 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:03 crc kubenswrapper[4669]: I1008 20:46:03.873446 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:03 crc kubenswrapper[4669]: I1008 20:46:03.873459 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:03 crc kubenswrapper[4669]: I1008 20:46:03.873477 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:03 crc kubenswrapper[4669]: I1008 20:46:03.873490 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:03Z","lastTransitionTime":"2025-10-08T20:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:03 crc kubenswrapper[4669]: I1008 20:46:03.976734 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:03 crc kubenswrapper[4669]: I1008 20:46:03.976784 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:03 crc kubenswrapper[4669]: I1008 20:46:03.976804 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:03 crc kubenswrapper[4669]: I1008 20:46:03.976822 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:03 crc kubenswrapper[4669]: I1008 20:46:03.976838 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:03Z","lastTransitionTime":"2025-10-08T20:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.079624 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.079688 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.079706 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.079730 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.079746 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:04Z","lastTransitionTime":"2025-10-08T20:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.182594 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.182774 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.182809 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.182835 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.182856 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:04Z","lastTransitionTime":"2025-10-08T20:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.285639 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.285698 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.285711 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.285728 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.285740 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:04Z","lastTransitionTime":"2025-10-08T20:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.329995 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.330000 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.330124 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:46:04 crc kubenswrapper[4669]: E1008 20:46:04.330254 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:46:04 crc kubenswrapper[4669]: E1008 20:46:04.330354 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:46:04 crc kubenswrapper[4669]: E1008 20:46:04.330395 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.331136 4669 scope.go:117] "RemoveContainer" containerID="6b2057662a5102fd118da847c6e54ac823bfe6b76443d347d2b60a7c8728d6d7" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.388310 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.388347 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.388358 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.388374 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.388386 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:04Z","lastTransitionTime":"2025-10-08T20:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.490249 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.490524 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.490641 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.490731 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.490814 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:04Z","lastTransitionTime":"2025-10-08T20:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.593937 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.594206 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.594394 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.594683 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.594867 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:04Z","lastTransitionTime":"2025-10-08T20:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.696960 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.697001 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.697013 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.697030 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.697042 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:04Z","lastTransitionTime":"2025-10-08T20:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.791050 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gpzdw_cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7/ovnkube-controller/2.log" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.793742 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" event={"ID":"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7","Type":"ContainerStarted","Data":"a36045ff8e9ec3ba7cb5e5d4cc2c7ac3e970c45098c3bc2fd385134fecaa8db4"} Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.794088 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.798845 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.798907 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.798944 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.798967 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.798984 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:04Z","lastTransitionTime":"2025-10-08T20:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.813881 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c397b74921593a42fb7626e545778d80c506f0bbce7bc425b75c77a222c770e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.828712 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zcf2d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a016bee1-2c29-46bb-b3b8-841c4a65e162\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fa5b9befc8fb3a83cb6dd6097014bfe9fd0b905b4bf8fbdcccd4fdfb62ab410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flsl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zcf2d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.844154 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c9bcf2-9580-4534-8c7e-886bd4aff469\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d8b81cfea1e9e0c9b30427e8b8cb07b463c6ef45afb8379aa006d71bccd82a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bd09b1fcc78173d03292522a284e68e59f374
def13fd6830f24a31e1138c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw2kf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.858963 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bl6pv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ac60c10-afa3-424e-9aa2-060e32f4a40f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91277ee733ac8aee89b1a7716b6dcebf57e7d24e5cab5615d88ac8ff90f6f5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbw65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e8bd9bc559c9623c06aa2f0324a6679f5d24
1e881db918904f3e1e97d56a20f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbw65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bl6pv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.874272 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ml9vv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f90eed21-8bc2-4723-b6be-a672669a36fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bh59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bh59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ml9vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:04 crc 
kubenswrapper[4669]: I1008 20:46:04.887995 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.901143 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.901190 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.901202 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.901220 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.901234 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:04Z","lastTransitionTime":"2025-10-08T20:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.904700 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c0e064d976a7c307fd13ec11ae76672cc1225b71a616f171626ee1f9a24531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.926605 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://408dd840918000b1689c3d828a51173deebf8d00fc97450975b35e5149d3cfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://334a09deac921308c4d6053bdcc2bbc096acc8ec24875190efb1c07b22d01c69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c03e0c827468d80fa326ee46ee88ad6adfe4236f4df9843324d2b247d0716087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92bc23ad705dcc8b8524159bc37254ce2306e7b502b914eaac7a6525fdd44f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed9a574189bcc7f84b93c5e821e944b0f94679084a30270d6634c7d19e67c470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a36045ff8e9ec3ba7cb5e5d4cc2c7ac3e970c45098c3bc2fd385134fecaa8db4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b2057662a5102fd118da847c6e54ac823bfe6b76443d347d2b60a7c8728d6d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T20:45:33Z\\\",\\\"message\\\":\\\"nat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.176:1936: 10.217.4.176:443: 10.217.4.176:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {595f6e90-7cd8-4871-85ab-9519d3c9c3e5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1008 20:45:33.198483 6348 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI1008 20:45:33.198486 6348 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-hw2kf\\\\nF1008 20:45:33.198495 6348 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
val\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\
\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpzdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.938294 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d080d327-7e4d-41af-aa15-0ce849523815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8127834da98ef46a594a74cbfcc6ef779b8429046327546560b7b37085572c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f61c1793e6c95085b6964298f29b5f896451784046a6aee1c73bbda234a3bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd3bc937fc2e56c3d332e4d3822a2749d040c57cd94f6e3bcdcfd83c126bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33aff5ef2ae82f810d3b3e66effb80087fa92081419227e4fb66a6aa80468ff7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T20:45:02Z\\\"
,\\\"message\\\":\\\"W1008 20:44:56.871605 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 20:44:56.872089 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759956296 cert, and key in /tmp/serving-cert-3424828285/serving-signer.crt, /tmp/serving-cert-3424828285/serving-signer.key\\\\nI1008 20:44:57.365674 1 observer_polling.go:159] Starting file observer\\\\nW1008 20:45:02.381062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1008 20:45:02.381192 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 20:45:02.381876 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3424828285/tls.crt::/tmp/serving-cert-3424828285/tls.key\\\\\\\"\\\\nI1008 20:45:02.718633 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 20:45:02.726325 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 20:45:02.726358 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 20:45:02.726380 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 20:45:02.726384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 20:45:02.731456 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 20:45:02.731985 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 20:45:02.731867 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 20:45:02.733228 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d213380e32b3db218facfef313963d26689d2f0871d2a004a63380454fac8a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.949858 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc1b4218-68c0-4c48-a495-f8539e06d444\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://524972a79ac73180ccc655f37054721fb478bf263c711e814c9b49cc4f1a76ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b
89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67e4a88a6084b96798f461c3427f491577d74e6da859263a8c59545395cf029a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3133419a0643dc2be6a13a30c87b23a13965c59841d991db7ec80d5e53ca2840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-0
8T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8abb50fc1491bed6db7f23e79900b0223d3741a87a9a5545c144252a077353b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8abb50fc1491bed6db7f23e79900b0223d3741a87a9a5545c144252a077353b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.960101 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.974440 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bfcvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09be2dce24bba0d88a36f2d85e6280e6806f9b6cf59ec3950513e976c97429e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d12e
f6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d12ef6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bfcvh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.982950 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-flswm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"609156f9-39b1-4330-83a2-eabf82f4228f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749545f8f4b6269a70b747fee79dc8d419b62054f507b0d819b63aa68c44bb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-flswm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:04 crc kubenswrapper[4669]: I1008 20:46:04.992651 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b822af4b-b157-4b05-9af4-7798315f365f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d615b49ade5de43393d40344c1b71733acedb541841b3ec34d6dd293e62f96c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07c33fe9c40fb9b53e940940c3fe2b8b63a94b0f867aa804d215cb3ba90d01c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569ba2e70b947eea1e531ab7e8f1ac2e3441ade593dd48910407df766217d87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f2d8af11793121a84b4559833f410bd59a8bb122d88da0d3b55d7dcbbf57a9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:04Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.003137 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.003170 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.003179 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.003219 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.003230 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:05Z","lastTransitionTime":"2025-10-08T20:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.009807 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0319f7-8ee3-4392-a36a-419161391db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7f52f3d22574d0a01cdfd7b7a40caf1a6cf201dc719e35f40eae85a071286f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3064f5dde5317ed6c1dba4ecdcf4da81c2451262d83e3e2826c6ebbfe1487ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13697e6470d481451982948653db44d08baa70466d010442534eaa249e58bae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c88256a72c667695563af6b37d01d958621c1ca6cbdaf474364bd6c8128c4409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be
30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6efb0bccc51deff2303655e7a8d3a6261a8b3c9425f6d94120cd1acf27fd7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\
",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:
44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:05Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.020349 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:05Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.032164 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c2b44fd8fb3c01bbc8a1b2f5a3507af28b2aa79a3d6ab8e7de3945bbfd01e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a472679f03ab86aa0a31a2ff3affe48d8e289a76db949bcc6ea10446fd08fdd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:05Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.045431 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-klx9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2433400c-98f8-490f-a566-00a330a738fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b65ccfb3651377dd7136e083c72c94dbdef0e945e796bf851e7ba8e53aafd12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://863b0630ebde7534e93ebf2952dab729566760278539e87efa4412389803c5ee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T20:45:51Z\\\",\\\"message\\\":\\\"2025-10-08T20:45:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a3972dbd-537f-4d3b-a2ea-f565277a9edf\\\\n2025-10-08T20:45:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a3972dbd-537f-4d3b-a2ea-f565277a9edf to /host/opt/cni/bin/\\\\n2025-10-08T20:45:06Z [verbose] multus-daemon started\\\\n2025-10-08T20:45:06Z [verbose] 
Readiness Indicator file check\\\\n2025-10-08T20:45:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-klx9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:05Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.105917 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.105980 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.105998 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.106022 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.106041 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:05Z","lastTransitionTime":"2025-10-08T20:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.208422 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.208450 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.208478 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.208491 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.208501 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:05Z","lastTransitionTime":"2025-10-08T20:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.311068 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.311109 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.311125 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.311145 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.311157 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:05Z","lastTransitionTime":"2025-10-08T20:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.330321 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:46:05 crc kubenswrapper[4669]: E1008 20:46:05.330453 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ml9vv" podUID="f90eed21-8bc2-4723-b6be-a672669a36fb" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.413944 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.413996 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.414011 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.414026 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.414037 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:05Z","lastTransitionTime":"2025-10-08T20:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.516720 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.516767 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.516777 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.516791 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.516803 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:05Z","lastTransitionTime":"2025-10-08T20:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.619600 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.619673 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.619690 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.619713 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.619730 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:05Z","lastTransitionTime":"2025-10-08T20:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.722411 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.722492 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.722519 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.722646 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.722673 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:05Z","lastTransitionTime":"2025-10-08T20:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.800997 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gpzdw_cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7/ovnkube-controller/3.log" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.801786 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gpzdw_cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7/ovnkube-controller/2.log" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.806211 4669 generic.go:334] "Generic (PLEG): container finished" podID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerID="a36045ff8e9ec3ba7cb5e5d4cc2c7ac3e970c45098c3bc2fd385134fecaa8db4" exitCode=1 Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.806267 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" event={"ID":"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7","Type":"ContainerDied","Data":"a36045ff8e9ec3ba7cb5e5d4cc2c7ac3e970c45098c3bc2fd385134fecaa8db4"} Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.806319 4669 scope.go:117] "RemoveContainer" containerID="6b2057662a5102fd118da847c6e54ac823bfe6b76443d347d2b60a7c8728d6d7" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.809434 4669 scope.go:117] "RemoveContainer" containerID="a36045ff8e9ec3ba7cb5e5d4cc2c7ac3e970c45098c3bc2fd385134fecaa8db4" Oct 08 20:46:05 crc kubenswrapper[4669]: E1008 20:46:05.809834 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-gpzdw_openshift-ovn-kubernetes(cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.825405 4669 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.825663 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.825760 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.825846 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.825924 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:05Z","lastTransitionTime":"2025-10-08T20:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.826958 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:05Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.850694 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bfcvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09be2dce24bba0d88a36f2d85e6280e6806f9b6cf59ec3950513e976c97429e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d12e
f6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d12ef6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bfcvh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:05Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.861957 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-flswm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"609156f9-39b1-4330-83a2-eabf82f4228f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749545f8f4b6269a70b747fee79dc8d419b62054f507b0d819b63aa68c44bb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-flswm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:05Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.881340 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b822af4b-b157-4b05-9af4-7798315f365f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d615b49ade5de43393d40344c1b71733acedb541841b3ec34d6dd293e62f96c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07c33fe9c40fb9b53e940940c3fe2b8b63a94b0f867aa804d215cb3ba90d01c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569ba2e70b947eea1e531ab7e8f1ac2e3441ade593dd48910407df766217d87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f2d8af11793121a84b4559833f410bd59a8bb122d88da0d3b55d7dcbbf57a9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:05Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.894051 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc1b4218-68c0-4c48-a495-f8539e06d444\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://524972a79ac73180ccc655f37054721fb478bf263c711e814c9b49cc4f1a76ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67e4a88a6084b96798f461c3427f491577d74e6da859263a8c59545395cf029a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3133419a0643dc2be6a13a30c87b23a13965c59841d991db7ec80d5e53ca2840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8abb50fc1491bed6db7f23e79900b0223d3741a87a9a5545c144252a077353b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://8abb50fc1491bed6db7f23e79900b0223d3741a87a9a5545c144252a077353b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:05Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.909972 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:05Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.924078 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c2b44fd8fb3c01bbc8a1b2f5a3507af28b2aa79a3d6ab8e7de3945bbfd01e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a472679f03ab86aa0a31a2ff3affe48d8e289a76db949bcc6ea10446fd08fdd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:05Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.927937 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.927974 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.927986 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.928003 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.928016 4669 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:05Z","lastTransitionTime":"2025-10-08T20:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.937047 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-klx9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2433400c-98f8-490f-a566-00a330a738fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b65ccfb3651377dd7136e083c72c94dbdef0e945e796bf851e7ba8e53aafd12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://863b
0630ebde7534e93ebf2952dab729566760278539e87efa4412389803c5ee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T20:45:51Z\\\",\\\"message\\\":\\\"2025-10-08T20:45:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a3972dbd-537f-4d3b-a2ea-f565277a9edf\\\\n2025-10-08T20:45:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a3972dbd-537f-4d3b-a2ea-f565277a9edf to /host/opt/cni/bin/\\\\n2025-10-08T20:45:06Z [verbose] multus-daemon started\\\\n2025-10-08T20:45:06Z [verbose] Readiness Indicator file check\\\\n2025-10-08T20:45:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/k
ubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-klx9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:05Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.954930 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0319f7-8ee3-4392-a36a-419161391db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7f52f3d22574d0a01cdfd7b7a40caf1a6cf201dc719e35f40eae85a071286f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3064f5dde5317ed6c1dba4ecdcf4da81c2451262d83e3e2826c6ebbfe1487ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13697e6470d481451982948653db44d08baa70466d010442534eaa249e58bae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c88256a72c667695563af6b37d01d958621c1ca6cbdaf474364bd6c8128c4409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6efb0bccc51deff2303655e7a8d3a6261a8b3c9425f6d94120cd1acf27fd7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:05Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.965116 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zcf2d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a016bee1-2c29-46bb-b3b8-841c4a65e162\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fa5b9befc8fb3a83cb6dd6097014bfe9fd0b905b4bf8fbdcccd4fdfb62ab410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flsl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zcf2d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:05Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.977695 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c9bcf2-9580-4534-8c7e-886bd4aff469\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d8b81cfea1e9e0c9b30427e8b8cb07b463c6ef45afb8379aa006d71bccd82a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bd09b1fcc78173d03292522a284e68e59f374
def13fd6830f24a31e1138c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw2kf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:05Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:05 crc kubenswrapper[4669]: I1008 20:46:05.989937 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bl6pv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ac60c10-afa3-424e-9aa2-060e32f4a40f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91277ee733ac8aee89b1a7716b6dcebf57e7d24e5cab5615d88ac8ff90f6f5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbw65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e8bd9bc559c9623c06aa2f0324a6679f5d24
1e881db918904f3e1e97d56a20f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbw65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bl6pv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:05Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.001852 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ml9vv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f90eed21-8bc2-4723-b6be-a672669a36fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bh59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bh59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ml9vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:05Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:06 crc 
kubenswrapper[4669]: I1008 20:46:06.014500 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:06Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.028891 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c397b74921593a42fb7626e545778d80c506f0bbce7bc425b75c77a222c770e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T20:46:06Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.030363 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.030390 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.030398 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.030413 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.030434 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:06Z","lastTransitionTime":"2025-10-08T20:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.060880 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://408dd840918000b1689c3d828a51173deebf8d00fc97450975b35e5149d3cfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://334a09deac921308c4d6053bdcc2bbc096acc8ec24875190efb1c07b22d01c69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c03e0c827468d80fa326ee46ee88ad6adfe4236f4df9843324d2b247d0716087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92bc23ad705dcc8b8524159bc37254ce2306e7b502b914eaac7a6525fdd44f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed9a574189bcc7f84b93c5e821e944b0f94679084a30270d6634c7d19e67c470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a36045ff8e9ec3ba7cb5e5d4cc2c7ac3e970c45098c3bc2fd385134fecaa8db4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b2057662a5102fd118da847c6e54ac823bfe6b76443d347d2b60a7c8728d6d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T20:45:33Z\\\",\\\"message\\\":\\\"nat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.176:1936: 10.217.4.176:443: 10.217.4.176:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {595f6e90-7cd8-4871-85ab-9519d3c9c3e5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1008 20:45:33.198483 6348 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI1008 20:45:33.198486 6348 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-hw2kf\\\\nF1008 20:45:33.198495 6348 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet val\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a36045ff8e9ec3ba7cb5e5d4cc2c7ac3e970c45098c3bc2fd385134fecaa8db4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T20:46:05Z\\\",\\\"message\\\":\\\"fe46cb89-4e54-4175-a112-1c5224cd299e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1008 20:46:05.261472 6775 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-flswm\\\\nI1008 20:46:05.261486 6775 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-scheduler/openshift-kube-scheduler-crc\\\\nI1008 20:46:05.261493 6775 obj_retry.go:365] Adding new 
object: *v1.Pod openshift-kube-scheduler/openshift-kube-scheduler-crc\\\\nF1008 20:46:05.261500 6775 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:05Z is after 2025-08-24T17:21:41Z]\\\\nI1008 20:46:05.261505 6775 obj_retry\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\
"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\
\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpzdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:06Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:06 crc 
kubenswrapper[4669]: I1008 20:46:06.076194 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d080d327-7e4d-41af-aa15-0ce849523815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8127834da98ef46a594a74cbfcc6ef779b8429046327546560b7b37085572c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f61c1793e6c95
085b6964298f29b5f896451784046a6aee1c73bbda234a3bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd3bc937fc2e56c3d332e4d3822a2749d040c57cd94f6e3bcdcfd83c126bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33aff5ef2ae82f810d3b3e66effb80087fa92081419227e4fb66a6aa80468ff7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"W1008 20:44:56.871605 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 20:44:56.872089 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759956296 cert, and key in /tmp/serving-cert-3424828285/serving-signer.crt, /tmp/serving-cert-3424828285/serving-signer.key\\\\nI1008 20:44:57.365674 1 observer_polling.go:159] Starting file observer\\\\nW1008 20:45:02.381062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1008 20:45:02.381192 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 20:45:02.381876 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3424828285/tls.crt::/tmp/serving-cert-3424828285/tls.key\\\\\\\"\\\\nI1008 20:45:02.718633 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 20:45:02.726325 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 20:45:02.726358 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 20:45:02.726380 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 20:45:02.726384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 20:45:02.731456 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 20:45:02.731985 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 20:45:02.731867 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is 
complete\\\\nF1008 20:45:02.733228 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d213380e32b3db218facfef313963d26689d2f0871d2a004a63380454fac8a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"exitCode\\
\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:06Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.091313 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c0e064d976a7c307fd13ec11ae76672cc1225b71a616f171626ee1f9a24531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:06Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.133025 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.133059 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.133072 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.133093 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.133105 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:06Z","lastTransitionTime":"2025-10-08T20:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.149574 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:46:06 crc kubenswrapper[4669]: E1008 20:46:06.149721 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-08 20:47:10.149700257 +0000 UTC m=+149.842510930 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.236156 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.236400 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.236408 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.236421 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.236431 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:06Z","lastTransitionTime":"2025-10-08T20:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.250963 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.251009 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.251075 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.251129 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:46:06 crc kubenswrapper[4669]: E1008 20:46:06.251133 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Oct 08 20:46:06 crc kubenswrapper[4669]: E1008 20:46:06.251186 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 20:46:06 crc kubenswrapper[4669]: E1008 20:46:06.251215 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 20:46:06 crc kubenswrapper[4669]: E1008 20:46:06.251233 4669 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 20:46:06 crc kubenswrapper[4669]: E1008 20:46:06.251288 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-08 20:47:10.251268863 +0000 UTC m=+149.944079546 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 20:46:06 crc kubenswrapper[4669]: E1008 20:46:06.251192 4669 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 20:46:06 crc kubenswrapper[4669]: E1008 20:46:06.251295 4669 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 20:46:06 crc kubenswrapper[4669]: E1008 20:46:06.251347 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 20:47:10.251330904 +0000 UTC m=+149.944141597 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 20:46:06 crc kubenswrapper[4669]: E1008 20:46:06.251192 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 20:46:06 crc kubenswrapper[4669]: E1008 20:46:06.251384 4669 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 20:46:06 crc kubenswrapper[4669]: E1008 20:46:06.251402 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 20:47:10.251376516 +0000 UTC m=+149.944187219 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 20:46:06 crc kubenswrapper[4669]: E1008 20:46:06.251422 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2025-10-08 20:47:10.251412307 +0000 UTC m=+149.944222980 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.330135 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.330150 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.330178 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:46:06 crc kubenswrapper[4669]: E1008 20:46:06.330505 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:46:06 crc kubenswrapper[4669]: E1008 20:46:06.330714 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:46:06 crc kubenswrapper[4669]: E1008 20:46:06.330753 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.339634 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.339670 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.339681 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.339698 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.339710 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:06Z","lastTransitionTime":"2025-10-08T20:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.342826 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.442907 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.442953 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.442969 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.442994 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.443013 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:06Z","lastTransitionTime":"2025-10-08T20:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.546588 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.546658 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.546680 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.546704 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.546722 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:06Z","lastTransitionTime":"2025-10-08T20:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.650018 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.650083 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.650101 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.650126 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.650145 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:06Z","lastTransitionTime":"2025-10-08T20:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.752162 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.752205 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.752217 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.752234 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.752250 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:06Z","lastTransitionTime":"2025-10-08T20:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.815856 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gpzdw_cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7/ovnkube-controller/3.log" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.819953 4669 scope.go:117] "RemoveContainer" containerID="a36045ff8e9ec3ba7cb5e5d4cc2c7ac3e970c45098c3bc2fd385134fecaa8db4" Oct 08 20:46:06 crc kubenswrapper[4669]: E1008 20:46:06.820139 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-gpzdw_openshift-ovn-kubernetes(cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.839630 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-klx9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2433400c-98f8-490f-a566-00a330a738fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b65ccfb3651377dd7136e083c72c94dbdef0e945e796bf851e7ba8e53aafd12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://863b0630ebde7534e93ebf2952dab729566760278539e87efa4412389803c5ee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T20:45:51Z\\\",\\\"message\\\":\\\"2025-10-08T20:45:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a3972dbd-537f-4d3b-a2ea-f565277a9edf\\\\n2025-10-08T20:45:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a3972dbd-537f-4d3b-a2ea-f565277a9edf to /host/opt/cni/bin/\\\\n2025-10-08T20:45:06Z [verbose] multus-daemon started\\\\n2025-10-08T20:45:06Z [verbose] 
Readiness Indicator file check\\\\n2025-10-08T20:45:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-klx9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:06Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.851953 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17fccb59-3613-4bc1-92b7-872801c9b793\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dea36f2d24de5cd6ea4ecea981514af7ab8f6b33ec55678b30d000ee19e113\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c68ec709284c53e463217c42296375acbe49c5cc9f3d9248f85bf59e1fe55f5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c68ec709284c53e463217c42296375acbe49c5cc9f3d9248f85bf59e1fe55f5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:06Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.854408 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.854438 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.854448 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.854490 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.854501 4669 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:06Z","lastTransitionTime":"2025-10-08T20:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.870724 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0319f7-8ee3-4392-a36a-419161391db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7f52f3d22574d0a01cdfd7b7a40caf1a6cf201dc719e35f40eae85a071286f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3064f5dde5317ed6c1dba4ecdcf4da81c2451262d83e3e2826c6ebbfe1487ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13697e6470d481451982948653db44d08baa70466d010442534eaa249e58bae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPat
h\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c88256a72c667695563af6b37d01d958621c1ca6cbdaf474364bd6c8128c4409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6efb0bccc51deff2303655e7a8d3a6261a8b3c9425f6d94120cd1acf27fd7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab
4399dbbc1d270841c4399c6645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c251abf013671bd264c1fba17cc854f3325ab4399dbbc1d270841c4399c6645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1e724ca5f5c1180c5327dbf31c2f38004ec7eeadbccf5d5dff7d714294b7488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92500bf30e66fe5d40579c88a7644dc4df723b9774580a5f8ddc100358df8992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:06Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.885375 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:06Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.898355 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c2b44fd8fb3c01bbc8a1b2f5a3507af28b2aa79a3d6ab8e7de3945bbfd01e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a472679f03ab86aa0a31a2ff3affe48d8e289a76db949bcc6ea10446fd08fdd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:06Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.911662 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bl6pv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ac60c10-afa3-424e-9aa2-060e32f4a40f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91277ee733ac8aee89b1a7716b6dcebf57e7d24e5cab5615d88ac8ff90f6f5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbw65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e8bd9bc559c9623c06aa2f0324a6679f5d24
1e881db918904f3e1e97d56a20f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbw65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bl6pv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:06Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.924117 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ml9vv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f90eed21-8bc2-4723-b6be-a672669a36fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bh59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bh59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ml9vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:06Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:06 crc 
kubenswrapper[4669]: I1008 20:46:06.936969 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:06Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.951299 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c397b74921593a42fb7626e545778d80c506f0bbce7bc425b75c77a222c770e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T20:46:06Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.957288 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.957331 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.957342 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.957359 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.957371 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:06Z","lastTransitionTime":"2025-10-08T20:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.962130 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zcf2d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a016bee1-2c29-46bb-b3b8-841c4a65e162\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fa5b9befc8fb3a83cb6dd6097014bfe9fd0b905b4bf8fbdcccd4fdfb62ab410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flsl6\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zcf2d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:06Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.972109 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39c9bcf2-9580-4534-8c7e-886bd4aff469\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d8b81cfea1e9e0c9b30427e8b8cb07b463c6ef45afb8379aa006d71bccd82a9\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bd09b1fcc78173d03292522a284e68e59f374def13fd6830f24a31e1138c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vwq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-hw2kf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:06Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.985332 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d080d327-7e4d-41af-aa15-0ce849523815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8127834da98ef46a594a74cbfcc6ef779b8429046327546560b7b37085572c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\
"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f61c1793e6c95085b6964298f29b5f896451784046a6aee1c73bbda234a3bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd3bc937fc2e56c3d332e4d3822a2749d040c57cd94f6e3bcdcfd83c126bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o:/
/33aff5ef2ae82f810d3b3e66effb80087fa92081419227e4fb66a6aa80468ff7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8929b5321fd8e458ef9f43ab2fb595e1f7a2c5bb62d91cc2b552626446b6edec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"W1008 20:44:56.871605 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 20:44:56.872089 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759956296 cert, and key in /tmp/serving-cert-3424828285/serving-signer.crt, /tmp/serving-cert-3424828285/serving-signer.key\\\\nI1008 20:44:57.365674 1 observer_polling.go:159] Starting file observer\\\\nW1008 20:45:02.381062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1008 20:45:02.381192 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 20:45:02.381876 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3424828285/tls.crt::/tmp/serving-cert-3424828285/tls.key\\\\\\\"\\\\nI1008 20:45:02.718633 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 20:45:02.726325 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 20:45:02.726358 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 20:45:02.726380 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 20:45:02.726384 1 
maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 20:45:02.731456 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 20:45:02.731985 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 20:45:02.731867 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 20:45:02.733228 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d213380e32b3db218facfef313963d26689d2f0871d2a004a63380454fac8a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de25971
26bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78fcc78aa28787ec306acdc8f15faee1955193823181bd6e9a3cc6aee8144d14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:06Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:06 crc kubenswrapper[4669]: I1008 20:46:06.998578 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c0e064d976a7c307fd13ec11ae76672cc1225b71a616f171626ee1f9a24531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:06Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.015845 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://408dd840918000b1689c3d828a51173deebf8d00fc97450975b35e5149d3cfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://334a09deac921308c4d6053bdcc2bbc096acc8ec24875190efb1c07b22d01c69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c03e0c827468d80fa326ee46ee88ad6adfe4236f4df9843324d2b247d0716087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92bc23ad705dcc8b8524159bc37254ce2306e7b502b914eaac7a6525fdd44f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed9a574189bcc7f84b93c5e821e944b0f94679084a30270d6634c7d19e67c470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a36045ff8e9ec3ba7cb5e5d4cc2c7ac3e970c45098c3bc2fd385134fecaa8db4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a36045ff8e9ec3ba7cb5e5d4cc2c7ac3e970c45098c3bc2fd385134fecaa8db4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T20:46:05Z\\\",\\\"message\\\":\\\"fe46cb89-4e54-4175-a112-1c5224cd299e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1008 20:46:05.261472 6775 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-flswm\\\\nI1008 20:46:05.261486 6775 obj_retry.go:303] Retry object setup: *v1.Pod 
openshift-kube-scheduler/openshift-kube-scheduler-crc\\\\nI1008 20:46:05.261493 6775 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-scheduler/openshift-kube-scheduler-crc\\\\nF1008 20:46:05.261500 6775 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:05Z is after 2025-08-24T17:21:41Z]\\\\nI1008 20:46:05.261505 6775 obj_retry\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T20:46:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gpzdw_openshift-ovn-kubernetes(cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://714cce1b094db0b40a
c3b788a76645047f4a51e231670b78128f0281b04d2793\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zqxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpzdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:07Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.025487 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-flswm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"609156f9-39b1-4330-83a2-eabf82f4228f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749545f8f4b6269a70b747fee79dc8d419b62054f507b0d819b63aa68c44bb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-flswm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:07Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.039407 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b822af4b-b157-4b05-9af4-7798315f365f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d615b49ade5de43393d40344c1b71733acedb541841b3ec34d6dd293e62f96c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07c33fe9c40fb9b53e940940c3fe2b8b63a94b0f867aa804d215cb3ba90d01c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569ba2e70b947eea1e531ab7e8f1ac2e3441ade593dd48910407df766217d87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f2d8af11793121a84b4559833f410bd59a8bb122d88da0d3b55d7dcbbf57a9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:07Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.055596 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc1b4218-68c0-4c48-a495-f8539e06d444\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://524972a79ac73180ccc655f37054721fb478bf263c711e814c9b49cc4f1a76ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67e4a88a6084b96798f461c3427f491577d74e6da859263a8c59545395cf029a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3133419a0643dc2be6a13a30c87b23a13965c59841d991db7ec80d5e53ca2840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8abb50fc1491bed6db7f23e79900b0223d3741a87a9a5545c144252a077353b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://8abb50fc1491bed6db7f23e79900b0223d3741a87a9a5545c144252a077353b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:44:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:44:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:44:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:07Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.058967 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.059015 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.059024 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.059039 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.059048 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:07Z","lastTransitionTime":"2025-10-08T20:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.068966 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:07Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.082596 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bfcvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3d00f76-e9e7-4a09-9be0-7ad67d4e8c0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T20:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09be2dce24bba0d88a36f2d85e6280e6806f9b6cf59ec3950513e976c97429e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea99045d738708978a1191d784c5b881295f87b519e23dfddc2ade3b324d600\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d546813e55b19da89cbd4a50f07dfb6de240a2c264124ff860084606573cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6491c155bc9da43adbae94cf6a3b5da34b0784370c7f56b83ceced6915c73fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d12e
f6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d12ef6b187a6c362f426740325c5a2155450c319ee2c1242bc2ee81c1f4da7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41bb33a1a6add0171cabb1e71f902052b0c731c0f5663843a50f71330c8bd87e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762c6a4fc7dc87a00466a43fe913c2744ed10c25e41db737716a11a0874c2758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T20:45:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T20:45:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jwbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T20:45:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bfcvh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T20:46:07Z is after 2025-08-24T17:21:41Z" Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.160755 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.160800 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.160811 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.160823 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.160833 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:07Z","lastTransitionTime":"2025-10-08T20:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.262692 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.262937 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.263011 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.263084 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.263182 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:07Z","lastTransitionTime":"2025-10-08T20:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.329860 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:46:07 crc kubenswrapper[4669]: E1008 20:46:07.330269 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ml9vv" podUID="f90eed21-8bc2-4723-b6be-a672669a36fb" Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.365779 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.365826 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.365837 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.365855 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.365868 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:07Z","lastTransitionTime":"2025-10-08T20:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.468774 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.469018 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.469096 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.469172 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.469254 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:07Z","lastTransitionTime":"2025-10-08T20:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.572354 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.572388 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.572398 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.572412 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.572423 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:07Z","lastTransitionTime":"2025-10-08T20:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.674797 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.674832 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.674843 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.674859 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.674869 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:07Z","lastTransitionTime":"2025-10-08T20:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.777257 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.777310 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.777331 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.777353 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.777371 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:07Z","lastTransitionTime":"2025-10-08T20:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.879986 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.880295 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.880363 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.880450 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.880523 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:07Z","lastTransitionTime":"2025-10-08T20:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.984000 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.984084 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.984107 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.984134 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:07 crc kubenswrapper[4669]: I1008 20:46:07.984154 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:07Z","lastTransitionTime":"2025-10-08T20:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:08 crc kubenswrapper[4669]: I1008 20:46:08.087588 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:08 crc kubenswrapper[4669]: I1008 20:46:08.087665 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:08 crc kubenswrapper[4669]: I1008 20:46:08.087688 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:08 crc kubenswrapper[4669]: I1008 20:46:08.087718 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:08 crc kubenswrapper[4669]: I1008 20:46:08.087741 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:08Z","lastTransitionTime":"2025-10-08T20:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:08 crc kubenswrapper[4669]: I1008 20:46:08.191024 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:08 crc kubenswrapper[4669]: I1008 20:46:08.191337 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:08 crc kubenswrapper[4669]: I1008 20:46:08.191423 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:08 crc kubenswrapper[4669]: I1008 20:46:08.191506 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:08 crc kubenswrapper[4669]: I1008 20:46:08.191619 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:08Z","lastTransitionTime":"2025-10-08T20:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:08 crc kubenswrapper[4669]: I1008 20:46:08.294301 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:08 crc kubenswrapper[4669]: I1008 20:46:08.294612 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:08 crc kubenswrapper[4669]: I1008 20:46:08.294692 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:08 crc kubenswrapper[4669]: I1008 20:46:08.294768 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:08 crc kubenswrapper[4669]: I1008 20:46:08.294935 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:08Z","lastTransitionTime":"2025-10-08T20:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:46:08 crc kubenswrapper[4669]: I1008 20:46:08.330329 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:46:08 crc kubenswrapper[4669]: I1008 20:46:08.330386 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:46:08 crc kubenswrapper[4669]: E1008 20:46:08.330797 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:46:08 crc kubenswrapper[4669]: I1008 20:46:08.330430 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:46:08 crc kubenswrapper[4669]: E1008 20:46:08.330917 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:46:08 crc kubenswrapper[4669]: E1008 20:46:08.331326 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:46:08 crc kubenswrapper[4669]: I1008 20:46:08.397734 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:08 crc kubenswrapper[4669]: I1008 20:46:08.397792 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:08 crc kubenswrapper[4669]: I1008 20:46:08.397802 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:08 crc kubenswrapper[4669]: I1008 20:46:08.397821 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:08 crc kubenswrapper[4669]: I1008 20:46:08.397835 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:08Z","lastTransitionTime":"2025-10-08T20:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:08 crc kubenswrapper[4669]: I1008 20:46:08.501152 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:08 crc kubenswrapper[4669]: I1008 20:46:08.501232 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:08 crc kubenswrapper[4669]: I1008 20:46:08.501260 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:08 crc kubenswrapper[4669]: I1008 20:46:08.501291 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:08 crc kubenswrapper[4669]: I1008 20:46:08.501313 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:08Z","lastTransitionTime":"2025-10-08T20:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:08 crc kubenswrapper[4669]: I1008 20:46:08.604645 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:08 crc kubenswrapper[4669]: I1008 20:46:08.604708 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:08 crc kubenswrapper[4669]: I1008 20:46:08.604727 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:08 crc kubenswrapper[4669]: I1008 20:46:08.604752 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:08 crc kubenswrapper[4669]: I1008 20:46:08.604771 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:08Z","lastTransitionTime":"2025-10-08T20:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:08 crc kubenswrapper[4669]: I1008 20:46:08.707933 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:08 crc kubenswrapper[4669]: I1008 20:46:08.708028 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:08 crc kubenswrapper[4669]: I1008 20:46:08.708046 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:08 crc kubenswrapper[4669]: I1008 20:46:08.708072 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:08 crc kubenswrapper[4669]: I1008 20:46:08.708089 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:08Z","lastTransitionTime":"2025-10-08T20:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:08 crc kubenswrapper[4669]: I1008 20:46:08.811185 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:08 crc kubenswrapper[4669]: I1008 20:46:08.811256 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:08 crc kubenswrapper[4669]: I1008 20:46:08.811273 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:08 crc kubenswrapper[4669]: I1008 20:46:08.811298 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:08 crc kubenswrapper[4669]: I1008 20:46:08.811316 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:08Z","lastTransitionTime":"2025-10-08T20:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:08 crc kubenswrapper[4669]: I1008 20:46:08.913928 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:08 crc kubenswrapper[4669]: I1008 20:46:08.913969 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:08 crc kubenswrapper[4669]: I1008 20:46:08.913981 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:08 crc kubenswrapper[4669]: I1008 20:46:08.913998 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:08 crc kubenswrapper[4669]: I1008 20:46:08.914009 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:08Z","lastTransitionTime":"2025-10-08T20:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:09 crc kubenswrapper[4669]: I1008 20:46:09.016506 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:09 crc kubenswrapper[4669]: I1008 20:46:09.016603 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:09 crc kubenswrapper[4669]: I1008 20:46:09.016619 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:09 crc kubenswrapper[4669]: I1008 20:46:09.016640 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:09 crc kubenswrapper[4669]: I1008 20:46:09.016653 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:09Z","lastTransitionTime":"2025-10-08T20:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:09 crc kubenswrapper[4669]: I1008 20:46:09.119768 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:09 crc kubenswrapper[4669]: I1008 20:46:09.119855 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:09 crc kubenswrapper[4669]: I1008 20:46:09.119872 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:09 crc kubenswrapper[4669]: I1008 20:46:09.119895 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:09 crc kubenswrapper[4669]: I1008 20:46:09.119912 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:09Z","lastTransitionTime":"2025-10-08T20:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:09 crc kubenswrapper[4669]: I1008 20:46:09.223076 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:09 crc kubenswrapper[4669]: I1008 20:46:09.223108 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:09 crc kubenswrapper[4669]: I1008 20:46:09.223120 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:09 crc kubenswrapper[4669]: I1008 20:46:09.223138 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:09 crc kubenswrapper[4669]: I1008 20:46:09.223153 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:09Z","lastTransitionTime":"2025-10-08T20:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:09 crc kubenswrapper[4669]: I1008 20:46:09.325205 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:09 crc kubenswrapper[4669]: I1008 20:46:09.325236 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:09 crc kubenswrapper[4669]: I1008 20:46:09.325245 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:09 crc kubenswrapper[4669]: I1008 20:46:09.325258 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:09 crc kubenswrapper[4669]: I1008 20:46:09.325267 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:09Z","lastTransitionTime":"2025-10-08T20:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:46:09 crc kubenswrapper[4669]: I1008 20:46:09.329765 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:46:09 crc kubenswrapper[4669]: E1008 20:46:09.329864 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ml9vv" podUID="f90eed21-8bc2-4723-b6be-a672669a36fb" Oct 08 20:46:09 crc kubenswrapper[4669]: I1008 20:46:09.427791 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:09 crc kubenswrapper[4669]: I1008 20:46:09.427844 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:09 crc kubenswrapper[4669]: I1008 20:46:09.427857 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:09 crc kubenswrapper[4669]: I1008 20:46:09.427875 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:09 crc kubenswrapper[4669]: I1008 20:46:09.427888 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:09Z","lastTransitionTime":"2025-10-08T20:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:09 crc kubenswrapper[4669]: I1008 20:46:09.530596 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:09 crc kubenswrapper[4669]: I1008 20:46:09.530669 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:09 crc kubenswrapper[4669]: I1008 20:46:09.530686 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:09 crc kubenswrapper[4669]: I1008 20:46:09.530702 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:09 crc kubenswrapper[4669]: I1008 20:46:09.530714 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:09Z","lastTransitionTime":"2025-10-08T20:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:09 crc kubenswrapper[4669]: I1008 20:46:09.633513 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:09 crc kubenswrapper[4669]: I1008 20:46:09.633609 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:09 crc kubenswrapper[4669]: I1008 20:46:09.633637 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:09 crc kubenswrapper[4669]: I1008 20:46:09.633667 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:09 crc kubenswrapper[4669]: I1008 20:46:09.633692 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:09Z","lastTransitionTime":"2025-10-08T20:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:09 crc kubenswrapper[4669]: I1008 20:46:09.737042 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:09 crc kubenswrapper[4669]: I1008 20:46:09.737126 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:09 crc kubenswrapper[4669]: I1008 20:46:09.737152 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:09 crc kubenswrapper[4669]: I1008 20:46:09.737183 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:09 crc kubenswrapper[4669]: I1008 20:46:09.737207 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:09Z","lastTransitionTime":"2025-10-08T20:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:09 crc kubenswrapper[4669]: I1008 20:46:09.839477 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:09 crc kubenswrapper[4669]: I1008 20:46:09.839521 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:09 crc kubenswrapper[4669]: I1008 20:46:09.839554 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:09 crc kubenswrapper[4669]: I1008 20:46:09.839569 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:09 crc kubenswrapper[4669]: I1008 20:46:09.839581 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:09Z","lastTransitionTime":"2025-10-08T20:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:09 crc kubenswrapper[4669]: I1008 20:46:09.942634 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:09 crc kubenswrapper[4669]: I1008 20:46:09.942713 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:09 crc kubenswrapper[4669]: I1008 20:46:09.942739 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:09 crc kubenswrapper[4669]: I1008 20:46:09.942769 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:09 crc kubenswrapper[4669]: I1008 20:46:09.942794 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:09Z","lastTransitionTime":"2025-10-08T20:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.045755 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.045818 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.045840 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.045896 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.045909 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:10Z","lastTransitionTime":"2025-10-08T20:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.148892 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.149696 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.149734 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.149761 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.149778 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:10Z","lastTransitionTime":"2025-10-08T20:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.253112 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.253180 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.253202 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.253232 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.253255 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:10Z","lastTransitionTime":"2025-10-08T20:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.330076 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.330128 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.330076 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:46:10 crc kubenswrapper[4669]: E1008 20:46:10.330315 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:46:10 crc kubenswrapper[4669]: E1008 20:46:10.330425 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:46:10 crc kubenswrapper[4669]: E1008 20:46:10.330567 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.355564 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.355620 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.355642 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.355660 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.355673 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:10Z","lastTransitionTime":"2025-10-08T20:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.457947 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.457990 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.458000 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.458015 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.458027 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:10Z","lastTransitionTime":"2025-10-08T20:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.561264 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.561332 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.561350 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.561375 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.561434 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:10Z","lastTransitionTime":"2025-10-08T20:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.637343 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.637402 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.637420 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.637443 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.637466 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:10Z","lastTransitionTime":"2025-10-08T20:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.669782 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.669836 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.669853 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.669885 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.669902 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T20:46:10Z","lastTransitionTime":"2025-10-08T20:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.704531 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-8n94k"] Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.705193 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8n94k" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.709151 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.709165 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.710089 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.710340 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.748344 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-flswm" podStartSLOduration=68.748315195 podStartE2EDuration="1m8.748315195s" podCreationTimestamp="2025-10-08 20:45:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:46:10.728105691 +0000 UTC m=+90.420916404" watchObservedRunningTime="2025-10-08 20:46:10.748315195 +0000 UTC m=+90.441125908" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.773254 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=68.77323158 podStartE2EDuration="1m8.77323158s" podCreationTimestamp="2025-10-08 20:45:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:46:10.749021974 +0000 UTC m=+90.441832687" watchObservedRunningTime="2025-10-08 20:46:10.77323158 +0000 UTC 
m=+90.466042323" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.773602 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=36.773592539 podStartE2EDuration="36.773592539s" podCreationTimestamp="2025-10-08 20:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:46:10.773496107 +0000 UTC m=+90.466306820" watchObservedRunningTime="2025-10-08 20:46:10.773592539 +0000 UTC m=+90.466403222" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.799117 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ded9ac9a-6a76-4221-835d-4fbc9b114f53-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-8n94k\" (UID: \"ded9ac9a-6a76-4221-835d-4fbc9b114f53\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8n94k" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.799205 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/ded9ac9a-6a76-4221-835d-4fbc9b114f53-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-8n94k\" (UID: \"ded9ac9a-6a76-4221-835d-4fbc9b114f53\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8n94k" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.799281 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ded9ac9a-6a76-4221-835d-4fbc9b114f53-service-ca\") pod \"cluster-version-operator-5c965bbfc6-8n94k\" (UID: \"ded9ac9a-6a76-4221-835d-4fbc9b114f53\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8n94k" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.799334 
4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/ded9ac9a-6a76-4221-835d-4fbc9b114f53-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-8n94k\" (UID: \"ded9ac9a-6a76-4221-835d-4fbc9b114f53\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8n94k" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.799467 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ded9ac9a-6a76-4221-835d-4fbc9b114f53-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-8n94k\" (UID: \"ded9ac9a-6a76-4221-835d-4fbc9b114f53\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8n94k" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.849012 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-bfcvh" podStartSLOduration=68.848983249 podStartE2EDuration="1m8.848983249s" podCreationTimestamp="2025-10-08 20:45:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:46:10.81641391 +0000 UTC m=+90.509224613" watchObservedRunningTime="2025-10-08 20:46:10.848983249 +0000 UTC m=+90.541793942" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.869977 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=4.869953003 podStartE2EDuration="4.869953003s" podCreationTimestamp="2025-10-08 20:46:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:46:10.869281076 +0000 UTC m=+90.562091749" watchObservedRunningTime="2025-10-08 20:46:10.869953003 
+0000 UTC m=+90.562763696" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.870187 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-klx9r" podStartSLOduration=68.870179439 podStartE2EDuration="1m8.870179439s" podCreationTimestamp="2025-10-08 20:45:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:46:10.849575945 +0000 UTC m=+90.542386648" watchObservedRunningTime="2025-10-08 20:46:10.870179439 +0000 UTC m=+90.562990132" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.895591 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=68.895574055 podStartE2EDuration="1m8.895574055s" podCreationTimestamp="2025-10-08 20:45:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:46:10.894843677 +0000 UTC m=+90.587654360" watchObservedRunningTime="2025-10-08 20:46:10.895574055 +0000 UTC m=+90.588384728" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.900676 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ded9ac9a-6a76-4221-835d-4fbc9b114f53-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-8n94k\" (UID: \"ded9ac9a-6a76-4221-835d-4fbc9b114f53\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8n94k" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.900729 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ded9ac9a-6a76-4221-835d-4fbc9b114f53-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-8n94k\" (UID: \"ded9ac9a-6a76-4221-835d-4fbc9b114f53\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8n94k" 
Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.900753 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/ded9ac9a-6a76-4221-835d-4fbc9b114f53-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-8n94k\" (UID: \"ded9ac9a-6a76-4221-835d-4fbc9b114f53\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8n94k" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.900810 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ded9ac9a-6a76-4221-835d-4fbc9b114f53-service-ca\") pod \"cluster-version-operator-5c965bbfc6-8n94k\" (UID: \"ded9ac9a-6a76-4221-835d-4fbc9b114f53\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8n94k" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.900828 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/ded9ac9a-6a76-4221-835d-4fbc9b114f53-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-8n94k\" (UID: \"ded9ac9a-6a76-4221-835d-4fbc9b114f53\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8n94k" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.900876 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/ded9ac9a-6a76-4221-835d-4fbc9b114f53-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-8n94k\" (UID: \"ded9ac9a-6a76-4221-835d-4fbc9b114f53\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8n94k" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.900933 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/ded9ac9a-6a76-4221-835d-4fbc9b114f53-etc-ssl-certs\") pod 
\"cluster-version-operator-5c965bbfc6-8n94k\" (UID: \"ded9ac9a-6a76-4221-835d-4fbc9b114f53\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8n94k" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.901869 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ded9ac9a-6a76-4221-835d-4fbc9b114f53-service-ca\") pod \"cluster-version-operator-5c965bbfc6-8n94k\" (UID: \"ded9ac9a-6a76-4221-835d-4fbc9b114f53\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8n94k" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.906288 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ded9ac9a-6a76-4221-835d-4fbc9b114f53-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-8n94k\" (UID: \"ded9ac9a-6a76-4221-835d-4fbc9b114f53\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8n94k" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.920220 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ded9ac9a-6a76-4221-835d-4fbc9b114f53-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-8n94k\" (UID: \"ded9ac9a-6a76-4221-835d-4fbc9b114f53\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8n94k" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.930568 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bl6pv" podStartSLOduration=67.930519025 podStartE2EDuration="1m7.930519025s" podCreationTimestamp="2025-10-08 20:45:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:46:10.93034357 +0000 UTC m=+90.623154243" watchObservedRunningTime="2025-10-08 20:46:10.930519025 +0000 
UTC m=+90.623329708" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.980178 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-zcf2d" podStartSLOduration=68.980162649 podStartE2EDuration="1m8.980162649s" podCreationTimestamp="2025-10-08 20:45:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:46:10.97940596 +0000 UTC m=+90.672216633" watchObservedRunningTime="2025-10-08 20:46:10.980162649 +0000 UTC m=+90.672973322" Oct 08 20:46:10 crc kubenswrapper[4669]: I1008 20:46:10.993765 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podStartSLOduration=68.993741215 podStartE2EDuration="1m8.993741215s" podCreationTimestamp="2025-10-08 20:45:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:46:10.993040557 +0000 UTC m=+90.685851270" watchObservedRunningTime="2025-10-08 20:46:10.993741215 +0000 UTC m=+90.686551908" Oct 08 20:46:11 crc kubenswrapper[4669]: I1008 20:46:11.022090 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=68.022065456 podStartE2EDuration="1m8.022065456s" podCreationTimestamp="2025-10-08 20:45:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:46:11.009461176 +0000 UTC m=+90.702271859" watchObservedRunningTime="2025-10-08 20:46:11.022065456 +0000 UTC m=+90.714876129" Oct 08 20:46:11 crc kubenswrapper[4669]: I1008 20:46:11.028365 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8n94k" Oct 08 20:46:11 crc kubenswrapper[4669]: W1008 20:46:11.040455 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podded9ac9a_6a76_4221_835d_4fbc9b114f53.slice/crio-1d60c5318243ea14a2e825124ebcd519722753eb348d1c7da3f1721bbaa9df5d WatchSource:0}: Error finding container 1d60c5318243ea14a2e825124ebcd519722753eb348d1c7da3f1721bbaa9df5d: Status 404 returned error can't find the container with id 1d60c5318243ea14a2e825124ebcd519722753eb348d1c7da3f1721bbaa9df5d Oct 08 20:46:11 crc kubenswrapper[4669]: I1008 20:46:11.330598 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:46:11 crc kubenswrapper[4669]: E1008 20:46:11.332103 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ml9vv" podUID="f90eed21-8bc2-4723-b6be-a672669a36fb" Oct 08 20:46:11 crc kubenswrapper[4669]: I1008 20:46:11.838508 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8n94k" event={"ID":"ded9ac9a-6a76-4221-835d-4fbc9b114f53","Type":"ContainerStarted","Data":"a5b52aa36660a9a0ac53abb690de019493bca461a9c344bc5bdc8170e29b37f2"} Oct 08 20:46:11 crc kubenswrapper[4669]: I1008 20:46:11.838586 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8n94k" event={"ID":"ded9ac9a-6a76-4221-835d-4fbc9b114f53","Type":"ContainerStarted","Data":"1d60c5318243ea14a2e825124ebcd519722753eb348d1c7da3f1721bbaa9df5d"} Oct 08 20:46:11 crc kubenswrapper[4669]: I1008 20:46:11.857352 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8n94k" podStartSLOduration=69.857327113 podStartE2EDuration="1m9.857327113s" podCreationTimestamp="2025-10-08 20:45:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:46:11.855273071 +0000 UTC m=+91.548083754" watchObservedRunningTime="2025-10-08 20:46:11.857327113 +0000 UTC m=+91.550137826" Oct 08 20:46:12 crc kubenswrapper[4669]: I1008 20:46:12.330129 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:46:12 crc kubenswrapper[4669]: I1008 20:46:12.330140 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:46:12 crc kubenswrapper[4669]: E1008 20:46:12.330311 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:46:12 crc kubenswrapper[4669]: E1008 20:46:12.330391 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:46:12 crc kubenswrapper[4669]: I1008 20:46:12.330158 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:46:12 crc kubenswrapper[4669]: E1008 20:46:12.330555 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:46:13 crc kubenswrapper[4669]: I1008 20:46:13.330032 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:46:13 crc kubenswrapper[4669]: E1008 20:46:13.330255 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ml9vv" podUID="f90eed21-8bc2-4723-b6be-a672669a36fb" Oct 08 20:46:14 crc kubenswrapper[4669]: I1008 20:46:14.330416 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:46:14 crc kubenswrapper[4669]: I1008 20:46:14.330561 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:46:14 crc kubenswrapper[4669]: E1008 20:46:14.330591 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:46:14 crc kubenswrapper[4669]: I1008 20:46:14.330416 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:46:14 crc kubenswrapper[4669]: E1008 20:46:14.330701 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:46:14 crc kubenswrapper[4669]: E1008 20:46:14.330877 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:46:15 crc kubenswrapper[4669]: I1008 20:46:15.330241 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:46:15 crc kubenswrapper[4669]: E1008 20:46:15.330355 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ml9vv" podUID="f90eed21-8bc2-4723-b6be-a672669a36fb" Oct 08 20:46:16 crc kubenswrapper[4669]: I1008 20:46:16.330601 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:46:16 crc kubenswrapper[4669]: E1008 20:46:16.330722 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:46:16 crc kubenswrapper[4669]: I1008 20:46:16.330781 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:46:16 crc kubenswrapper[4669]: E1008 20:46:16.331143 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:46:16 crc kubenswrapper[4669]: I1008 20:46:16.331177 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:46:16 crc kubenswrapper[4669]: E1008 20:46:16.331329 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:46:17 crc kubenswrapper[4669]: I1008 20:46:17.330286 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:46:17 crc kubenswrapper[4669]: E1008 20:46:17.330441 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ml9vv" podUID="f90eed21-8bc2-4723-b6be-a672669a36fb" Oct 08 20:46:18 crc kubenswrapper[4669]: I1008 20:46:18.330077 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:46:18 crc kubenswrapper[4669]: I1008 20:46:18.330131 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:46:18 crc kubenswrapper[4669]: I1008 20:46:18.330108 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:46:18 crc kubenswrapper[4669]: E1008 20:46:18.330266 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:46:18 crc kubenswrapper[4669]: E1008 20:46:18.330511 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:46:18 crc kubenswrapper[4669]: E1008 20:46:18.330741 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:46:19 crc kubenswrapper[4669]: I1008 20:46:19.330844 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:46:19 crc kubenswrapper[4669]: E1008 20:46:19.331056 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ml9vv" podUID="f90eed21-8bc2-4723-b6be-a672669a36fb" Oct 08 20:46:20 crc kubenswrapper[4669]: I1008 20:46:20.330564 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:46:20 crc kubenswrapper[4669]: I1008 20:46:20.330667 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:46:20 crc kubenswrapper[4669]: I1008 20:46:20.330739 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:46:20 crc kubenswrapper[4669]: E1008 20:46:20.330728 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:46:20 crc kubenswrapper[4669]: E1008 20:46:20.332895 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:46:20 crc kubenswrapper[4669]: E1008 20:46:20.333344 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:46:21 crc kubenswrapper[4669]: I1008 20:46:21.330069 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:46:21 crc kubenswrapper[4669]: E1008 20:46:21.331438 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ml9vv" podUID="f90eed21-8bc2-4723-b6be-a672669a36fb" Oct 08 20:46:21 crc kubenswrapper[4669]: I1008 20:46:21.714502 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f90eed21-8bc2-4723-b6be-a672669a36fb-metrics-certs\") pod \"network-metrics-daemon-ml9vv\" (UID: \"f90eed21-8bc2-4723-b6be-a672669a36fb\") " pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:46:21 crc kubenswrapper[4669]: E1008 20:46:21.714769 4669 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 20:46:21 crc kubenswrapper[4669]: E1008 20:46:21.714843 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f90eed21-8bc2-4723-b6be-a672669a36fb-metrics-certs podName:f90eed21-8bc2-4723-b6be-a672669a36fb nodeName:}" failed. No retries permitted until 2025-10-08 20:47:25.714822994 +0000 UTC m=+165.407633657 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f90eed21-8bc2-4723-b6be-a672669a36fb-metrics-certs") pod "network-metrics-daemon-ml9vv" (UID: "f90eed21-8bc2-4723-b6be-a672669a36fb") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 20:46:22 crc kubenswrapper[4669]: I1008 20:46:22.329740 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:46:22 crc kubenswrapper[4669]: I1008 20:46:22.329806 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:46:22 crc kubenswrapper[4669]: I1008 20:46:22.329858 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:46:22 crc kubenswrapper[4669]: E1008 20:46:22.329923 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:46:22 crc kubenswrapper[4669]: E1008 20:46:22.330158 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:46:22 crc kubenswrapper[4669]: E1008 20:46:22.330965 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:46:22 crc kubenswrapper[4669]: I1008 20:46:22.331570 4669 scope.go:117] "RemoveContainer" containerID="a36045ff8e9ec3ba7cb5e5d4cc2c7ac3e970c45098c3bc2fd385134fecaa8db4" Oct 08 20:46:22 crc kubenswrapper[4669]: E1008 20:46:22.331830 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-gpzdw_openshift-ovn-kubernetes(cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" Oct 08 20:46:23 crc kubenswrapper[4669]: I1008 20:46:23.331102 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:46:23 crc kubenswrapper[4669]: E1008 20:46:23.331280 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ml9vv" podUID="f90eed21-8bc2-4723-b6be-a672669a36fb" Oct 08 20:46:24 crc kubenswrapper[4669]: I1008 20:46:24.329778 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:46:24 crc kubenswrapper[4669]: I1008 20:46:24.329830 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:46:24 crc kubenswrapper[4669]: I1008 20:46:24.329793 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:46:24 crc kubenswrapper[4669]: E1008 20:46:24.329952 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:46:24 crc kubenswrapper[4669]: E1008 20:46:24.330033 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:46:24 crc kubenswrapper[4669]: E1008 20:46:24.330100 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:46:25 crc kubenswrapper[4669]: I1008 20:46:25.330769 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:46:25 crc kubenswrapper[4669]: E1008 20:46:25.331218 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ml9vv" podUID="f90eed21-8bc2-4723-b6be-a672669a36fb" Oct 08 20:46:26 crc kubenswrapper[4669]: I1008 20:46:26.329864 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:46:26 crc kubenswrapper[4669]: I1008 20:46:26.329954 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:46:26 crc kubenswrapper[4669]: I1008 20:46:26.330028 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:46:26 crc kubenswrapper[4669]: E1008 20:46:26.330191 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:46:26 crc kubenswrapper[4669]: E1008 20:46:26.330376 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:46:26 crc kubenswrapper[4669]: E1008 20:46:26.330626 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:46:27 crc kubenswrapper[4669]: I1008 20:46:27.329790 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:46:27 crc kubenswrapper[4669]: E1008 20:46:27.330500 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ml9vv" podUID="f90eed21-8bc2-4723-b6be-a672669a36fb" Oct 08 20:46:28 crc kubenswrapper[4669]: I1008 20:46:28.329864 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:46:28 crc kubenswrapper[4669]: I1008 20:46:28.329907 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:46:28 crc kubenswrapper[4669]: I1008 20:46:28.329957 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:46:28 crc kubenswrapper[4669]: E1008 20:46:28.330003 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:46:28 crc kubenswrapper[4669]: E1008 20:46:28.330131 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:46:28 crc kubenswrapper[4669]: E1008 20:46:28.330167 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:46:29 crc kubenswrapper[4669]: I1008 20:46:29.330035 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:46:29 crc kubenswrapper[4669]: E1008 20:46:29.330173 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ml9vv" podUID="f90eed21-8bc2-4723-b6be-a672669a36fb" Oct 08 20:46:30 crc kubenswrapper[4669]: I1008 20:46:30.330127 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:46:30 crc kubenswrapper[4669]: I1008 20:46:30.330249 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:46:30 crc kubenswrapper[4669]: I1008 20:46:30.330151 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:46:30 crc kubenswrapper[4669]: E1008 20:46:30.330329 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:46:30 crc kubenswrapper[4669]: E1008 20:46:30.330460 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:46:30 crc kubenswrapper[4669]: E1008 20:46:30.330483 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:46:31 crc kubenswrapper[4669]: I1008 20:46:31.330800 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:46:31 crc kubenswrapper[4669]: E1008 20:46:31.333948 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ml9vv" podUID="f90eed21-8bc2-4723-b6be-a672669a36fb" Oct 08 20:46:32 crc kubenswrapper[4669]: I1008 20:46:32.330289 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:46:32 crc kubenswrapper[4669]: I1008 20:46:32.330917 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:46:32 crc kubenswrapper[4669]: I1008 20:46:32.330769 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:46:32 crc kubenswrapper[4669]: E1008 20:46:32.331212 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:46:32 crc kubenswrapper[4669]: E1008 20:46:32.331332 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:46:32 crc kubenswrapper[4669]: E1008 20:46:32.331507 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:46:33 crc kubenswrapper[4669]: I1008 20:46:33.330679 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:46:33 crc kubenswrapper[4669]: E1008 20:46:33.331055 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ml9vv" podUID="f90eed21-8bc2-4723-b6be-a672669a36fb" Oct 08 20:46:33 crc kubenswrapper[4669]: I1008 20:46:33.331972 4669 scope.go:117] "RemoveContainer" containerID="a36045ff8e9ec3ba7cb5e5d4cc2c7ac3e970c45098c3bc2fd385134fecaa8db4" Oct 08 20:46:33 crc kubenswrapper[4669]: E1008 20:46:33.332135 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-gpzdw_openshift-ovn-kubernetes(cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" Oct 08 20:46:34 crc kubenswrapper[4669]: I1008 20:46:34.329861 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:46:34 crc kubenswrapper[4669]: I1008 20:46:34.329876 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:46:34 crc kubenswrapper[4669]: I1008 20:46:34.329886 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:46:34 crc kubenswrapper[4669]: E1008 20:46:34.330409 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:46:34 crc kubenswrapper[4669]: E1008 20:46:34.330590 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:46:34 crc kubenswrapper[4669]: E1008 20:46:34.330706 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:46:35 crc kubenswrapper[4669]: I1008 20:46:35.330250 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:46:35 crc kubenswrapper[4669]: E1008 20:46:35.330446 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ml9vv" podUID="f90eed21-8bc2-4723-b6be-a672669a36fb" Oct 08 20:46:36 crc kubenswrapper[4669]: I1008 20:46:36.329951 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:46:36 crc kubenswrapper[4669]: I1008 20:46:36.329983 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:46:36 crc kubenswrapper[4669]: I1008 20:46:36.330107 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:46:36 crc kubenswrapper[4669]: E1008 20:46:36.330785 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:46:36 crc kubenswrapper[4669]: E1008 20:46:36.331105 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:46:36 crc kubenswrapper[4669]: E1008 20:46:36.330578 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:46:37 crc kubenswrapper[4669]: I1008 20:46:37.330141 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:46:37 crc kubenswrapper[4669]: E1008 20:46:37.330756 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ml9vv" podUID="f90eed21-8bc2-4723-b6be-a672669a36fb" Oct 08 20:46:37 crc kubenswrapper[4669]: I1008 20:46:37.927126 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-klx9r_2433400c-98f8-490f-a566-00a330a738fe/kube-multus/1.log" Oct 08 20:46:37 crc kubenswrapper[4669]: I1008 20:46:37.928247 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-klx9r_2433400c-98f8-490f-a566-00a330a738fe/kube-multus/0.log" Oct 08 20:46:37 crc kubenswrapper[4669]: I1008 20:46:37.928335 4669 generic.go:334] "Generic (PLEG): container finished" podID="2433400c-98f8-490f-a566-00a330a738fe" containerID="2b65ccfb3651377dd7136e083c72c94dbdef0e945e796bf851e7ba8e53aafd12" exitCode=1 Oct 08 20:46:37 crc kubenswrapper[4669]: I1008 20:46:37.928383 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-klx9r" event={"ID":"2433400c-98f8-490f-a566-00a330a738fe","Type":"ContainerDied","Data":"2b65ccfb3651377dd7136e083c72c94dbdef0e945e796bf851e7ba8e53aafd12"} Oct 08 20:46:37 crc kubenswrapper[4669]: I1008 20:46:37.928429 4669 scope.go:117] "RemoveContainer" containerID="863b0630ebde7534e93ebf2952dab729566760278539e87efa4412389803c5ee" Oct 08 20:46:37 crc kubenswrapper[4669]: I1008 20:46:37.929012 4669 scope.go:117] "RemoveContainer" containerID="2b65ccfb3651377dd7136e083c72c94dbdef0e945e796bf851e7ba8e53aafd12" Oct 08 20:46:37 crc kubenswrapper[4669]: E1008 20:46:37.929370 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-klx9r_openshift-multus(2433400c-98f8-490f-a566-00a330a738fe)\"" pod="openshift-multus/multus-klx9r" podUID="2433400c-98f8-490f-a566-00a330a738fe" Oct 08 20:46:38 crc kubenswrapper[4669]: I1008 20:46:38.330390 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:46:38 crc kubenswrapper[4669]: I1008 20:46:38.330413 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:46:38 crc kubenswrapper[4669]: E1008 20:46:38.331126 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:46:38 crc kubenswrapper[4669]: I1008 20:46:38.330438 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:46:38 crc kubenswrapper[4669]: E1008 20:46:38.331232 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:46:38 crc kubenswrapper[4669]: E1008 20:46:38.331556 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:46:38 crc kubenswrapper[4669]: I1008 20:46:38.934129 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-klx9r_2433400c-98f8-490f-a566-00a330a738fe/kube-multus/1.log" Oct 08 20:46:39 crc kubenswrapper[4669]: I1008 20:46:39.330251 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:46:39 crc kubenswrapper[4669]: E1008 20:46:39.330428 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ml9vv" podUID="f90eed21-8bc2-4723-b6be-a672669a36fb" Oct 08 20:46:40 crc kubenswrapper[4669]: I1008 20:46:40.330160 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:46:40 crc kubenswrapper[4669]: I1008 20:46:40.330168 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:46:40 crc kubenswrapper[4669]: E1008 20:46:40.330306 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:46:40 crc kubenswrapper[4669]: E1008 20:46:40.330416 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:46:40 crc kubenswrapper[4669]: I1008 20:46:40.330191 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:46:40 crc kubenswrapper[4669]: E1008 20:46:40.330602 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:46:41 crc kubenswrapper[4669]: E1008 20:46:41.279946 4669 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Oct 08 20:46:41 crc kubenswrapper[4669]: I1008 20:46:41.331040 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:46:41 crc kubenswrapper[4669]: E1008 20:46:41.335201 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ml9vv" podUID="f90eed21-8bc2-4723-b6be-a672669a36fb" Oct 08 20:46:41 crc kubenswrapper[4669]: E1008 20:46:41.429469 4669 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 08 20:46:42 crc kubenswrapper[4669]: I1008 20:46:42.330910 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:46:42 crc kubenswrapper[4669]: I1008 20:46:42.330993 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:46:42 crc kubenswrapper[4669]: I1008 20:46:42.331104 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:46:42 crc kubenswrapper[4669]: E1008 20:46:42.331662 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:46:42 crc kubenswrapper[4669]: E1008 20:46:42.331787 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:46:42 crc kubenswrapper[4669]: E1008 20:46:42.331959 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:46:43 crc kubenswrapper[4669]: I1008 20:46:43.330377 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:46:43 crc kubenswrapper[4669]: E1008 20:46:43.330569 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ml9vv" podUID="f90eed21-8bc2-4723-b6be-a672669a36fb" Oct 08 20:46:44 crc kubenswrapper[4669]: I1008 20:46:44.330746 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:46:44 crc kubenswrapper[4669]: I1008 20:46:44.330736 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:46:44 crc kubenswrapper[4669]: E1008 20:46:44.330962 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:46:44 crc kubenswrapper[4669]: E1008 20:46:44.331121 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:46:44 crc kubenswrapper[4669]: I1008 20:46:44.330780 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:46:44 crc kubenswrapper[4669]: E1008 20:46:44.331331 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:46:45 crc kubenswrapper[4669]: I1008 20:46:45.330455 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:46:45 crc kubenswrapper[4669]: E1008 20:46:45.330700 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ml9vv" podUID="f90eed21-8bc2-4723-b6be-a672669a36fb" Oct 08 20:46:46 crc kubenswrapper[4669]: I1008 20:46:46.330004 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:46:46 crc kubenswrapper[4669]: I1008 20:46:46.330058 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:46:46 crc kubenswrapper[4669]: I1008 20:46:46.330117 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:46:46 crc kubenswrapper[4669]: E1008 20:46:46.330149 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:46:46 crc kubenswrapper[4669]: E1008 20:46:46.330241 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:46:46 crc kubenswrapper[4669]: E1008 20:46:46.330342 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:46:46 crc kubenswrapper[4669]: E1008 20:46:46.432047 4669 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 08 20:46:47 crc kubenswrapper[4669]: I1008 20:46:47.330280 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:46:47 crc kubenswrapper[4669]: E1008 20:46:47.330429 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ml9vv" podUID="f90eed21-8bc2-4723-b6be-a672669a36fb" Oct 08 20:46:48 crc kubenswrapper[4669]: I1008 20:46:48.330757 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:46:48 crc kubenswrapper[4669]: I1008 20:46:48.330844 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:46:48 crc kubenswrapper[4669]: I1008 20:46:48.330856 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:46:48 crc kubenswrapper[4669]: E1008 20:46:48.330971 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:46:48 crc kubenswrapper[4669]: E1008 20:46:48.331242 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:46:48 crc kubenswrapper[4669]: I1008 20:46:48.331332 4669 scope.go:117] "RemoveContainer" containerID="2b65ccfb3651377dd7136e083c72c94dbdef0e945e796bf851e7ba8e53aafd12" Oct 08 20:46:48 crc kubenswrapper[4669]: E1008 20:46:48.331345 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:46:48 crc kubenswrapper[4669]: I1008 20:46:48.332741 4669 scope.go:117] "RemoveContainer" containerID="a36045ff8e9ec3ba7cb5e5d4cc2c7ac3e970c45098c3bc2fd385134fecaa8db4" Oct 08 20:46:48 crc kubenswrapper[4669]: I1008 20:46:48.975687 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-klx9r_2433400c-98f8-490f-a566-00a330a738fe/kube-multus/1.log" Oct 08 20:46:48 crc kubenswrapper[4669]: I1008 20:46:48.975762 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-klx9r" event={"ID":"2433400c-98f8-490f-a566-00a330a738fe","Type":"ContainerStarted","Data":"75f5b6d8d782c36aa2c69c94e49c4f5f2bcd8290971bfccd34c4de96d2fa34a3"} Oct 08 20:46:48 crc kubenswrapper[4669]: I1008 20:46:48.978381 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gpzdw_cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7/ovnkube-controller/3.log" Oct 08 20:46:48 crc kubenswrapper[4669]: I1008 20:46:48.981561 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" event={"ID":"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7","Type":"ContainerStarted","Data":"698cf19fa72b0c3d6db6643a79a42a1529cff7d8d9e6d25b29feff483425122f"} Oct 08 20:46:48 crc kubenswrapper[4669]: I1008 20:46:48.982046 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:46:49 crc kubenswrapper[4669]: I1008 20:46:49.033677 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" podStartSLOduration=107.033654046 podStartE2EDuration="1m47.033654046s" podCreationTimestamp="2025-10-08 20:45:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-08 20:46:49.032894984 +0000 UTC m=+128.725705677" watchObservedRunningTime="2025-10-08 20:46:49.033654046 +0000 UTC m=+128.726464719" Oct 08 20:46:49 crc kubenswrapper[4669]: I1008 20:46:49.106651 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ml9vv"] Oct 08 20:46:49 crc kubenswrapper[4669]: I1008 20:46:49.106929 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:46:49 crc kubenswrapper[4669]: E1008 20:46:49.107140 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ml9vv" podUID="f90eed21-8bc2-4723-b6be-a672669a36fb" Oct 08 20:46:50 crc kubenswrapper[4669]: I1008 20:46:50.330696 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:46:50 crc kubenswrapper[4669]: I1008 20:46:50.330803 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:46:50 crc kubenswrapper[4669]: I1008 20:46:50.330973 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:46:50 crc kubenswrapper[4669]: E1008 20:46:50.330960 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ml9vv" podUID="f90eed21-8bc2-4723-b6be-a672669a36fb" Oct 08 20:46:50 crc kubenswrapper[4669]: I1008 20:46:50.331015 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:46:50 crc kubenswrapper[4669]: E1008 20:46:50.331131 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 20:46:50 crc kubenswrapper[4669]: E1008 20:46:50.331295 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 20:46:50 crc kubenswrapper[4669]: E1008 20:46:50.331408 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.772543 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.815822 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-mv26p"] Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.816813 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mv26p" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.817739 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-rtm6r"] Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.818556 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-rtm6r" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.820007 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dtqw9"] Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.820613 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-dtqw9" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.824030 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.824544 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.824701 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.824935 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.825081 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.825345 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.825516 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.825711 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.825906 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.826071 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.826220 4669 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.826486 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.826610 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.826727 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.827037 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-xjlqv"] Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.827213 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.827783 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-zh2xc"] Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.828219 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-zh2xc" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.828358 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xjlqv" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.829562 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4rzvg"] Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.829819 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.829974 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4rzvg" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.832677 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-4m5k5"] Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.833269 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-4m5k5" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.834275 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.834299 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.837081 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-74lcm"] Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.838577 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-74lcm" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.839076 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-z99h8"] Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.839915 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-z99h8" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.845371 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.848404 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.848415 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.848663 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.848727 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.849068 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.849226 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.850791 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2k2t4"] Oct 08 20:46:51 crc 
kubenswrapper[4669]: I1008 20:46:51.851206 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-ps5hq"] Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.851545 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-ps5hq" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.851686 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2k2t4" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.853340 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.853546 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.853695 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.853972 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.854180 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.854388 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.854664 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.854922 4669 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.855269 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.855427 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.855444 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.855558 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.855805 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.856014 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.856242 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.856430 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.856450 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.856703 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 08 
20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.856916 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.857114 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.857313 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.857401 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.857622 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.857880 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.858878 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.859067 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.861826 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4xg9t"] Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.862172 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r2ncb"] Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.871107 4669 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-apiserver/apiserver-76f77b778f-25b8x"] Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.874984 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-25b8x" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.875547 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r2ncb" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.877560 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4xg9t" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.877632 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.880099 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6l6tw"] Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.893391 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.894282 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.894643 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.895477 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.895629 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dtqw9"] Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.895678 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-mv26p"] Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.895828 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.895842 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.896663 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.897224 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.897292 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.897303 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 
20:46:51.897244 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.897622 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.898456 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.900400 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.900672 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.900880 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.900981 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.901095 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mm28n"] Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.901740 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.901994 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.902092 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m54gb"] Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.902470 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m54gb" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.904057 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.904269 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.904366 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.904383 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.904298 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.904466 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-rtm6r"] Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.905120 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 08 
20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.906279 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-4nbtf"] Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.906868 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-4nbtf" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.907465 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-db8rh"] Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.907629 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.907807 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.907932 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.908000 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.908133 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-db8rh" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.908649 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.908848 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.910236 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.911880 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.912184 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.912388 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.912673 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.913489 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.914670 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.915567 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 08 20:46:51 crc 
kubenswrapper[4669]: I1008 20:46:51.917269 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.917707 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sv8nt"] Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.918127 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7b2xv"] Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.935413 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.936212 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4485l"] Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.937095 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4485l" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.937339 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7b2xv" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.937741 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sv8nt" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.939067 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.939644 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.939709 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.939902 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.939951 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.946362 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.954856 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dsg9n"] Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.956154 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-449s9"] Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.956296 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.956535 4669 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.958117 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-449s9" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.959009 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.966376 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.967461 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dsg9n" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.968513 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.971886 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4knb8"] Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.970401 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.974410 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01878aca-a23f-4add-bedf-94cc3b75b481-config\") pod \"openshift-apiserver-operator-796bbdcf4f-4rzvg\" (UID: \"01878aca-a23f-4add-bedf-94cc3b75b481\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4rzvg" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.976002 4669 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-st97g\" (UniqueName: \"kubernetes.io/projected/2f8ab351-ab50-4289-8d0f-aae3ade74644-kube-api-access-st97g\") pod \"route-controller-manager-6576b87f9c-4xg9t\" (UID: \"2f8ab351-ab50-4289-8d0f-aae3ade74644\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4xg9t" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.976032 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-6l6tw\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.976054 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b0dc8a08-0e25-4dba-9e96-089c46deb679-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-r2ncb\" (UID: \"b0dc8a08-0e25-4dba-9e96-089c46deb679\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r2ncb" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.976077 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3434800b-823a-4050-863e-1d2240a29709-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-2k2t4\" (UID: \"3434800b-823a-4050-863e-1d2240a29709\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2k2t4" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.976101 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/533e765f-918a-456e-8803-dd3e8495f20f-config\") pod \"machine-approver-56656f9798-xjlqv\" (UID: \"533e765f-918a-456e-8803-dd3e8495f20f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xjlqv" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.976119 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e8b346cb-e447-4473-a94d-a66882c3af6f-etcd-client\") pod \"apiserver-76f77b778f-25b8x\" (UID: \"e8b346cb-e447-4473-a94d-a66882c3af6f\") " pod="openshift-apiserver/apiserver-76f77b778f-25b8x" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.976136 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e8b346cb-e447-4473-a94d-a66882c3af6f-image-import-ca\") pod \"apiserver-76f77b778f-25b8x\" (UID: \"e8b346cb-e447-4473-a94d-a66882c3af6f\") " pod="openshift-apiserver/apiserver-76f77b778f-25b8x" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.976154 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d1c512c-76d2-4775-980b-47205a8e55e2-config\") pod \"console-operator-58897d9998-ps5hq\" (UID: \"1d1c512c-76d2-4775-980b-47205a8e55e2\") " pod="openshift-console-operator/console-operator-58897d9998-ps5hq" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.976175 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-6l6tw\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw" Oct 08 20:46:51 crc kubenswrapper[4669]: 
I1008 20:46:51.976195 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e8b346cb-e447-4473-a94d-a66882c3af6f-node-pullsecrets\") pod \"apiserver-76f77b778f-25b8x\" (UID: \"e8b346cb-e447-4473-a94d-a66882c3af6f\") " pod="openshift-apiserver/apiserver-76f77b778f-25b8x" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.976220 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwmct\" (UniqueName: \"kubernetes.io/projected/01878aca-a23f-4add-bedf-94cc3b75b481-kube-api-access-fwmct\") pod \"openshift-apiserver-operator-796bbdcf4f-4rzvg\" (UID: \"01878aca-a23f-4add-bedf-94cc3b75b481\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4rzvg" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.976241 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4adb69f9-4f03-46c0-bd24-fec72c2b1fd9-console-oauth-config\") pod \"console-f9d7485db-4m5k5\" (UID: \"4adb69f9-4f03-46c0-bd24-fec72c2b1fd9\") " pod="openshift-console/console-f9d7485db-4m5k5" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.976261 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/533e765f-918a-456e-8803-dd3e8495f20f-machine-approver-tls\") pod \"machine-approver-56656f9798-xjlqv\" (UID: \"533e765f-918a-456e-8803-dd3e8495f20f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xjlqv" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.976287 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-6l6tw\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.976307 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-6l6tw\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.976325 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxzll\" (UniqueName: \"kubernetes.io/projected/e8b346cb-e447-4473-a94d-a66882c3af6f-kube-api-access-wxzll\") pod \"apiserver-76f77b778f-25b8x\" (UID: \"e8b346cb-e447-4473-a94d-a66882c3af6f\") " pod="openshift-apiserver/apiserver-76f77b778f-25b8x" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.976332 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r6qfk"] Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.976349 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-6l6tw\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.976369 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6625254f-fe58-44a5-b467-9ed7544a7902-config\") pod \"kube-apiserver-operator-766d6c64bb-m54gb\" (UID: \"6625254f-fe58-44a5-b467-9ed7544a7902\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m54gb" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.976474 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f8ab351-ab50-4289-8d0f-aae3ade74644-client-ca\") pod \"route-controller-manager-6576b87f9c-4xg9t\" (UID: \"2f8ab351-ab50-4289-8d0f-aae3ade74644\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4xg9t" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.976499 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8b346cb-e447-4473-a94d-a66882c3af6f-config\") pod \"apiserver-76f77b778f-25b8x\" (UID: \"e8b346cb-e447-4473-a94d-a66882c3af6f\") " pod="openshift-apiserver/apiserver-76f77b778f-25b8x" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.976523 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4adb69f9-4f03-46c0-bd24-fec72c2b1fd9-service-ca\") pod \"console-f9d7485db-4m5k5\" (UID: \"4adb69f9-4f03-46c0-bd24-fec72c2b1fd9\") " pod="openshift-console/console-f9d7485db-4m5k5" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.976573 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6625254f-fe58-44a5-b467-9ed7544a7902-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-m54gb\" (UID: \"6625254f-fe58-44a5-b467-9ed7544a7902\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m54gb" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 
20:46:51.976597 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/3434800b-823a-4050-863e-1d2240a29709-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-2k2t4\" (UID: \"3434800b-823a-4050-863e-1d2240a29709\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2k2t4" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.976618 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8krmk\" (UniqueName: \"kubernetes.io/projected/4adb69f9-4f03-46c0-bd24-fec72c2b1fd9-kube-api-access-8krmk\") pod \"console-f9d7485db-4m5k5\" (UID: \"4adb69f9-4f03-46c0-bd24-fec72c2b1fd9\") " pod="openshift-console/console-f9d7485db-4m5k5" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.976654 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4adb69f9-4f03-46c0-bd24-fec72c2b1fd9-console-serving-cert\") pod \"console-f9d7485db-4m5k5\" (UID: \"4adb69f9-4f03-46c0-bd24-fec72c2b1fd9\") " pod="openshift-console/console-f9d7485db-4m5k5" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.976676 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-6l6tw\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.976696 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-6l6tw\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.976714 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-tqhcx"] Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.977262 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4knb8" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.977458 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332605-xxrs4"] Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.977752 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r6qfk" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.976716 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhkjw\" (UniqueName: \"kubernetes.io/projected/5289f930-ba47-4745-8ab7-784863dc110e-kube-api-access-bhkjw\") pod \"oauth-openshift-558db77b4-6l6tw\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.977938 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3434800b-823a-4050-863e-1d2240a29709-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-2k2t4\" (UID: \"3434800b-823a-4050-863e-1d2240a29709\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2k2t4" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.977975 4669 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4adb69f9-4f03-46c0-bd24-fec72c2b1fd9-oauth-serving-cert\") pod \"console-f9d7485db-4m5k5\" (UID: \"4adb69f9-4f03-46c0-bd24-fec72c2b1fd9\") " pod="openshift-console/console-f9d7485db-4m5k5" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.978008 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7nd8\" (UniqueName: \"kubernetes.io/projected/3434800b-823a-4050-863e-1d2240a29709-kube-api-access-d7nd8\") pod \"cluster-image-registry-operator-dc59b4c8b-2k2t4\" (UID: \"3434800b-823a-4050-863e-1d2240a29709\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2k2t4" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.978041 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332605-xxrs4" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.978051 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-6l6tw\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.978084 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e8b346cb-e447-4473-a94d-a66882c3af6f-audit-dir\") pod \"apiserver-76f77b778f-25b8x\" (UID: \"e8b346cb-e447-4473-a94d-a66882c3af6f\") " pod="openshift-apiserver/apiserver-76f77b778f-25b8x" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.978108 4669 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d1c512c-76d2-4775-980b-47205a8e55e2-serving-cert\") pod \"console-operator-58897d9998-ps5hq\" (UID: \"1d1c512c-76d2-4775-980b-47205a8e55e2\") " pod="openshift-console-operator/console-operator-58897d9998-ps5hq" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.978128 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1d1c512c-76d2-4775-980b-47205a8e55e2-trusted-ca\") pod \"console-operator-58897d9998-ps5hq\" (UID: \"1d1c512c-76d2-4775-980b-47205a8e55e2\") " pod="openshift-console-operator/console-operator-58897d9998-ps5hq" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.978149 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-6l6tw\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.978175 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7pw6\" (UniqueName: \"kubernetes.io/projected/b0dc8a08-0e25-4dba-9e96-089c46deb679-kube-api-access-v7pw6\") pod \"cluster-samples-operator-665b6dd947-r2ncb\" (UID: \"b0dc8a08-0e25-4dba-9e96-089c46deb679\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r2ncb" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.978197 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e8b346cb-e447-4473-a94d-a66882c3af6f-serving-cert\") pod \"apiserver-76f77b778f-25b8x\" (UID: \"e8b346cb-e447-4473-a94d-a66882c3af6f\") " pod="openshift-apiserver/apiserver-76f77b778f-25b8x" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.978225 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f8ab351-ab50-4289-8d0f-aae3ade74644-serving-cert\") pod \"route-controller-manager-6576b87f9c-4xg9t\" (UID: \"2f8ab351-ab50-4289-8d0f-aae3ade74644\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4xg9t" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.978251 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/533e765f-918a-456e-8803-dd3e8495f20f-auth-proxy-config\") pod \"machine-approver-56656f9798-xjlqv\" (UID: \"533e765f-918a-456e-8803-dd3e8495f20f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xjlqv" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.978273 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8b346cb-e447-4473-a94d-a66882c3af6f-trusted-ca-bundle\") pod \"apiserver-76f77b778f-25b8x\" (UID: \"e8b346cb-e447-4473-a94d-a66882c3af6f\") " pod="openshift-apiserver/apiserver-76f77b778f-25b8x" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.978289 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-tqhcx" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.978293 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4adb69f9-4f03-46c0-bd24-fec72c2b1fd9-console-config\") pod \"console-f9d7485db-4m5k5\" (UID: \"4adb69f9-4f03-46c0-bd24-fec72c2b1fd9\") " pod="openshift-console/console-f9d7485db-4m5k5" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.978314 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e8b346cb-e447-4473-a94d-a66882c3af6f-etcd-serving-ca\") pod \"apiserver-76f77b778f-25b8x\" (UID: \"e8b346cb-e447-4473-a94d-a66882c3af6f\") " pod="openshift-apiserver/apiserver-76f77b778f-25b8x" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.978337 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldsmv\" (UniqueName: \"kubernetes.io/projected/3b104989-912e-43b1-a295-2ea8eb157e77-kube-api-access-ldsmv\") pod \"downloads-7954f5f757-z99h8\" (UID: \"3b104989-912e-43b1-a295-2ea8eb157e77\") " pod="openshift-console/downloads-7954f5f757-z99h8" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.978371 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5289f930-ba47-4745-8ab7-784863dc110e-audit-dir\") pod \"oauth-openshift-558db77b4-6l6tw\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.978396 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/5289f930-ba47-4745-8ab7-784863dc110e-audit-policies\") pod \"oauth-openshift-558db77b4-6l6tw\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.978417 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e8b346cb-e447-4473-a94d-a66882c3af6f-encryption-config\") pod \"apiserver-76f77b778f-25b8x\" (UID: \"e8b346cb-e447-4473-a94d-a66882c3af6f\") " pod="openshift-apiserver/apiserver-76f77b778f-25b8x" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.978441 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f8ab351-ab50-4289-8d0f-aae3ade74644-config\") pod \"route-controller-manager-6576b87f9c-4xg9t\" (UID: \"2f8ab351-ab50-4289-8d0f-aae3ade74644\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4xg9t" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.978479 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e8b346cb-e447-4473-a94d-a66882c3af6f-audit\") pod \"apiserver-76f77b778f-25b8x\" (UID: \"e8b346cb-e447-4473-a94d-a66882c3af6f\") " pod="openshift-apiserver/apiserver-76f77b778f-25b8x" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.978514 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-6l6tw\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 
20:46:51.978560 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pxgj\" (UniqueName: \"kubernetes.io/projected/1d1c512c-76d2-4775-980b-47205a8e55e2-kube-api-access-8pxgj\") pod \"console-operator-58897d9998-ps5hq\" (UID: \"1d1c512c-76d2-4775-980b-47205a8e55e2\") " pod="openshift-console-operator/console-operator-58897d9998-ps5hq" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.978584 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-6l6tw\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.978666 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6625254f-fe58-44a5-b467-9ed7544a7902-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-m54gb\" (UID: \"6625254f-fe58-44a5-b467-9ed7544a7902\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m54gb" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.978697 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4adb69f9-4f03-46c0-bd24-fec72c2b1fd9-trusted-ca-bundle\") pod \"console-f9d7485db-4m5k5\" (UID: \"4adb69f9-4f03-46c0-bd24-fec72c2b1fd9\") " pod="openshift-console/console-f9d7485db-4m5k5" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.978719 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8rt7\" (UniqueName: 
\"kubernetes.io/projected/533e765f-918a-456e-8803-dd3e8495f20f-kube-api-access-k8rt7\") pod \"machine-approver-56656f9798-xjlqv\" (UID: \"533e765f-918a-456e-8803-dd3e8495f20f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xjlqv" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.978752 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01878aca-a23f-4add-bedf-94cc3b75b481-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-4rzvg\" (UID: \"01878aca-a23f-4add-bedf-94cc3b75b481\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4rzvg" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.981846 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.982011 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lgt96"] Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.982470 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lwdg7"] Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.982836 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lwdg7" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.983059 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lgt96" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.985609 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-tsh5h"] Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.986126 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tsh5h" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.986305 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-r7zkf"] Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.986832 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.986915 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-r7zkf" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.989911 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-wg8jg"] Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.990354 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-wg8jg" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.990878 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-xndc2"] Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.992940 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-xndc2" Oct 08 20:46:51 crc kubenswrapper[4669]: I1008 20:46:51.997631 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-67zzr"] Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.000239 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-dtnh6"] Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.000592 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-67zzr" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.000616 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.000690 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9vmlr"] Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.000803 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-dtnh6" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.001095 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9vmlr" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.001427 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-wn4pf"] Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.001831 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wn4pf" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.012211 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-f8wcf"] Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.014467 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f8wcf" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.014653 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-z99h8"] Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.017179 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-ps5hq"] Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.020048 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6l6tw"] Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.020514 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-zh2xc"] Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.021677 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7b2xv"] Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.022628 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dsg9n"] Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.026993 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2k2t4"] Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.035041 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r6qfk"] Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.036945 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sv8nt"] Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.038876 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.041042 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-74lcm"] Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.042149 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4xg9t"] Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.043446 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4485l"] Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.044576 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-67zzr"] Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.045991 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4rzvg"] Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.047165 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-tsh5h"] Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.048249 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-tqhcx"] Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.049512 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-console/console-f9d7485db-4m5k5"] Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.051191 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mm28n"] Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.052942 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-wg8jg"] Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.054034 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4knb8"] Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.055469 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m54gb"] Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.056543 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lgt96"] Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.058054 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-25b8x"] Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.058690 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.059075 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lwdg7"] Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.060432 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r2ncb"] Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.061437 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332605-xxrs4"] Oct 08 
20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.062499 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-4nbtf"] Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.063621 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-kc49q"] Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.064650 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kc49q" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.064815 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-f8wcf"] Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.067835 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-dtnh6"] Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.069110 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-449s9"] Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.070981 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-wn4pf"] Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.072010 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-xndc2"] Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.073027 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-sgf44"] Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.073811 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-sgf44" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.074170 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-kc49q"] Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.075166 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9vmlr"] Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.076662 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-sgf44"] Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.077569 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-r7zkf"] Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.078866 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.078906 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-9fcxn"] Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.079503 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-6l6tw\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.079547 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b0dc8a08-0e25-4dba-9e96-089c46deb679-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-r2ncb\" (UID: \"b0dc8a08-0e25-4dba-9e96-089c46deb679\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r2ncb" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.079573 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-st97g\" (UniqueName: \"kubernetes.io/projected/2f8ab351-ab50-4289-8d0f-aae3ade74644-kube-api-access-st97g\") pod \"route-controller-manager-6576b87f9c-4xg9t\" (UID: \"2f8ab351-ab50-4289-8d0f-aae3ade74644\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4xg9t" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.079581 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-9fcxn" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.079596 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3434800b-823a-4050-863e-1d2240a29709-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-2k2t4\" (UID: \"3434800b-823a-4050-863e-1d2240a29709\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2k2t4" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.079621 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/533e765f-918a-456e-8803-dd3e8495f20f-config\") pod \"machine-approver-56656f9798-xjlqv\" (UID: \"533e765f-918a-456e-8803-dd3e8495f20f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xjlqv" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.079644 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e8b346cb-e447-4473-a94d-a66882c3af6f-etcd-client\") pod \"apiserver-76f77b778f-25b8x\" (UID: \"e8b346cb-e447-4473-a94d-a66882c3af6f\") " pod="openshift-apiserver/apiserver-76f77b778f-25b8x" Oct 08 20:46:52 crc 
kubenswrapper[4669]: I1008 20:46:52.079663 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e8b346cb-e447-4473-a94d-a66882c3af6f-image-import-ca\") pod \"apiserver-76f77b778f-25b8x\" (UID: \"e8b346cb-e447-4473-a94d-a66882c3af6f\") " pod="openshift-apiserver/apiserver-76f77b778f-25b8x" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.079681 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d1c512c-76d2-4775-980b-47205a8e55e2-config\") pod \"console-operator-58897d9998-ps5hq\" (UID: \"1d1c512c-76d2-4775-980b-47205a8e55e2\") " pod="openshift-console-operator/console-operator-58897d9998-ps5hq" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.079698 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-6l6tw\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.079720 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e8b346cb-e447-4473-a94d-a66882c3af6f-node-pullsecrets\") pod \"apiserver-76f77b778f-25b8x\" (UID: \"e8b346cb-e447-4473-a94d-a66882c3af6f\") " pod="openshift-apiserver/apiserver-76f77b778f-25b8x" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.079741 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwmct\" (UniqueName: \"kubernetes.io/projected/01878aca-a23f-4add-bedf-94cc3b75b481-kube-api-access-fwmct\") pod \"openshift-apiserver-operator-796bbdcf4f-4rzvg\" (UID: 
\"01878aca-a23f-4add-bedf-94cc3b75b481\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4rzvg" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.079760 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4adb69f9-4f03-46c0-bd24-fec72c2b1fd9-console-oauth-config\") pod \"console-f9d7485db-4m5k5\" (UID: \"4adb69f9-4f03-46c0-bd24-fec72c2b1fd9\") " pod="openshift-console/console-f9d7485db-4m5k5" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.079778 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/533e765f-918a-456e-8803-dd3e8495f20f-machine-approver-tls\") pod \"machine-approver-56656f9798-xjlqv\" (UID: \"533e765f-918a-456e-8803-dd3e8495f20f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xjlqv" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.079795 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-6l6tw\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.079811 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-6l6tw\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.079834 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-wxzll\" (UniqueName: \"kubernetes.io/projected/e8b346cb-e447-4473-a94d-a66882c3af6f-kube-api-access-wxzll\") pod \"apiserver-76f77b778f-25b8x\" (UID: \"e8b346cb-e447-4473-a94d-a66882c3af6f\") " pod="openshift-apiserver/apiserver-76f77b778f-25b8x" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.079857 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-6l6tw\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.079872 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6625254f-fe58-44a5-b467-9ed7544a7902-config\") pod \"kube-apiserver-operator-766d6c64bb-m54gb\" (UID: \"6625254f-fe58-44a5-b467-9ed7544a7902\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m54gb" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.079888 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f8ab351-ab50-4289-8d0f-aae3ade74644-client-ca\") pod \"route-controller-manager-6576b87f9c-4xg9t\" (UID: \"2f8ab351-ab50-4289-8d0f-aae3ade74644\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4xg9t" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.079903 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8b346cb-e447-4473-a94d-a66882c3af6f-config\") pod \"apiserver-76f77b778f-25b8x\" (UID: \"e8b346cb-e447-4473-a94d-a66882c3af6f\") " pod="openshift-apiserver/apiserver-76f77b778f-25b8x" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.079921 
4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4adb69f9-4f03-46c0-bd24-fec72c2b1fd9-service-ca\") pod \"console-f9d7485db-4m5k5\" (UID: \"4adb69f9-4f03-46c0-bd24-fec72c2b1fd9\") " pod="openshift-console/console-f9d7485db-4m5k5" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.079937 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6625254f-fe58-44a5-b467-9ed7544a7902-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-m54gb\" (UID: \"6625254f-fe58-44a5-b467-9ed7544a7902\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m54gb" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.079952 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/3434800b-823a-4050-863e-1d2240a29709-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-2k2t4\" (UID: \"3434800b-823a-4050-863e-1d2240a29709\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2k2t4" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.079968 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8krmk\" (UniqueName: \"kubernetes.io/projected/4adb69f9-4f03-46c0-bd24-fec72c2b1fd9-kube-api-access-8krmk\") pod \"console-f9d7485db-4m5k5\" (UID: \"4adb69f9-4f03-46c0-bd24-fec72c2b1fd9\") " pod="openshift-console/console-f9d7485db-4m5k5" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.079997 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-6l6tw\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.080014 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-6l6tw\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.080028 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhkjw\" (UniqueName: \"kubernetes.io/projected/5289f930-ba47-4745-8ab7-784863dc110e-kube-api-access-bhkjw\") pod \"oauth-openshift-558db77b4-6l6tw\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.080043 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3434800b-823a-4050-863e-1d2240a29709-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-2k2t4\" (UID: \"3434800b-823a-4050-863e-1d2240a29709\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2k2t4" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.080058 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4adb69f9-4f03-46c0-bd24-fec72c2b1fd9-console-serving-cert\") pod \"console-f9d7485db-4m5k5\" (UID: \"4adb69f9-4f03-46c0-bd24-fec72c2b1fd9\") " pod="openshift-console/console-f9d7485db-4m5k5" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.080076 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/4adb69f9-4f03-46c0-bd24-fec72c2b1fd9-oauth-serving-cert\") pod \"console-f9d7485db-4m5k5\" (UID: \"4adb69f9-4f03-46c0-bd24-fec72c2b1fd9\") " pod="openshift-console/console-f9d7485db-4m5k5" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.080093 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7nd8\" (UniqueName: \"kubernetes.io/projected/3434800b-823a-4050-863e-1d2240a29709-kube-api-access-d7nd8\") pod \"cluster-image-registry-operator-dc59b4c8b-2k2t4\" (UID: \"3434800b-823a-4050-863e-1d2240a29709\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2k2t4" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.080153 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-6l6tw\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.080173 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e8b346cb-e447-4473-a94d-a66882c3af6f-audit-dir\") pod \"apiserver-76f77b778f-25b8x\" (UID: \"e8b346cb-e447-4473-a94d-a66882c3af6f\") " pod="openshift-apiserver/apiserver-76f77b778f-25b8x" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.080213 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1d1c512c-76d2-4775-980b-47205a8e55e2-trusted-ca\") pod \"console-operator-58897d9998-ps5hq\" (UID: \"1d1c512c-76d2-4775-980b-47205a8e55e2\") " pod="openshift-console-operator/console-operator-58897d9998-ps5hq" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.080229 4669 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-6l6tw\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.080246 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7pw6\" (UniqueName: \"kubernetes.io/projected/b0dc8a08-0e25-4dba-9e96-089c46deb679-kube-api-access-v7pw6\") pod \"cluster-samples-operator-665b6dd947-r2ncb\" (UID: \"b0dc8a08-0e25-4dba-9e96-089c46deb679\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r2ncb" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.080263 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8b346cb-e447-4473-a94d-a66882c3af6f-serving-cert\") pod \"apiserver-76f77b778f-25b8x\" (UID: \"e8b346cb-e447-4473-a94d-a66882c3af6f\") " pod="openshift-apiserver/apiserver-76f77b778f-25b8x" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.080279 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d1c512c-76d2-4775-980b-47205a8e55e2-serving-cert\") pod \"console-operator-58897d9998-ps5hq\" (UID: \"1d1c512c-76d2-4775-980b-47205a8e55e2\") " pod="openshift-console-operator/console-operator-58897d9998-ps5hq" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.080296 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f8ab351-ab50-4289-8d0f-aae3ade74644-serving-cert\") pod \"route-controller-manager-6576b87f9c-4xg9t\" (UID: \"2f8ab351-ab50-4289-8d0f-aae3ade74644\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4xg9t" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.080313 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/533e765f-918a-456e-8803-dd3e8495f20f-auth-proxy-config\") pod \"machine-approver-56656f9798-xjlqv\" (UID: \"533e765f-918a-456e-8803-dd3e8495f20f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xjlqv" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.080333 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8b346cb-e447-4473-a94d-a66882c3af6f-trusted-ca-bundle\") pod \"apiserver-76f77b778f-25b8x\" (UID: \"e8b346cb-e447-4473-a94d-a66882c3af6f\") " pod="openshift-apiserver/apiserver-76f77b778f-25b8x" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.080355 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4adb69f9-4f03-46c0-bd24-fec72c2b1fd9-console-config\") pod \"console-f9d7485db-4m5k5\" (UID: \"4adb69f9-4f03-46c0-bd24-fec72c2b1fd9\") " pod="openshift-console/console-f9d7485db-4m5k5" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.080377 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e8b346cb-e447-4473-a94d-a66882c3af6f-etcd-serving-ca\") pod \"apiserver-76f77b778f-25b8x\" (UID: \"e8b346cb-e447-4473-a94d-a66882c3af6f\") " pod="openshift-apiserver/apiserver-76f77b778f-25b8x" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.080402 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldsmv\" (UniqueName: \"kubernetes.io/projected/3b104989-912e-43b1-a295-2ea8eb157e77-kube-api-access-ldsmv\") pod 
\"downloads-7954f5f757-z99h8\" (UID: \"3b104989-912e-43b1-a295-2ea8eb157e77\") " pod="openshift-console/downloads-7954f5f757-z99h8" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.080471 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5289f930-ba47-4745-8ab7-784863dc110e-audit-dir\") pod \"oauth-openshift-558db77b4-6l6tw\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.080488 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5289f930-ba47-4745-8ab7-784863dc110e-audit-policies\") pod \"oauth-openshift-558db77b4-6l6tw\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.080503 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e8b346cb-e447-4473-a94d-a66882c3af6f-encryption-config\") pod \"apiserver-76f77b778f-25b8x\" (UID: \"e8b346cb-e447-4473-a94d-a66882c3af6f\") " pod="openshift-apiserver/apiserver-76f77b778f-25b8x" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.080541 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f8ab351-ab50-4289-8d0f-aae3ade74644-config\") pod \"route-controller-manager-6576b87f9c-4xg9t\" (UID: \"2f8ab351-ab50-4289-8d0f-aae3ade74644\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4xg9t" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.080566 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/e8b346cb-e447-4473-a94d-a66882c3af6f-audit\") pod \"apiserver-76f77b778f-25b8x\" (UID: \"e8b346cb-e447-4473-a94d-a66882c3af6f\") " pod="openshift-apiserver/apiserver-76f77b778f-25b8x" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.080596 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-6l6tw\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.080642 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-6l6tw\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.080660 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pxgj\" (UniqueName: \"kubernetes.io/projected/1d1c512c-76d2-4775-980b-47205a8e55e2-kube-api-access-8pxgj\") pod \"console-operator-58897d9998-ps5hq\" (UID: \"1d1c512c-76d2-4775-980b-47205a8e55e2\") " pod="openshift-console-operator/console-operator-58897d9998-ps5hq" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.080677 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6625254f-fe58-44a5-b467-9ed7544a7902-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-m54gb\" (UID: \"6625254f-fe58-44a5-b467-9ed7544a7902\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m54gb" Oct 08 20:46:52 crc 
kubenswrapper[4669]: I1008 20:46:52.080695 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8rt7\" (UniqueName: \"kubernetes.io/projected/533e765f-918a-456e-8803-dd3e8495f20f-kube-api-access-k8rt7\") pod \"machine-approver-56656f9798-xjlqv\" (UID: \"533e765f-918a-456e-8803-dd3e8495f20f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xjlqv" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.080718 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01878aca-a23f-4add-bedf-94cc3b75b481-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-4rzvg\" (UID: \"01878aca-a23f-4add-bedf-94cc3b75b481\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4rzvg" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.080738 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4adb69f9-4f03-46c0-bd24-fec72c2b1fd9-trusted-ca-bundle\") pod \"console-f9d7485db-4m5k5\" (UID: \"4adb69f9-4f03-46c0-bd24-fec72c2b1fd9\") " pod="openshift-console/console-f9d7485db-4m5k5" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.080759 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01878aca-a23f-4add-bedf-94cc3b75b481-config\") pod \"openshift-apiserver-operator-796bbdcf4f-4rzvg\" (UID: \"01878aca-a23f-4add-bedf-94cc3b75b481\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4rzvg" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.080867 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3434800b-823a-4050-863e-1d2240a29709-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-2k2t4\" (UID: 
\"3434800b-823a-4050-863e-1d2240a29709\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2k2t4" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.080898 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e8b346cb-e447-4473-a94d-a66882c3af6f-image-import-ca\") pod \"apiserver-76f77b778f-25b8x\" (UID: \"e8b346cb-e447-4473-a94d-a66882c3af6f\") " pod="openshift-apiserver/apiserver-76f77b778f-25b8x" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.081146 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e8b346cb-e447-4473-a94d-a66882c3af6f-audit-dir\") pod \"apiserver-76f77b778f-25b8x\" (UID: \"e8b346cb-e447-4473-a94d-a66882c3af6f\") " pod="openshift-apiserver/apiserver-76f77b778f-25b8x" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.081297 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6625254f-fe58-44a5-b467-9ed7544a7902-config\") pod \"kube-apiserver-operator-766d6c64bb-m54gb\" (UID: \"6625254f-fe58-44a5-b467-9ed7544a7902\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m54gb" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.081323 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/533e765f-918a-456e-8803-dd3e8495f20f-config\") pod \"machine-approver-56656f9798-xjlqv\" (UID: \"533e765f-918a-456e-8803-dd3e8495f20f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xjlqv" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.081464 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-system-service-ca\") pod 
\"oauth-openshift-558db77b4-6l6tw\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.081643 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01878aca-a23f-4add-bedf-94cc3b75b481-config\") pod \"openshift-apiserver-operator-796bbdcf4f-4rzvg\" (UID: \"01878aca-a23f-4add-bedf-94cc3b75b481\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4rzvg" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.081892 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d1c512c-76d2-4775-980b-47205a8e55e2-config\") pod \"console-operator-58897d9998-ps5hq\" (UID: \"1d1c512c-76d2-4775-980b-47205a8e55e2\") " pod="openshift-console-operator/console-operator-58897d9998-ps5hq" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.081966 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-6l6tw\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.082309 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4adb69f9-4f03-46c0-bd24-fec72c2b1fd9-oauth-serving-cert\") pod \"console-f9d7485db-4m5k5\" (UID: \"4adb69f9-4f03-46c0-bd24-fec72c2b1fd9\") " pod="openshift-console/console-f9d7485db-4m5k5" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.082459 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/2f8ab351-ab50-4289-8d0f-aae3ade74644-client-ca\") pod \"route-controller-manager-6576b87f9c-4xg9t\" (UID: \"2f8ab351-ab50-4289-8d0f-aae3ade74644\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4xg9t" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.082558 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e8b346cb-e447-4473-a94d-a66882c3af6f-node-pullsecrets\") pod \"apiserver-76f77b778f-25b8x\" (UID: \"e8b346cb-e447-4473-a94d-a66882c3af6f\") " pod="openshift-apiserver/apiserver-76f77b778f-25b8x" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.082868 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1d1c512c-76d2-4775-980b-47205a8e55e2-trusted-ca\") pod \"console-operator-58897d9998-ps5hq\" (UID: \"1d1c512c-76d2-4775-980b-47205a8e55e2\") " pod="openshift-console-operator/console-operator-58897d9998-ps5hq" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.085395 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e8b346cb-e447-4473-a94d-a66882c3af6f-etcd-client\") pod \"apiserver-76f77b778f-25b8x\" (UID: \"e8b346cb-e447-4473-a94d-a66882c3af6f\") " pod="openshift-apiserver/apiserver-76f77b778f-25b8x" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.085406 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-6l6tw\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.085697 4669 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6625254f-fe58-44a5-b467-9ed7544a7902-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-m54gb\" (UID: \"6625254f-fe58-44a5-b467-9ed7544a7902\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m54gb" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.085723 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8b346cb-e447-4473-a94d-a66882c3af6f-serving-cert\") pod \"apiserver-76f77b778f-25b8x\" (UID: \"e8b346cb-e447-4473-a94d-a66882c3af6f\") " pod="openshift-apiserver/apiserver-76f77b778f-25b8x" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.085806 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-6l6tw\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.086356 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-6l6tw\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.086435 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4adb69f9-4f03-46c0-bd24-fec72c2b1fd9-service-ca\") pod \"console-f9d7485db-4m5k5\" (UID: \"4adb69f9-4f03-46c0-bd24-fec72c2b1fd9\") " pod="openshift-console/console-f9d7485db-4m5k5" Oct 08 20:46:52 crc 
kubenswrapper[4669]: I1008 20:46:52.086442 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b0dc8a08-0e25-4dba-9e96-089c46deb679-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-r2ncb\" (UID: \"b0dc8a08-0e25-4dba-9e96-089c46deb679\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r2ncb" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.086733 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-6l6tw\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.086811 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5289f930-ba47-4745-8ab7-784863dc110e-audit-dir\") pod \"oauth-openshift-558db77b4-6l6tw\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.086865 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8b346cb-e447-4473-a94d-a66882c3af6f-config\") pod \"apiserver-76f77b778f-25b8x\" (UID: \"e8b346cb-e447-4473-a94d-a66882c3af6f\") " pod="openshift-apiserver/apiserver-76f77b778f-25b8x" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.087599 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-6l6tw\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.087854 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8b346cb-e447-4473-a94d-a66882c3af6f-trusted-ca-bundle\") pod \"apiserver-76f77b778f-25b8x\" (UID: \"e8b346cb-e447-4473-a94d-a66882c3af6f\") " pod="openshift-apiserver/apiserver-76f77b778f-25b8x" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.087872 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5289f930-ba47-4745-8ab7-784863dc110e-audit-policies\") pod \"oauth-openshift-558db77b4-6l6tw\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.088071 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e8b346cb-e447-4473-a94d-a66882c3af6f-audit\") pod \"apiserver-76f77b778f-25b8x\" (UID: \"e8b346cb-e447-4473-a94d-a66882c3af6f\") " pod="openshift-apiserver/apiserver-76f77b778f-25b8x" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.088428 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-6l6tw\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.088589 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-6l6tw\" (UID: 
\"5289f930-ba47-4745-8ab7-784863dc110e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.088885 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4adb69f9-4f03-46c0-bd24-fec72c2b1fd9-trusted-ca-bundle\") pod \"console-f9d7485db-4m5k5\" (UID: \"4adb69f9-4f03-46c0-bd24-fec72c2b1fd9\") " pod="openshift-console/console-f9d7485db-4m5k5" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.089328 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4adb69f9-4f03-46c0-bd24-fec72c2b1fd9-console-serving-cert\") pod \"console-f9d7485db-4m5k5\" (UID: \"4adb69f9-4f03-46c0-bd24-fec72c2b1fd9\") " pod="openshift-console/console-f9d7485db-4m5k5" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.089858 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4adb69f9-4f03-46c0-bd24-fec72c2b1fd9-console-oauth-config\") pod \"console-f9d7485db-4m5k5\" (UID: \"4adb69f9-4f03-46c0-bd24-fec72c2b1fd9\") " pod="openshift-console/console-f9d7485db-4m5k5" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.090088 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-6l6tw\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.090131 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4adb69f9-4f03-46c0-bd24-fec72c2b1fd9-console-config\") pod \"console-f9d7485db-4m5k5\" (UID: 
\"4adb69f9-4f03-46c0-bd24-fec72c2b1fd9\") " pod="openshift-console/console-f9d7485db-4m5k5" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.090377 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e8b346cb-e447-4473-a94d-a66882c3af6f-encryption-config\") pod \"apiserver-76f77b778f-25b8x\" (UID: \"e8b346cb-e447-4473-a94d-a66882c3af6f\") " pod="openshift-apiserver/apiserver-76f77b778f-25b8x" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.090543 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-6l6tw\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.090786 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/533e765f-918a-456e-8803-dd3e8495f20f-auth-proxy-config\") pod \"machine-approver-56656f9798-xjlqv\" (UID: \"533e765f-918a-456e-8803-dd3e8495f20f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xjlqv" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.090819 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01878aca-a23f-4add-bedf-94cc3b75b481-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-4rzvg\" (UID: \"01878aca-a23f-4add-bedf-94cc3b75b481\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4rzvg" Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.090886 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/3434800b-823a-4050-863e-1d2240a29709-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-2k2t4\" (UID: \"3434800b-823a-4050-863e-1d2240a29709\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2k2t4"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.091028 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f8ab351-ab50-4289-8d0f-aae3ade74644-serving-cert\") pod \"route-controller-manager-6576b87f9c-4xg9t\" (UID: \"2f8ab351-ab50-4289-8d0f-aae3ade74644\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4xg9t"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.091269 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/533e765f-918a-456e-8803-dd3e8495f20f-machine-approver-tls\") pod \"machine-approver-56656f9798-xjlqv\" (UID: \"533e765f-918a-456e-8803-dd3e8495f20f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xjlqv"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.091325 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e8b346cb-e447-4473-a94d-a66882c3af6f-etcd-serving-ca\") pod \"apiserver-76f77b778f-25b8x\" (UID: \"e8b346cb-e447-4473-a94d-a66882c3af6f\") " pod="openshift-apiserver/apiserver-76f77b778f-25b8x"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.091570 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f8ab351-ab50-4289-8d0f-aae3ade74644-config\") pod \"route-controller-manager-6576b87f9c-4xg9t\" (UID: \"2f8ab351-ab50-4289-8d0f-aae3ade74644\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4xg9t"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.091974 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d1c512c-76d2-4775-980b-47205a8e55e2-serving-cert\") pod \"console-operator-58897d9998-ps5hq\" (UID: \"1d1c512c-76d2-4775-980b-47205a8e55e2\") " pod="openshift-console-operator/console-operator-58897d9998-ps5hq"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.099285 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.134211 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.138082 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.159047 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.179501 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.198888 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.219420 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.238960 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.258791 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.278859 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.299113 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.319822 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.329848 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ml9vv"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.329849 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.329924 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.330658 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.338817 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.358801 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.379291 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.399483 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.419618 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.439812 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.459671 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.479406 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.499872 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.519169 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.539659 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.559695 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.578708 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.619568 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.639570 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.658818 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.680836 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.699296 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.719864 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.740127 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.766691 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.779366 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.800433 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.819960 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.840038 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.860502 4669 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.879900 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.899568 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.920397 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.940356 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.960629 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.979156 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Oct 08 20:46:52 crc kubenswrapper[4669]: I1008 20:46:52.997654 4669 request.go:700] Waited for 1.01253641s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator-operator/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.000751 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.020458 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.039031 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.059431 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.080145 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.099494 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.119608 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.139814 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.159069 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.180220 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.199261 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.219770 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.239274 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.259619 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.280163 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.300026 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.320366 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.339609 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.360238 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.379856 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.399514 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.420026 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.439635 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.459333 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.479578 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.498945 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.519740 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.540360 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.562063 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.579684 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.600909 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.620107 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.639740 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.666412 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.679314 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.703807 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b21de9b-333f-49d6-84b7-a616e217a26d-config\") pod \"authentication-operator-69f744f599-zh2xc\" (UID: \"3b21de9b-333f-49d6-84b7-a616e217a26d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zh2xc"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.703884 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4bvp\" (UniqueName: \"kubernetes.io/projected/efd51828-51ba-4723-bd28-306e48ce1a54-kube-api-access-k4bvp\") pod \"apiserver-7bbb656c7d-mv26p\" (UID: \"efd51828-51ba-4723-bd28-306e48ce1a54\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mv26p"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.703943 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b21de9b-333f-49d6-84b7-a616e217a26d-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-zh2xc\" (UID: \"3b21de9b-333f-49d6-84b7-a616e217a26d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zh2xc"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.703980 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/64c20dfc-09aa-4096-b7c1-7233d0a18a17-registry-tls\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.704061 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.704134 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/64c20dfc-09aa-4096-b7c1-7233d0a18a17-registry-certificates\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.704274 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/64c20dfc-09aa-4096-b7c1-7233d0a18a17-trusted-ca\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.704392 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b21de9b-333f-49d6-84b7-a616e217a26d-service-ca-bundle\") pod \"authentication-operator-69f744f599-zh2xc\" (UID: \"3b21de9b-333f-49d6-84b7-a616e217a26d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zh2xc"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.704461 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/64c20dfc-09aa-4096-b7c1-7233d0a18a17-bound-sa-token\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.704565 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/64c20dfc-09aa-4096-b7c1-7233d0a18a17-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.704614 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b21de9b-333f-49d6-84b7-a616e217a26d-serving-cert\") pod \"authentication-operator-69f744f599-zh2xc\" (UID: \"3b21de9b-333f-49d6-84b7-a616e217a26d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zh2xc"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.704638 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/efd51828-51ba-4723-bd28-306e48ce1a54-audit-dir\") pod \"apiserver-7bbb656c7d-mv26p\" (UID: \"efd51828-51ba-4723-bd28-306e48ce1a54\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mv26p"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.704672 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/efd51828-51ba-4723-bd28-306e48ce1a54-etcd-client\") pod \"apiserver-7bbb656c7d-mv26p\" (UID: \"efd51828-51ba-4723-bd28-306e48ce1a54\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mv26p"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.704695 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/efd51828-51ba-4723-bd28-306e48ce1a54-encryption-config\") pod \"apiserver-7bbb656c7d-mv26p\" (UID: \"efd51828-51ba-4723-bd28-306e48ce1a54\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mv26p"
Oct 08 20:46:53 crc kubenswrapper[4669]: E1008 20:46:53.704754 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:46:54.204729003 +0000 UTC m=+133.897539696 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.704802 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efd51828-51ba-4723-bd28-306e48ce1a54-serving-cert\") pod \"apiserver-7bbb656c7d-mv26p\" (UID: \"efd51828-51ba-4723-bd28-306e48ce1a54\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mv26p"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.704907 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/64c20dfc-09aa-4096-b7c1-7233d0a18a17-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.705184 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/efd51828-51ba-4723-bd28-306e48ce1a54-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-mv26p\" (UID: \"efd51828-51ba-4723-bd28-306e48ce1a54\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mv26p"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.705240 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkn2p\" (UniqueName: \"kubernetes.io/projected/64c20dfc-09aa-4096-b7c1-7233d0a18a17-kube-api-access-rkn2p\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.705303 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/efd51828-51ba-4723-bd28-306e48ce1a54-audit-policies\") pod \"apiserver-7bbb656c7d-mv26p\" (UID: \"efd51828-51ba-4723-bd28-306e48ce1a54\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mv26p"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.705342 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/efd51828-51ba-4723-bd28-306e48ce1a54-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-mv26p\" (UID: \"efd51828-51ba-4723-bd28-306e48ce1a54\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mv26p"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.705409 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqr97\" (UniqueName: \"kubernetes.io/projected/3b21de9b-333f-49d6-84b7-a616e217a26d-kube-api-access-fqr97\") pod \"authentication-operator-69f744f599-zh2xc\" (UID: \"3b21de9b-333f-49d6-84b7-a616e217a26d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zh2xc"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.720015 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.739390 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.759693 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.779287 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.799513 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.806268 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 08 20:46:53 crc kubenswrapper[4669]: E1008 20:46:53.806403 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:46:54.30637766 +0000 UTC m=+133.999188373 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.807232 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e45ff91e-f07d-489b-b0a0-8e815bdf41c3-images\") pod \"machine-api-operator-5694c8668f-rtm6r\" (UID: \"e45ff91e-f07d-489b-b0a0-8e815bdf41c3\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rtm6r"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.807296 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d2628b9d-5e3f-412f-b3c7-f24e6a208577-config-volume\") pod \"dns-default-sgf44\" (UID: \"d2628b9d-5e3f-412f-b3c7-f24e6a208577\") " pod="openshift-dns/dns-default-sgf44"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.807330 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3ab0571a-8beb-446a-9f96-b9bc226d5b22-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-7b2xv\" (UID: \"3ab0571a-8beb-446a-9f96-b9bc226d5b22\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7b2xv"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.807408 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b21de9b-333f-49d6-84b7-a616e217a26d-config\") pod \"authentication-operator-69f744f599-zh2xc\" (UID: \"3b21de9b-333f-49d6-84b7-a616e217a26d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zh2xc"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.807442 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e45ff91e-f07d-489b-b0a0-8e815bdf41c3-config\") pod \"machine-api-operator-5694c8668f-rtm6r\" (UID: \"e45ff91e-f07d-489b-b0a0-8e815bdf41c3\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rtm6r"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.807641 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0858a203-42e8-4108-a0bd-48ba190bf420-serving-cert\") pod \"controller-manager-879f6c89f-dtqw9\" (UID: \"0858a203-42e8-4108-a0bd-48ba190bf420\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtqw9"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.807774 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/672eeff6-8079-4f9a-a61c-40094de694be-service-ca-bundle\") pod \"router-default-5444994796-db8rh\" (UID: \"672eeff6-8079-4f9a-a61c-40094de694be\") " pod="openshift-ingress/router-default-5444994796-db8rh"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.808272 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/64c20dfc-09aa-4096-b7c1-7233d0a18a17-registry-certificates\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.808593 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ee97e8a2-071a-48f9-a13f-d66fb60bbb55-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lgt96\" (UID: \"ee97e8a2-071a-48f9-a13f-d66fb60bbb55\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lgt96"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.808804 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/64c20dfc-09aa-4096-b7c1-7233d0a18a17-bound-sa-token\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.808979 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b77bb83-431f-4431-93cc-04be87ab96dc-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-lwdg7\" (UID: \"9b77bb83-431f-4431-93cc-04be87ab96dc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lwdg7"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.809274 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a558b91-227c-4e11-aa99-4406e545a2ea-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-sv8nt\" (UID: \"7a558b91-227c-4e11-aa99-4406e545a2ea\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sv8nt"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.809465 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b21de9b-333f-49d6-84b7-a616e217a26d-config\") pod \"authentication-operator-69f744f599-zh2xc\" (UID: \"3b21de9b-333f-49d6-84b7-a616e217a26d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zh2xc"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.809615 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/21147ccd-e3af-44e0-a7fd-2931e731cc53-auth-proxy-config\") pod \"machine-config-operator-74547568cd-67zzr\" (UID: \"21147ccd-e3af-44e0-a7fd-2931e731cc53\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-67zzr"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.809795 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/64c20dfc-09aa-4096-b7c1-7233d0a18a17-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.809909 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/96a22bbb-2450-42e2-8c27-8246a89fbb6e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-74lcm\" (UID: \"96a22bbb-2450-42e2-8c27-8246a89fbb6e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-74lcm"
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.810096 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee97e8a2-071a-48f9-a13f-d66fb60bbb55-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lgt96\" (UID: \"ee97e8a2-071a-48f9-a13f-d66fb60bbb55\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lgt96"
Oct
08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.810265 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8xnc\" (UniqueName: \"kubernetes.io/projected/21147ccd-e3af-44e0-a7fd-2931e731cc53-kube-api-access-t8xnc\") pod \"machine-config-operator-74547568cd-67zzr\" (UID: \"21147ccd-e3af-44e0-a7fd-2931e731cc53\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-67zzr" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.810982 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/efd51828-51ba-4723-bd28-306e48ce1a54-audit-dir\") pod \"apiserver-7bbb656c7d-mv26p\" (UID: \"efd51828-51ba-4723-bd28-306e48ce1a54\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mv26p" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.811120 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/efd51828-51ba-4723-bd28-306e48ce1a54-etcd-client\") pod \"apiserver-7bbb656c7d-mv26p\" (UID: \"efd51828-51ba-4723-bd28-306e48ce1a54\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mv26p" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.811167 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee97e8a2-071a-48f9-a13f-d66fb60bbb55-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lgt96\" (UID: \"ee97e8a2-071a-48f9-a13f-d66fb60bbb55\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lgt96" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.811312 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/64c20dfc-09aa-4096-b7c1-7233d0a18a17-registry-certificates\") pod 
\"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.811344 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efd51828-51ba-4723-bd28-306e48ce1a54-serving-cert\") pod \"apiserver-7bbb656c7d-mv26p\" (UID: \"efd51828-51ba-4723-bd28-306e48ce1a54\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mv26p" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.811661 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/efd51828-51ba-4723-bd28-306e48ce1a54-audit-dir\") pod \"apiserver-7bbb656c7d-mv26p\" (UID: \"efd51828-51ba-4723-bd28-306e48ce1a54\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mv26p" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.812404 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/64c20dfc-09aa-4096-b7c1-7233d0a18a17-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.812603 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96a22bbb-2450-42e2-8c27-8246a89fbb6e-serving-cert\") pod \"openshift-config-operator-7777fb866f-74lcm\" (UID: \"96a22bbb-2450-42e2-8c27-8246a89fbb6e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-74lcm" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.812656 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wzxn\" (UniqueName: 
\"kubernetes.io/projected/d2628b9d-5e3f-412f-b3c7-f24e6a208577-kube-api-access-7wzxn\") pod \"dns-default-sgf44\" (UID: \"d2628b9d-5e3f-412f-b3c7-f24e6a208577\") " pod="openshift-dns/dns-default-sgf44" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.812855 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a558b91-227c-4e11-aa99-4406e545a2ea-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-sv8nt\" (UID: \"7a558b91-227c-4e11-aa99-4406e545a2ea\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sv8nt" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.813036 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/efd51828-51ba-4723-bd28-306e48ce1a54-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-mv26p\" (UID: \"efd51828-51ba-4723-bd28-306e48ce1a54\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mv26p" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.813151 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/21147ccd-e3af-44e0-a7fd-2931e731cc53-images\") pod \"machine-config-operator-74547568cd-67zzr\" (UID: \"21147ccd-e3af-44e0-a7fd-2931e731cc53\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-67zzr" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.813313 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqr97\" (UniqueName: \"kubernetes.io/projected/3b21de9b-333f-49d6-84b7-a616e217a26d-kube-api-access-fqr97\") pod \"authentication-operator-69f744f599-zh2xc\" (UID: \"3b21de9b-333f-49d6-84b7-a616e217a26d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zh2xc" Oct 08 20:46:53 crc 
kubenswrapper[4669]: I1008 20:46:53.813380 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/64c20dfc-09aa-4096-b7c1-7233d0a18a17-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.813650 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c6254786-5db9-423f-88ed-f42edefc70a8-metrics-tls\") pod \"dns-operator-744455d44c-xndc2\" (UID: \"c6254786-5db9-423f-88ed-f42edefc70a8\") " pod="openshift-dns-operator/dns-operator-744455d44c-xndc2" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.814307 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0858a203-42e8-4108-a0bd-48ba190bf420-config\") pod \"controller-manager-879f6c89f-dtqw9\" (UID: \"0858a203-42e8-4108-a0bd-48ba190bf420\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtqw9" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.814489 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6m6zm\" (UniqueName: \"kubernetes.io/projected/672eeff6-8079-4f9a-a61c-40094de694be-kube-api-access-6m6zm\") pod \"router-default-5444994796-db8rh\" (UID: \"672eeff6-8079-4f9a-a61c-40094de694be\") " pod="openshift-ingress/router-default-5444994796-db8rh" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.814609 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blrlq\" (UniqueName: \"kubernetes.io/projected/9b77bb83-431f-4431-93cc-04be87ab96dc-kube-api-access-blrlq\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-lwdg7\" (UID: \"9b77bb83-431f-4431-93cc-04be87ab96dc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lwdg7" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.814693 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8tjh\" (UniqueName: \"kubernetes.io/projected/e45ff91e-f07d-489b-b0a0-8e815bdf41c3-kube-api-access-s8tjh\") pod \"machine-api-operator-5694c8668f-rtm6r\" (UID: \"e45ff91e-f07d-489b-b0a0-8e815bdf41c3\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rtm6r" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.814742 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s292x\" (UniqueName: \"kubernetes.io/projected/c6254786-5db9-423f-88ed-f42edefc70a8-kube-api-access-s292x\") pod \"dns-operator-744455d44c-xndc2\" (UID: \"c6254786-5db9-423f-88ed-f42edefc70a8\") " pod="openshift-dns-operator/dns-operator-744455d44c-xndc2" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.814945 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/efd51828-51ba-4723-bd28-306e48ce1a54-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-mv26p\" (UID: \"efd51828-51ba-4723-bd28-306e48ce1a54\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mv26p" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.814938 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsstc\" (UniqueName: \"kubernetes.io/projected/7a558b91-227c-4e11-aa99-4406e545a2ea-kube-api-access-lsstc\") pod \"openshift-controller-manager-operator-756b6f6bc6-sv8nt\" (UID: \"7a558b91-227c-4e11-aa99-4406e545a2ea\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sv8nt" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.815087 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d2628b9d-5e3f-412f-b3c7-f24e6a208577-metrics-tls\") pod \"dns-default-sgf44\" (UID: \"d2628b9d-5e3f-412f-b3c7-f24e6a208577\") " pod="openshift-dns/dns-default-sgf44" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.815205 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b77bb83-431f-4431-93cc-04be87ab96dc-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-lwdg7\" (UID: \"9b77bb83-431f-4431-93cc-04be87ab96dc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lwdg7" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.815506 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/21147ccd-e3af-44e0-a7fd-2931e731cc53-proxy-tls\") pod \"machine-config-operator-74547568cd-67zzr\" (UID: \"21147ccd-e3af-44e0-a7fd-2931e731cc53\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-67zzr" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.815658 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4bvp\" (UniqueName: \"kubernetes.io/projected/efd51828-51ba-4723-bd28-306e48ce1a54-kube-api-access-k4bvp\") pod \"apiserver-7bbb656c7d-mv26p\" (UID: \"efd51828-51ba-4723-bd28-306e48ce1a54\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mv26p" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.815787 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/3b21de9b-333f-49d6-84b7-a616e217a26d-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-zh2xc\" (UID: \"3b21de9b-333f-49d6-84b7-a616e217a26d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zh2xc" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.815878 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/64c20dfc-09aa-4096-b7c1-7233d0a18a17-registry-tls\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.815961 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.816170 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/64c20dfc-09aa-4096-b7c1-7233d0a18a17-trusted-ca\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.816265 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab0571a-8beb-446a-9f96-b9bc226d5b22-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-7b2xv\" (UID: \"3ab0571a-8beb-446a-9f96-b9bc226d5b22\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7b2xv" 
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.816347 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b21de9b-333f-49d6-84b7-a616e217a26d-service-ca-bundle\") pod \"authentication-operator-69f744f599-zh2xc\" (UID: \"3b21de9b-333f-49d6-84b7-a616e217a26d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zh2xc" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.816423 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9tqq\" (UniqueName: \"kubernetes.io/projected/96a22bbb-2450-42e2-8c27-8246a89fbb6e-kube-api-access-q9tqq\") pod \"openshift-config-operator-7777fb866f-74lcm\" (UID: \"96a22bbb-2450-42e2-8c27-8246a89fbb6e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-74lcm" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.816488 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0858a203-42e8-4108-a0bd-48ba190bf420-client-ca\") pod \"controller-manager-879f6c89f-dtqw9\" (UID: \"0858a203-42e8-4108-a0bd-48ba190bf420\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtqw9" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.816633 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/672eeff6-8079-4f9a-a61c-40094de694be-stats-auth\") pod \"router-default-5444994796-db8rh\" (UID: \"672eeff6-8079-4f9a-a61c-40094de694be\") " pod="openshift-ingress/router-default-5444994796-db8rh" Oct 08 20:46:53 crc kubenswrapper[4669]: E1008 20:46:53.816703 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-08 20:46:54.316664439 +0000 UTC m=+134.009475152 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.816799 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0858a203-42e8-4108-a0bd-48ba190bf420-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-dtqw9\" (UID: \"0858a203-42e8-4108-a0bd-48ba190bf420\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtqw9" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.817048 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b21de9b-333f-49d6-84b7-a616e217a26d-serving-cert\") pod \"authentication-operator-69f744f599-zh2xc\" (UID: \"3b21de9b-333f-49d6-84b7-a616e217a26d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zh2xc" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.817386 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/efd51828-51ba-4723-bd28-306e48ce1a54-etcd-client\") pod \"apiserver-7bbb656c7d-mv26p\" (UID: \"efd51828-51ba-4723-bd28-306e48ce1a54\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mv26p" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.817502 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-z2pxl\" (UniqueName: \"kubernetes.io/projected/0858a203-42e8-4108-a0bd-48ba190bf420-kube-api-access-z2pxl\") pod \"controller-manager-879f6c89f-dtqw9\" (UID: \"0858a203-42e8-4108-a0bd-48ba190bf420\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtqw9" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.817611 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/efd51828-51ba-4723-bd28-306e48ce1a54-encryption-config\") pod \"apiserver-7bbb656c7d-mv26p\" (UID: \"efd51828-51ba-4723-bd28-306e48ce1a54\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mv26p" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.817705 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ab0571a-8beb-446a-9f96-b9bc226d5b22-config\") pod \"kube-controller-manager-operator-78b949d7b-7b2xv\" (UID: \"3ab0571a-8beb-446a-9f96-b9bc226d5b22\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7b2xv" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.817770 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/e45ff91e-f07d-489b-b0a0-8e815bdf41c3-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-rtm6r\" (UID: \"e45ff91e-f07d-489b-b0a0-8e815bdf41c3\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rtm6r" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.817847 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/efd51828-51ba-4723-bd28-306e48ce1a54-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-mv26p\" (UID: \"efd51828-51ba-4723-bd28-306e48ce1a54\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mv26p" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.817891 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkn2p\" (UniqueName: \"kubernetes.io/projected/64c20dfc-09aa-4096-b7c1-7233d0a18a17-kube-api-access-rkn2p\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.817923 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/efd51828-51ba-4723-bd28-306e48ce1a54-audit-policies\") pod \"apiserver-7bbb656c7d-mv26p\" (UID: \"efd51828-51ba-4723-bd28-306e48ce1a54\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mv26p" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.817996 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/672eeff6-8079-4f9a-a61c-40094de694be-default-certificate\") pod \"router-default-5444994796-db8rh\" (UID: \"672eeff6-8079-4f9a-a61c-40094de694be\") " pod="openshift-ingress/router-default-5444994796-db8rh" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.818035 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/672eeff6-8079-4f9a-a61c-40094de694be-metrics-certs\") pod \"router-default-5444994796-db8rh\" (UID: \"672eeff6-8079-4f9a-a61c-40094de694be\") " pod="openshift-ingress/router-default-5444994796-db8rh" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.818214 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/64c20dfc-09aa-4096-b7c1-7233d0a18a17-installation-pull-secrets\") 
pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.821392 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/efd51828-51ba-4723-bd28-306e48ce1a54-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-mv26p\" (UID: \"efd51828-51ba-4723-bd28-306e48ce1a54\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mv26p" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.821462 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.822622 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/efd51828-51ba-4723-bd28-306e48ce1a54-audit-policies\") pod \"apiserver-7bbb656c7d-mv26p\" (UID: \"efd51828-51ba-4723-bd28-306e48ce1a54\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mv26p" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.823751 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b21de9b-333f-49d6-84b7-a616e217a26d-service-ca-bundle\") pod \"authentication-operator-69f744f599-zh2xc\" (UID: \"3b21de9b-333f-49d6-84b7-a616e217a26d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zh2xc" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.824199 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efd51828-51ba-4723-bd28-306e48ce1a54-serving-cert\") pod \"apiserver-7bbb656c7d-mv26p\" (UID: \"efd51828-51ba-4723-bd28-306e48ce1a54\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mv26p" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.827242 4669 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b21de9b-333f-49d6-84b7-a616e217a26d-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-zh2xc\" (UID: \"3b21de9b-333f-49d6-84b7-a616e217a26d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zh2xc" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.827616 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/64c20dfc-09aa-4096-b7c1-7233d0a18a17-registry-tls\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.828609 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b21de9b-333f-49d6-84b7-a616e217a26d-serving-cert\") pod \"authentication-operator-69f744f599-zh2xc\" (UID: \"3b21de9b-333f-49d6-84b7-a616e217a26d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zh2xc" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.829160 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/efd51828-51ba-4723-bd28-306e48ce1a54-encryption-config\") pod \"apiserver-7bbb656c7d-mv26p\" (UID: \"efd51828-51ba-4723-bd28-306e48ce1a54\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mv26p" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.840086 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.842046 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/64c20dfc-09aa-4096-b7c1-7233d0a18a17-trusted-ca\") pod 
\"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.880315 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.888203 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-st97g\" (UniqueName: \"kubernetes.io/projected/2f8ab351-ab50-4289-8d0f-aae3ade74644-kube-api-access-st97g\") pod \"route-controller-manager-6576b87f9c-4xg9t\" (UID: \"2f8ab351-ab50-4289-8d0f-aae3ade74644\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4xg9t" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.900847 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 08 20:46:53 crc kubenswrapper[4669]: E1008 20:46:53.919357 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:46:54.419329079 +0000 UTC m=+134.112139782 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.919201 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.919708 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0858a203-42e8-4108-a0bd-48ba190bf420-client-ca\") pod \"controller-manager-879f6c89f-dtqw9\" (UID: \"0858a203-42e8-4108-a0bd-48ba190bf420\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtqw9" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.920686 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/672eeff6-8079-4f9a-a61c-40094de694be-stats-auth\") pod \"router-default-5444994796-db8rh\" (UID: \"672eeff6-8079-4f9a-a61c-40094de694be\") " pod="openshift-ingress/router-default-5444994796-db8rh" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.920884 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/4bbf025a-0f57-480a-80ec-4e21b1ef4e69-package-server-manager-serving-cert\") pod 
\"package-server-manager-789f6589d5-4485l\" (UID: \"4bbf025a-0f57-480a-80ec-4e21b1ef4e69\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4485l" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.921326 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a78d1dcb-341a-40b0-a96d-9ee5af65a2fe-secret-volume\") pod \"collect-profiles-29332605-xxrs4\" (UID: \"a78d1dcb-341a-40b0-a96d-9ee5af65a2fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332605-xxrs4" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.921522 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0858a203-42e8-4108-a0bd-48ba190bf420-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-dtqw9\" (UID: \"0858a203-42e8-4108-a0bd-48ba190bf420\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtqw9" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.921608 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e2fa8fe7-d057-4386-9121-f4c838bd1a76-proxy-tls\") pod \"machine-config-controller-84d6567774-tsh5h\" (UID: \"e2fa8fe7-d057-4386-9121-f4c838bd1a76\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tsh5h" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.921659 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8374c82a-8339-42ff-b216-9ffc87a4d710-etcd-ca\") pod \"etcd-operator-b45778765-4nbtf\" (UID: \"8374c82a-8339-42ff-b216-9ffc87a4d710\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4nbtf" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.921720 4669 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-z2pxl\" (UniqueName: \"kubernetes.io/projected/0858a203-42e8-4108-a0bd-48ba190bf420-kube-api-access-z2pxl\") pod \"controller-manager-879f6c89f-dtqw9\" (UID: \"0858a203-42e8-4108-a0bd-48ba190bf420\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtqw9" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.921787 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8374c82a-8339-42ff-b216-9ffc87a4d710-etcd-client\") pod \"etcd-operator-b45778765-4nbtf\" (UID: \"8374c82a-8339-42ff-b216-9ffc87a4d710\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4nbtf" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.921866 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a78d1dcb-341a-40b0-a96d-9ee5af65a2fe-config-volume\") pod \"collect-profiles-29332605-xxrs4\" (UID: \"a78d1dcb-341a-40b0-a96d-9ee5af65a2fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332605-xxrs4" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.921916 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92lf5\" (UniqueName: \"kubernetes.io/projected/fabcc99d-46ca-4172-918e-b5038c4a001a-kube-api-access-92lf5\") pod \"machine-config-server-9fcxn\" (UID: \"fabcc99d-46ca-4172-918e-b5038c4a001a\") " pod="openshift-machine-config-operator/machine-config-server-9fcxn" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.921976 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/318f04b7-d7a9-4e8f-9d0f-03699e258019-cert\") pod \"ingress-canary-kc49q\" (UID: \"318f04b7-d7a9-4e8f-9d0f-03699e258019\") " 
pod="openshift-ingress-canary/ingress-canary-kc49q" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.922046 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ab0571a-8beb-446a-9f96-b9bc226d5b22-config\") pod \"kube-controller-manager-operator-78b949d7b-7b2xv\" (UID: \"3ab0571a-8beb-446a-9f96-b9bc226d5b22\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7b2xv" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.922565 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0858a203-42e8-4108-a0bd-48ba190bf420-client-ca\") pod \"controller-manager-879f6c89f-dtqw9\" (UID: \"0858a203-42e8-4108-a0bd-48ba190bf420\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtqw9" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.924590 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ab0571a-8beb-446a-9f96-b9bc226d5b22-config\") pod \"kube-controller-manager-operator-78b949d7b-7b2xv\" (UID: \"3ab0571a-8beb-446a-9f96-b9bc226d5b22\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7b2xv" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.924685 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sghwc\" (UniqueName: \"kubernetes.io/projected/d0da4600-abf4-4f3e-8299-a269b29ca44a-kube-api-access-sghwc\") pod \"control-plane-machine-set-operator-78cbb6b69f-9vmlr\" (UID: \"d0da4600-abf4-4f3e-8299-a269b29ca44a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9vmlr" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.924761 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" 
(UniqueName: \"kubernetes.io/secret/e45ff91e-f07d-489b-b0a0-8e815bdf41c3-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-rtm6r\" (UID: \"e45ff91e-f07d-489b-b0a0-8e815bdf41c3\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rtm6r" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.924811 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hcsl\" (UniqueName: \"kubernetes.io/projected/4bbf025a-0f57-480a-80ec-4e21b1ef4e69-kube-api-access-6hcsl\") pod \"package-server-manager-789f6589d5-4485l\" (UID: \"4bbf025a-0f57-480a-80ec-4e21b1ef4e69\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4485l" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.924855 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6364f6d6-6f56-43a4-af6c-9417865f326e-socket-dir\") pod \"csi-hostpathplugin-tqhcx\" (UID: \"6364f6d6-6f56-43a4-af6c-9417865f326e\") " pod="hostpath-provisioner/csi-hostpathplugin-tqhcx" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.924899 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrdtr\" (UniqueName: \"kubernetes.io/projected/9f1ba963-c886-46de-9428-a18cc4b13eeb-kube-api-access-xrdtr\") pod \"ingress-operator-5b745b69d9-f8wcf\" (UID: \"9f1ba963-c886-46de-9428-a18cc4b13eeb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f8wcf" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.924967 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9bcf6b7f-1936-4261-a1ee-907951cb68f3-srv-cert\") pod \"olm-operator-6b444d44fb-449s9\" (UID: \"9bcf6b7f-1936-4261-a1ee-907951cb68f3\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-449s9" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.925016 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ce159705-9661-4510-a5b8-9e7ac58e524c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4knb8\" (UID: \"ce159705-9661-4510-a5b8-9e7ac58e524c\") " pod="openshift-marketplace/marketplace-operator-79b997595-4knb8" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.925097 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86c26994-b813-4c6e-af79-4c35425bc260-config\") pod \"service-ca-operator-777779d784-wn4pf\" (UID: \"86c26994-b813-4c6e-af79-4c35425bc260\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wn4pf" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.925146 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/672eeff6-8079-4f9a-a61c-40094de694be-default-certificate\") pod \"router-default-5444994796-db8rh\" (UID: \"672eeff6-8079-4f9a-a61c-40094de694be\") " pod="openshift-ingress/router-default-5444994796-db8rh" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.925192 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9bcf6b7f-1936-4261-a1ee-907951cb68f3-profile-collector-cert\") pod \"olm-operator-6b444d44fb-449s9\" (UID: \"9bcf6b7f-1936-4261-a1ee-907951cb68f3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-449s9" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.925242 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/86c26994-b813-4c6e-af79-4c35425bc260-serving-cert\") pod \"service-ca-operator-777779d784-wn4pf\" (UID: \"86c26994-b813-4c6e-af79-4c35425bc260\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wn4pf" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.925289 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jbs5\" (UniqueName: \"kubernetes.io/projected/b5e11824-a189-4e46-9548-641ef4dbe7fe-kube-api-access-9jbs5\") pod \"service-ca-9c57cc56f-wg8jg\" (UID: \"b5e11824-a189-4e46-9548-641ef4dbe7fe\") " pod="openshift-service-ca/service-ca-9c57cc56f-wg8jg" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.925339 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d0da4600-abf4-4f3e-8299-a269b29ca44a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-9vmlr\" (UID: \"d0da4600-abf4-4f3e-8299-a269b29ca44a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9vmlr" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.925392 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/672eeff6-8079-4f9a-a61c-40094de694be-metrics-certs\") pod \"router-default-5444994796-db8rh\" (UID: \"672eeff6-8079-4f9a-a61c-40094de694be\") " pod="openshift-ingress/router-default-5444994796-db8rh" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.925444 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b5e11824-a189-4e46-9548-641ef4dbe7fe-signing-key\") pod \"service-ca-9c57cc56f-wg8jg\" (UID: \"b5e11824-a189-4e46-9548-641ef4dbe7fe\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-wg8jg" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.925496 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e45ff91e-f07d-489b-b0a0-8e815bdf41c3-images\") pod \"machine-api-operator-5694c8668f-rtm6r\" (UID: \"e45ff91e-f07d-489b-b0a0-8e815bdf41c3\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rtm6r" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.925503 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0858a203-42e8-4108-a0bd-48ba190bf420-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-dtqw9\" (UID: \"0858a203-42e8-4108-a0bd-48ba190bf420\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtqw9" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.925586 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d2628b9d-5e3f-412f-b3c7-f24e6a208577-config-volume\") pod \"dns-default-sgf44\" (UID: \"d2628b9d-5e3f-412f-b3c7-f24e6a208577\") " pod="openshift-dns/dns-default-sgf44" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.925646 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3ab0571a-8beb-446a-9f96-b9bc226d5b22-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-7b2xv\" (UID: \"3ab0571a-8beb-446a-9f96-b9bc226d5b22\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7b2xv" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.925697 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/6364f6d6-6f56-43a4-af6c-9417865f326e-csi-data-dir\") pod 
\"csi-hostpathplugin-tqhcx\" (UID: \"6364f6d6-6f56-43a4-af6c-9417865f326e\") " pod="hostpath-provisioner/csi-hostpathplugin-tqhcx" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.925746 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8374c82a-8339-42ff-b216-9ffc87a4d710-serving-cert\") pod \"etcd-operator-b45778765-4nbtf\" (UID: \"8374c82a-8339-42ff-b216-9ffc87a4d710\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4nbtf" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.925809 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d62a0f2d-49ef-4291-a14d-940a6215347c-apiservice-cert\") pod \"packageserver-d55dfcdfc-r6qfk\" (UID: \"d62a0f2d-49ef-4291-a14d-940a6215347c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r6qfk" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.925859 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d62a0f2d-49ef-4291-a14d-940a6215347c-webhook-cert\") pod \"packageserver-d55dfcdfc-r6qfk\" (UID: \"d62a0f2d-49ef-4291-a14d-940a6215347c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r6qfk" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.925934 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e45ff91e-f07d-489b-b0a0-8e815bdf41c3-config\") pod \"machine-api-operator-5694c8668f-rtm6r\" (UID: \"e45ff91e-f07d-489b-b0a0-8e815bdf41c3\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rtm6r" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.925983 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0858a203-42e8-4108-a0bd-48ba190bf420-serving-cert\") pod \"controller-manager-879f6c89f-dtqw9\" (UID: \"0858a203-42e8-4108-a0bd-48ba190bf420\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtqw9" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.926040 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/fabcc99d-46ca-4172-918e-b5038c4a001a-certs\") pod \"machine-config-server-9fcxn\" (UID: \"fabcc99d-46ca-4172-918e-b5038c4a001a\") " pod="openshift-machine-config-operator/machine-config-server-9fcxn" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.926090 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/fabcc99d-46ca-4172-918e-b5038c4a001a-node-bootstrap-token\") pod \"machine-config-server-9fcxn\" (UID: \"fabcc99d-46ca-4172-918e-b5038c4a001a\") " pod="openshift-machine-config-operator/machine-config-server-9fcxn" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.926132 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r2hq\" (UniqueName: \"kubernetes.io/projected/5dd3b6ce-7492-4273-8d1b-86dc10a03404-kube-api-access-7r2hq\") pod \"migrator-59844c95c7-r7zkf\" (UID: \"5dd3b6ce-7492-4273-8d1b-86dc10a03404\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-r7zkf" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.926180 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7cd4\" (UniqueName: \"kubernetes.io/projected/d62a0f2d-49ef-4291-a14d-940a6215347c-kube-api-access-d7cd4\") pod \"packageserver-d55dfcdfc-r6qfk\" (UID: \"d62a0f2d-49ef-4291-a14d-940a6215347c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r6qfk" Oct 08 20:46:53 
crc kubenswrapper[4669]: I1008 20:46:53.926223 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b5e11824-a189-4e46-9548-641ef4dbe7fe-signing-cabundle\") pod \"service-ca-9c57cc56f-wg8jg\" (UID: \"b5e11824-a189-4e46-9548-641ef4dbe7fe\") " pod="openshift-service-ca/service-ca-9c57cc56f-wg8jg" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.926301 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/672eeff6-8079-4f9a-a61c-40094de694be-service-ca-bundle\") pod \"router-default-5444994796-db8rh\" (UID: \"672eeff6-8079-4f9a-a61c-40094de694be\") " pod="openshift-ingress/router-default-5444994796-db8rh" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.926409 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsk6x\" (UniqueName: \"kubernetes.io/projected/dca95b5e-6308-41f2-a01d-9cb5c15b8607-kube-api-access-fsk6x\") pod \"catalog-operator-68c6474976-dsg9n\" (UID: \"dca95b5e-6308-41f2-a01d-9cb5c15b8607\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dsg9n" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.926466 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw7cv\" (UniqueName: \"kubernetes.io/projected/f80324fe-a4f1-44ba-a946-acade1dcb4e0-kube-api-access-tw7cv\") pod \"multus-admission-controller-857f4d67dd-dtnh6\" (UID: \"f80324fe-a4f1-44ba-a946-acade1dcb4e0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dtnh6" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.926593 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b77bb83-431f-4431-93cc-04be87ab96dc-serving-cert\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-lwdg7\" (UID: \"9b77bb83-431f-4431-93cc-04be87ab96dc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lwdg7" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.926647 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ee97e8a2-071a-48f9-a13f-d66fb60bbb55-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lgt96\" (UID: \"ee97e8a2-071a-48f9-a13f-d66fb60bbb55\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lgt96" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.926699 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9f1ba963-c886-46de-9428-a18cc4b13eeb-metrics-tls\") pod \"ingress-operator-5b745b69d9-f8wcf\" (UID: \"9f1ba963-c886-46de-9428-a18cc4b13eeb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f8wcf" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.926756 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9f1ba963-c886-46de-9428-a18cc4b13eeb-bound-sa-token\") pod \"ingress-operator-5b745b69d9-f8wcf\" (UID: \"9f1ba963-c886-46de-9428-a18cc4b13eeb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f8wcf" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.926803 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8374c82a-8339-42ff-b216-9ffc87a4d710-config\") pod \"etcd-operator-b45778765-4nbtf\" (UID: \"8374c82a-8339-42ff-b216-9ffc87a4d710\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4nbtf" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 
20:46:53.926855 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a558b91-227c-4e11-aa99-4406e545a2ea-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-sv8nt\" (UID: \"7a558b91-227c-4e11-aa99-4406e545a2ea\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sv8nt" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.926945 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8f55\" (UniqueName: \"kubernetes.io/projected/6364f6d6-6f56-43a4-af6c-9417865f326e-kube-api-access-p8f55\") pod \"csi-hostpathplugin-tqhcx\" (UID: \"6364f6d6-6f56-43a4-af6c-9417865f326e\") " pod="hostpath-provisioner/csi-hostpathplugin-tqhcx" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.927001 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/21147ccd-e3af-44e0-a7fd-2931e731cc53-auth-proxy-config\") pod \"machine-config-operator-74547568cd-67zzr\" (UID: \"21147ccd-e3af-44e0-a7fd-2931e731cc53\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-67zzr" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.927052 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/96a22bbb-2450-42e2-8c27-8246a89fbb6e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-74lcm\" (UID: \"96a22bbb-2450-42e2-8c27-8246a89fbb6e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-74lcm" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.927101 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee97e8a2-071a-48f9-a13f-d66fb60bbb55-config\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-lgt96\" (UID: \"ee97e8a2-071a-48f9-a13f-d66fb60bbb55\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lgt96" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.927148 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8xnc\" (UniqueName: \"kubernetes.io/projected/21147ccd-e3af-44e0-a7fd-2931e731cc53-kube-api-access-t8xnc\") pod \"machine-config-operator-74547568cd-67zzr\" (UID: \"21147ccd-e3af-44e0-a7fd-2931e731cc53\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-67zzr" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.927199 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/dca95b5e-6308-41f2-a01d-9cb5c15b8607-profile-collector-cert\") pod \"catalog-operator-68c6474976-dsg9n\" (UID: \"dca95b5e-6308-41f2-a01d-9cb5c15b8607\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dsg9n" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.927278 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee97e8a2-071a-48f9-a13f-d66fb60bbb55-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lgt96\" (UID: \"ee97e8a2-071a-48f9-a13f-d66fb60bbb55\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lgt96" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.927328 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f80324fe-a4f1-44ba-a946-acade1dcb4e0-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-dtnh6\" (UID: \"f80324fe-a4f1-44ba-a946-acade1dcb4e0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dtnh6" 
Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.927390 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spk62\" (UniqueName: \"kubernetes.io/projected/a78d1dcb-341a-40b0-a96d-9ee5af65a2fe-kube-api-access-spk62\") pod \"collect-profiles-29332605-xxrs4\" (UID: \"a78d1dcb-341a-40b0-a96d-9ee5af65a2fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332605-xxrs4" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.927442 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e2fa8fe7-d057-4386-9121-f4c838bd1a76-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-tsh5h\" (UID: \"e2fa8fe7-d057-4386-9121-f4c838bd1a76\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tsh5h" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.927489 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ce159705-9661-4510-a5b8-9e7ac58e524c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4knb8\" (UID: \"ce159705-9661-4510-a5b8-9e7ac58e524c\") " pod="openshift-marketplace/marketplace-operator-79b997595-4knb8" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.927605 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96a22bbb-2450-42e2-8c27-8246a89fbb6e-serving-cert\") pod \"openshift-config-operator-7777fb866f-74lcm\" (UID: \"96a22bbb-2450-42e2-8c27-8246a89fbb6e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-74lcm" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.927659 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wzxn\" (UniqueName: 
\"kubernetes.io/projected/d2628b9d-5e3f-412f-b3c7-f24e6a208577-kube-api-access-7wzxn\") pod \"dns-default-sgf44\" (UID: \"d2628b9d-5e3f-412f-b3c7-f24e6a208577\") " pod="openshift-dns/dns-default-sgf44" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.927695 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/672eeff6-8079-4f9a-a61c-40094de694be-stats-auth\") pod \"router-default-5444994796-db8rh\" (UID: \"672eeff6-8079-4f9a-a61c-40094de694be\") " pod="openshift-ingress/router-default-5444994796-db8rh" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.927724 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a558b91-227c-4e11-aa99-4406e545a2ea-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-sv8nt\" (UID: \"7a558b91-227c-4e11-aa99-4406e545a2ea\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sv8nt" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.927784 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh69r\" (UniqueName: \"kubernetes.io/projected/ce159705-9661-4510-a5b8-9e7ac58e524c-kube-api-access-fh69r\") pod \"marketplace-operator-79b997595-4knb8\" (UID: \"ce159705-9661-4510-a5b8-9e7ac58e524c\") " pod="openshift-marketplace/marketplace-operator-79b997595-4knb8" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.927843 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/21147ccd-e3af-44e0-a7fd-2931e731cc53-images\") pod \"machine-config-operator-74547568cd-67zzr\" (UID: \"21147ccd-e3af-44e0-a7fd-2931e731cc53\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-67zzr" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.927914 4669 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c6254786-5db9-423f-88ed-f42edefc70a8-metrics-tls\") pod \"dns-operator-744455d44c-xndc2\" (UID: \"c6254786-5db9-423f-88ed-f42edefc70a8\") " pod="openshift-dns-operator/dns-operator-744455d44c-xndc2" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.927969 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/8374c82a-8339-42ff-b216-9ffc87a4d710-etcd-service-ca\") pod \"etcd-operator-b45778765-4nbtf\" (UID: \"8374c82a-8339-42ff-b216-9ffc87a4d710\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4nbtf" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.928022 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0858a203-42e8-4108-a0bd-48ba190bf420-config\") pod \"controller-manager-879f6c89f-dtqw9\" (UID: \"0858a203-42e8-4108-a0bd-48ba190bf420\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtqw9" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.928070 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6m6zm\" (UniqueName: \"kubernetes.io/projected/672eeff6-8079-4f9a-a61c-40094de694be-kube-api-access-6m6zm\") pod \"router-default-5444994796-db8rh\" (UID: \"672eeff6-8079-4f9a-a61c-40094de694be\") " pod="openshift-ingress/router-default-5444994796-db8rh" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.928120 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blrlq\" (UniqueName: \"kubernetes.io/projected/9b77bb83-431f-4431-93cc-04be87ab96dc-kube-api-access-blrlq\") pod \"kube-storage-version-migrator-operator-b67b599dd-lwdg7\" (UID: \"9b77bb83-431f-4431-93cc-04be87ab96dc\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lwdg7" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.928170 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s292x\" (UniqueName: \"kubernetes.io/projected/c6254786-5db9-423f-88ed-f42edefc70a8-kube-api-access-s292x\") pod \"dns-operator-744455d44c-xndc2\" (UID: \"c6254786-5db9-423f-88ed-f42edefc70a8\") " pod="openshift-dns-operator/dns-operator-744455d44c-xndc2" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.928221 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6364f6d6-6f56-43a4-af6c-9417865f326e-registration-dir\") pod \"csi-hostpathplugin-tqhcx\" (UID: \"6364f6d6-6f56-43a4-af6c-9417865f326e\") " pod="hostpath-provisioner/csi-hostpathplugin-tqhcx" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.928289 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8tjh\" (UniqueName: \"kubernetes.io/projected/e45ff91e-f07d-489b-b0a0-8e815bdf41c3-kube-api-access-s8tjh\") pod \"machine-api-operator-5694c8668f-rtm6r\" (UID: \"e45ff91e-f07d-489b-b0a0-8e815bdf41c3\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rtm6r" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.928341 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t224m\" (UniqueName: \"kubernetes.io/projected/8374c82a-8339-42ff-b216-9ffc87a4d710-kube-api-access-t224m\") pod \"etcd-operator-b45778765-4nbtf\" (UID: \"8374c82a-8339-42ff-b216-9ffc87a4d710\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4nbtf" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.928398 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsstc\" (UniqueName: 
\"kubernetes.io/projected/7a558b91-227c-4e11-aa99-4406e545a2ea-kube-api-access-lsstc\") pod \"openshift-controller-manager-operator-756b6f6bc6-sv8nt\" (UID: \"7a558b91-227c-4e11-aa99-4406e545a2ea\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sv8nt" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.928445 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d2628b9d-5e3f-412f-b3c7-f24e6a208577-metrics-tls\") pod \"dns-default-sgf44\" (UID: \"d2628b9d-5e3f-412f-b3c7-f24e6a208577\") " pod="openshift-dns/dns-default-sgf44" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.928494 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65wxt\" (UniqueName: \"kubernetes.io/projected/e2fa8fe7-d057-4386-9121-f4c838bd1a76-kube-api-access-65wxt\") pod \"machine-config-controller-84d6567774-tsh5h\" (UID: \"e2fa8fe7-d057-4386-9121-f4c838bd1a76\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tsh5h" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.928589 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b77bb83-431f-4431-93cc-04be87ab96dc-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-lwdg7\" (UID: \"9b77bb83-431f-4431-93cc-04be87ab96dc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lwdg7" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.928641 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/21147ccd-e3af-44e0-a7fd-2931e731cc53-proxy-tls\") pod \"machine-config-operator-74547568cd-67zzr\" (UID: \"21147ccd-e3af-44e0-a7fd-2931e731cc53\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-67zzr" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.928696 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc5pc\" (UniqueName: \"kubernetes.io/projected/9bcf6b7f-1936-4261-a1ee-907951cb68f3-kube-api-access-fc5pc\") pod \"olm-operator-6b444d44fb-449s9\" (UID: \"9bcf6b7f-1936-4261-a1ee-907951cb68f3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-449s9" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.928767 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9f1ba963-c886-46de-9428-a18cc4b13eeb-trusted-ca\") pod \"ingress-operator-5b745b69d9-f8wcf\" (UID: \"9f1ba963-c886-46de-9428-a18cc4b13eeb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f8wcf" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.928820 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmck4\" (UniqueName: \"kubernetes.io/projected/318f04b7-d7a9-4e8f-9d0f-03699e258019-kube-api-access-bmck4\") pod \"ingress-canary-kc49q\" (UID: \"318f04b7-d7a9-4e8f-9d0f-03699e258019\") " pod="openshift-ingress-canary/ingress-canary-kc49q" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.928827 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/e45ff91e-f07d-489b-b0a0-8e815bdf41c3-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-rtm6r\" (UID: \"e45ff91e-f07d-489b-b0a0-8e815bdf41c3\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rtm6r" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.928995 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.929059 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghbg9\" (UniqueName: \"kubernetes.io/projected/86c26994-b813-4c6e-af79-4c35425bc260-kube-api-access-ghbg9\") pod \"service-ca-operator-777779d784-wn4pf\" (UID: \"86c26994-b813-4c6e-af79-4c35425bc260\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wn4pf" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.929143 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab0571a-8beb-446a-9f96-b9bc226d5b22-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-7b2xv\" (UID: \"3ab0571a-8beb-446a-9f96-b9bc226d5b22\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7b2xv" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.929191 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/6364f6d6-6f56-43a4-af6c-9417865f326e-mountpoint-dir\") pod \"csi-hostpathplugin-tqhcx\" (UID: \"6364f6d6-6f56-43a4-af6c-9417865f326e\") " pod="hostpath-provisioner/csi-hostpathplugin-tqhcx" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.929235 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/dca95b5e-6308-41f2-a01d-9cb5c15b8607-srv-cert\") pod \"catalog-operator-68c6474976-dsg9n\" (UID: \"dca95b5e-6308-41f2-a01d-9cb5c15b8607\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dsg9n" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.929283 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d62a0f2d-49ef-4291-a14d-940a6215347c-tmpfs\") pod \"packageserver-d55dfcdfc-r6qfk\" (UID: \"d62a0f2d-49ef-4291-a14d-940a6215347c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r6qfk" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.929338 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9tqq\" (UniqueName: \"kubernetes.io/projected/96a22bbb-2450-42e2-8c27-8246a89fbb6e-kube-api-access-q9tqq\") pod \"openshift-config-operator-7777fb866f-74lcm\" (UID: \"96a22bbb-2450-42e2-8c27-8246a89fbb6e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-74lcm" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.929386 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/6364f6d6-6f56-43a4-af6c-9417865f326e-plugins-dir\") pod \"csi-hostpathplugin-tqhcx\" (UID: \"6364f6d6-6f56-43a4-af6c-9417865f326e\") " pod="hostpath-provisioner/csi-hostpathplugin-tqhcx" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.929429 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/672eeff6-8079-4f9a-a61c-40094de694be-default-certificate\") pod \"router-default-5444994796-db8rh\" (UID: \"672eeff6-8079-4f9a-a61c-40094de694be\") " pod="openshift-ingress/router-default-5444994796-db8rh" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.932123 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96a22bbb-2450-42e2-8c27-8246a89fbb6e-serving-cert\") pod 
\"openshift-config-operator-7777fb866f-74lcm\" (UID: \"96a22bbb-2450-42e2-8c27-8246a89fbb6e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-74lcm" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.933131 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/21147ccd-e3af-44e0-a7fd-2931e731cc53-auth-proxy-config\") pod \"machine-config-operator-74547568cd-67zzr\" (UID: \"21147ccd-e3af-44e0-a7fd-2931e731cc53\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-67zzr" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.933615 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/96a22bbb-2450-42e2-8c27-8246a89fbb6e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-74lcm\" (UID: \"96a22bbb-2450-42e2-8c27-8246a89fbb6e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-74lcm" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.934204 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e45ff91e-f07d-489b-b0a0-8e815bdf41c3-config\") pod \"machine-api-operator-5694c8668f-rtm6r\" (UID: \"e45ff91e-f07d-489b-b0a0-8e815bdf41c3\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rtm6r" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.934703 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee97e8a2-071a-48f9-a13f-d66fb60bbb55-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lgt96\" (UID: \"ee97e8a2-071a-48f9-a13f-d66fb60bbb55\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lgt96" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.935326 4669 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/672eeff6-8079-4f9a-a61c-40094de694be-service-ca-bundle\") pod \"router-default-5444994796-db8rh\" (UID: \"672eeff6-8079-4f9a-a61c-40094de694be\") " pod="openshift-ingress/router-default-5444994796-db8rh" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.936278 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/672eeff6-8079-4f9a-a61c-40094de694be-metrics-certs\") pod \"router-default-5444994796-db8rh\" (UID: \"672eeff6-8079-4f9a-a61c-40094de694be\") " pod="openshift-ingress/router-default-5444994796-db8rh" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.936851 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a558b91-227c-4e11-aa99-4406e545a2ea-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-sv8nt\" (UID: \"7a558b91-227c-4e11-aa99-4406e545a2ea\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sv8nt" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.936863 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b77bb83-431f-4431-93cc-04be87ab96dc-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-lwdg7\" (UID: \"9b77bb83-431f-4431-93cc-04be87ab96dc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lwdg7" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.937445 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e45ff91e-f07d-489b-b0a0-8e815bdf41c3-images\") pod \"machine-api-operator-5694c8668f-rtm6r\" (UID: \"e45ff91e-f07d-489b-b0a0-8e815bdf41c3\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rtm6r" Oct 08 20:46:53 
crc kubenswrapper[4669]: I1008 20:46:53.937911 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/21147ccd-e3af-44e0-a7fd-2931e731cc53-images\") pod \"machine-config-operator-74547568cd-67zzr\" (UID: \"21147ccd-e3af-44e0-a7fd-2931e731cc53\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-67zzr" Oct 08 20:46:53 crc kubenswrapper[4669]: E1008 20:46:53.939190 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:46:54.439169122 +0000 UTC m=+134.131979815 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.940764 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0858a203-42e8-4108-a0bd-48ba190bf420-config\") pod \"controller-manager-879f6c89f-dtqw9\" (UID: \"0858a203-42e8-4108-a0bd-48ba190bf420\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtqw9" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.941626 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d2628b9d-5e3f-412f-b3c7-f24e6a208577-config-volume\") pod \"dns-default-sgf44\" (UID: \"d2628b9d-5e3f-412f-b3c7-f24e6a208577\") " pod="openshift-dns/dns-default-sgf44" Oct 08 20:46:53 crc 
kubenswrapper[4669]: I1008 20:46:53.941934 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee97e8a2-071a-48f9-a13f-d66fb60bbb55-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lgt96\" (UID: \"ee97e8a2-071a-48f9-a13f-d66fb60bbb55\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lgt96" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.942365 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b77bb83-431f-4431-93cc-04be87ab96dc-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-lwdg7\" (UID: \"9b77bb83-431f-4431-93cc-04be87ab96dc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lwdg7" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.948106 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c6254786-5db9-423f-88ed-f42edefc70a8-metrics-tls\") pod \"dns-operator-744455d44c-xndc2\" (UID: \"c6254786-5db9-423f-88ed-f42edefc70a8\") " pod="openshift-dns-operator/dns-operator-744455d44c-xndc2" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.948331 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/21147ccd-e3af-44e0-a7fd-2931e731cc53-proxy-tls\") pod \"machine-config-operator-74547568cd-67zzr\" (UID: \"21147ccd-e3af-44e0-a7fd-2931e731cc53\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-67zzr" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.948772 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0858a203-42e8-4108-a0bd-48ba190bf420-serving-cert\") pod \"controller-manager-879f6c89f-dtqw9\" (UID: 
\"0858a203-42e8-4108-a0bd-48ba190bf420\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtqw9" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.948941 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d2628b9d-5e3f-412f-b3c7-f24e6a208577-metrics-tls\") pod \"dns-default-sgf44\" (UID: \"d2628b9d-5e3f-412f-b3c7-f24e6a208577\") " pod="openshift-dns/dns-default-sgf44" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.950081 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab0571a-8beb-446a-9f96-b9bc226d5b22-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-7b2xv\" (UID: \"3ab0571a-8beb-446a-9f96-b9bc226d5b22\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7b2xv" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.950232 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwmct\" (UniqueName: \"kubernetes.io/projected/01878aca-a23f-4add-bedf-94cc3b75b481-kube-api-access-fwmct\") pod \"openshift-apiserver-operator-796bbdcf4f-4rzvg\" (UID: \"01878aca-a23f-4add-bedf-94cc3b75b481\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4rzvg" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.951587 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a558b91-227c-4e11-aa99-4406e545a2ea-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-sv8nt\" (UID: \"7a558b91-227c-4e11-aa99-4406e545a2ea\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sv8nt" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.968387 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7pw6\" (UniqueName: 
\"kubernetes.io/projected/b0dc8a08-0e25-4dba-9e96-089c46deb679-kube-api-access-v7pw6\") pod \"cluster-samples-operator-665b6dd947-r2ncb\" (UID: \"b0dc8a08-0e25-4dba-9e96-089c46deb679\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r2ncb" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.980886 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhkjw\" (UniqueName: \"kubernetes.io/projected/5289f930-ba47-4745-8ab7-784863dc110e-kube-api-access-bhkjw\") pod \"oauth-openshift-558db77b4-6l6tw\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.995047 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3434800b-823a-4050-863e-1d2240a29709-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-2k2t4\" (UID: \"3434800b-823a-4050-863e-1d2240a29709\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2k2t4" Oct 08 20:46:53 crc kubenswrapper[4669]: I1008 20:46:53.997750 4669 request.go:700] Waited for 1.916301931s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-apiserver/serviceaccounts/openshift-apiserver-sa/token Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.018800 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxzll\" (UniqueName: \"kubernetes.io/projected/e8b346cb-e447-4473-a94d-a66882c3af6f-kube-api-access-wxzll\") pod \"apiserver-76f77b778f-25b8x\" (UID: \"e8b346cb-e447-4473-a94d-a66882c3af6f\") " pod="openshift-apiserver/apiserver-76f77b778f-25b8x" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.030032 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.030218 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/8374c82a-8339-42ff-b216-9ffc87a4d710-etcd-service-ca\") pod \"etcd-operator-b45778765-4nbtf\" (UID: \"8374c82a-8339-42ff-b216-9ffc87a4d710\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4nbtf" Oct 08 20:46:54 crc kubenswrapper[4669]: E1008 20:46:54.030303 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:46:54.530269102 +0000 UTC m=+134.223079785 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.030396 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6364f6d6-6f56-43a4-af6c-9417865f326e-registration-dir\") pod \"csi-hostpathplugin-tqhcx\" (UID: \"6364f6d6-6f56-43a4-af6c-9417865f326e\") " pod="hostpath-provisioner/csi-hostpathplugin-tqhcx" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.030432 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t224m\" (UniqueName: \"kubernetes.io/projected/8374c82a-8339-42ff-b216-9ffc87a4d710-kube-api-access-t224m\") pod \"etcd-operator-b45778765-4nbtf\" (UID: \"8374c82a-8339-42ff-b216-9ffc87a4d710\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4nbtf" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.030459 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65wxt\" (UniqueName: \"kubernetes.io/projected/e2fa8fe7-d057-4386-9121-f4c838bd1a76-kube-api-access-65wxt\") pod \"machine-config-controller-84d6567774-tsh5h\" (UID: \"e2fa8fe7-d057-4386-9121-f4c838bd1a76\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tsh5h" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.030491 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fc5pc\" (UniqueName: \"kubernetes.io/projected/9bcf6b7f-1936-4261-a1ee-907951cb68f3-kube-api-access-fc5pc\") pod 
\"olm-operator-6b444d44fb-449s9\" (UID: \"9bcf6b7f-1936-4261-a1ee-907951cb68f3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-449s9" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.030524 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9f1ba963-c886-46de-9428-a18cc4b13eeb-trusted-ca\") pod \"ingress-operator-5b745b69d9-f8wcf\" (UID: \"9f1ba963-c886-46de-9428-a18cc4b13eeb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f8wcf" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.030580 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmck4\" (UniqueName: \"kubernetes.io/projected/318f04b7-d7a9-4e8f-9d0f-03699e258019-kube-api-access-bmck4\") pod \"ingress-canary-kc49q\" (UID: \"318f04b7-d7a9-4e8f-9d0f-03699e258019\") " pod="openshift-ingress-canary/ingress-canary-kc49q" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.030607 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.030633 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghbg9\" (UniqueName: \"kubernetes.io/projected/86c26994-b813-4c6e-af79-4c35425bc260-kube-api-access-ghbg9\") pod \"service-ca-operator-777779d784-wn4pf\" (UID: \"86c26994-b813-4c6e-af79-4c35425bc260\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wn4pf" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.030682 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/6364f6d6-6f56-43a4-af6c-9417865f326e-mountpoint-dir\") pod \"csi-hostpathplugin-tqhcx\" (UID: \"6364f6d6-6f56-43a4-af6c-9417865f326e\") " pod="hostpath-provisioner/csi-hostpathplugin-tqhcx" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.030704 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/dca95b5e-6308-41f2-a01d-9cb5c15b8607-srv-cert\") pod \"catalog-operator-68c6474976-dsg9n\" (UID: \"dca95b5e-6308-41f2-a01d-9cb5c15b8607\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dsg9n" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.030729 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d62a0f2d-49ef-4291-a14d-940a6215347c-tmpfs\") pod \"packageserver-d55dfcdfc-r6qfk\" (UID: \"d62a0f2d-49ef-4291-a14d-940a6215347c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r6qfk" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.030763 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/6364f6d6-6f56-43a4-af6c-9417865f326e-plugins-dir\") pod \"csi-hostpathplugin-tqhcx\" (UID: \"6364f6d6-6f56-43a4-af6c-9417865f326e\") " pod="hostpath-provisioner/csi-hostpathplugin-tqhcx" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.030798 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/4bbf025a-0f57-480a-80ec-4e21b1ef4e69-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-4485l\" (UID: \"4bbf025a-0f57-480a-80ec-4e21b1ef4e69\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4485l" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.030823 4669 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a78d1dcb-341a-40b0-a96d-9ee5af65a2fe-secret-volume\") pod \"collect-profiles-29332605-xxrs4\" (UID: \"a78d1dcb-341a-40b0-a96d-9ee5af65a2fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332605-xxrs4" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.030868 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e2fa8fe7-d057-4386-9121-f4c838bd1a76-proxy-tls\") pod \"machine-config-controller-84d6567774-tsh5h\" (UID: \"e2fa8fe7-d057-4386-9121-f4c838bd1a76\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tsh5h" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.030890 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/8374c82a-8339-42ff-b216-9ffc87a4d710-etcd-service-ca\") pod \"etcd-operator-b45778765-4nbtf\" (UID: \"8374c82a-8339-42ff-b216-9ffc87a4d710\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4nbtf" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.030892 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8374c82a-8339-42ff-b216-9ffc87a4d710-etcd-ca\") pod \"etcd-operator-b45778765-4nbtf\" (UID: \"8374c82a-8339-42ff-b216-9ffc87a4d710\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4nbtf" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.030964 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6364f6d6-6f56-43a4-af6c-9417865f326e-registration-dir\") pod \"csi-hostpathplugin-tqhcx\" (UID: \"6364f6d6-6f56-43a4-af6c-9417865f326e\") " pod="hostpath-provisioner/csi-hostpathplugin-tqhcx" Oct 08 20:46:54 crc kubenswrapper[4669]: 
I1008 20:46:54.030820 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/6364f6d6-6f56-43a4-af6c-9417865f326e-mountpoint-dir\") pod \"csi-hostpathplugin-tqhcx\" (UID: \"6364f6d6-6f56-43a4-af6c-9417865f326e\") " pod="hostpath-provisioner/csi-hostpathplugin-tqhcx" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.031003 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8374c82a-8339-42ff-b216-9ffc87a4d710-etcd-client\") pod \"etcd-operator-b45778765-4nbtf\" (UID: \"8374c82a-8339-42ff-b216-9ffc87a4d710\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4nbtf" Oct 08 20:46:54 crc kubenswrapper[4669]: E1008 20:46:54.031087 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:46:54.53107072 +0000 UTC m=+134.223881393 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.031117 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a78d1dcb-341a-40b0-a96d-9ee5af65a2fe-config-volume\") pod \"collect-profiles-29332605-xxrs4\" (UID: \"a78d1dcb-341a-40b0-a96d-9ee5af65a2fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332605-xxrs4" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.031153 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92lf5\" (UniqueName: \"kubernetes.io/projected/fabcc99d-46ca-4172-918e-b5038c4a001a-kube-api-access-92lf5\") pod \"machine-config-server-9fcxn\" (UID: \"fabcc99d-46ca-4172-918e-b5038c4a001a\") " pod="openshift-machine-config-operator/machine-config-server-9fcxn" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.030897 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/6364f6d6-6f56-43a4-af6c-9417865f326e-plugins-dir\") pod \"csi-hostpathplugin-tqhcx\" (UID: \"6364f6d6-6f56-43a4-af6c-9417865f326e\") " pod="hostpath-provisioner/csi-hostpathplugin-tqhcx" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.031180 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/318f04b7-d7a9-4e8f-9d0f-03699e258019-cert\") pod \"ingress-canary-kc49q\" (UID: \"318f04b7-d7a9-4e8f-9d0f-03699e258019\") " 
pod="openshift-ingress-canary/ingress-canary-kc49q" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.031337 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sghwc\" (UniqueName: \"kubernetes.io/projected/d0da4600-abf4-4f3e-8299-a269b29ca44a-kube-api-access-sghwc\") pod \"control-plane-machine-set-operator-78cbb6b69f-9vmlr\" (UID: \"d0da4600-abf4-4f3e-8299-a269b29ca44a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9vmlr" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.031398 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hcsl\" (UniqueName: \"kubernetes.io/projected/4bbf025a-0f57-480a-80ec-4e21b1ef4e69-kube-api-access-6hcsl\") pod \"package-server-manager-789f6589d5-4485l\" (UID: \"4bbf025a-0f57-480a-80ec-4e21b1ef4e69\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4485l" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.031428 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6364f6d6-6f56-43a4-af6c-9417865f326e-socket-dir\") pod \"csi-hostpathplugin-tqhcx\" (UID: \"6364f6d6-6f56-43a4-af6c-9417865f326e\") " pod="hostpath-provisioner/csi-hostpathplugin-tqhcx" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.031478 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrdtr\" (UniqueName: \"kubernetes.io/projected/9f1ba963-c886-46de-9428-a18cc4b13eeb-kube-api-access-xrdtr\") pod \"ingress-operator-5b745b69d9-f8wcf\" (UID: \"9f1ba963-c886-46de-9428-a18cc4b13eeb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f8wcf" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.031511 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/9bcf6b7f-1936-4261-a1ee-907951cb68f3-srv-cert\") pod \"olm-operator-6b444d44fb-449s9\" (UID: \"9bcf6b7f-1936-4261-a1ee-907951cb68f3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-449s9" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.031574 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ce159705-9661-4510-a5b8-9e7ac58e524c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4knb8\" (UID: \"ce159705-9661-4510-a5b8-9e7ac58e524c\") " pod="openshift-marketplace/marketplace-operator-79b997595-4knb8" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.031611 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86c26994-b813-4c6e-af79-4c35425bc260-config\") pod \"service-ca-operator-777779d784-wn4pf\" (UID: \"86c26994-b813-4c6e-af79-4c35425bc260\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wn4pf" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.031812 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9bcf6b7f-1936-4261-a1ee-907951cb68f3-profile-collector-cert\") pod \"olm-operator-6b444d44fb-449s9\" (UID: \"9bcf6b7f-1936-4261-a1ee-907951cb68f3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-449s9" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.031839 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86c26994-b813-4c6e-af79-4c35425bc260-serving-cert\") pod \"service-ca-operator-777779d784-wn4pf\" (UID: \"86c26994-b813-4c6e-af79-4c35425bc260\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wn4pf" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.031881 4669 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jbs5\" (UniqueName: \"kubernetes.io/projected/b5e11824-a189-4e46-9548-641ef4dbe7fe-kube-api-access-9jbs5\") pod \"service-ca-9c57cc56f-wg8jg\" (UID: \"b5e11824-a189-4e46-9548-641ef4dbe7fe\") " pod="openshift-service-ca/service-ca-9c57cc56f-wg8jg" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.031903 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d0da4600-abf4-4f3e-8299-a269b29ca44a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-9vmlr\" (UID: \"d0da4600-abf4-4f3e-8299-a269b29ca44a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9vmlr" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.031924 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b5e11824-a189-4e46-9548-641ef4dbe7fe-signing-key\") pod \"service-ca-9c57cc56f-wg8jg\" (UID: \"b5e11824-a189-4e46-9548-641ef4dbe7fe\") " pod="openshift-service-ca/service-ca-9c57cc56f-wg8jg" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.031953 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/6364f6d6-6f56-43a4-af6c-9417865f326e-csi-data-dir\") pod \"csi-hostpathplugin-tqhcx\" (UID: \"6364f6d6-6f56-43a4-af6c-9417865f326e\") " pod="hostpath-provisioner/csi-hostpathplugin-tqhcx" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.031975 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8374c82a-8339-42ff-b216-9ffc87a4d710-serving-cert\") pod \"etcd-operator-b45778765-4nbtf\" (UID: \"8374c82a-8339-42ff-b216-9ffc87a4d710\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-4nbtf" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.032000 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d62a0f2d-49ef-4291-a14d-940a6215347c-apiservice-cert\") pod \"packageserver-d55dfcdfc-r6qfk\" (UID: \"d62a0f2d-49ef-4291-a14d-940a6215347c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r6qfk" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.032022 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d62a0f2d-49ef-4291-a14d-940a6215347c-webhook-cert\") pod \"packageserver-d55dfcdfc-r6qfk\" (UID: \"d62a0f2d-49ef-4291-a14d-940a6215347c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r6qfk" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.032042 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/fabcc99d-46ca-4172-918e-b5038c4a001a-certs\") pod \"machine-config-server-9fcxn\" (UID: \"fabcc99d-46ca-4172-918e-b5038c4a001a\") " pod="openshift-machine-config-operator/machine-config-server-9fcxn" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.032061 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/fabcc99d-46ca-4172-918e-b5038c4a001a-node-bootstrap-token\") pod \"machine-config-server-9fcxn\" (UID: \"fabcc99d-46ca-4172-918e-b5038c4a001a\") " pod="openshift-machine-config-operator/machine-config-server-9fcxn" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.032078 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7r2hq\" (UniqueName: \"kubernetes.io/projected/5dd3b6ce-7492-4273-8d1b-86dc10a03404-kube-api-access-7r2hq\") pod 
\"migrator-59844c95c7-r7zkf\" (UID: \"5dd3b6ce-7492-4273-8d1b-86dc10a03404\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-r7zkf" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.032097 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7cd4\" (UniqueName: \"kubernetes.io/projected/d62a0f2d-49ef-4291-a14d-940a6215347c-kube-api-access-d7cd4\") pod \"packageserver-d55dfcdfc-r6qfk\" (UID: \"d62a0f2d-49ef-4291-a14d-940a6215347c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r6qfk" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.032114 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b5e11824-a189-4e46-9548-641ef4dbe7fe-signing-cabundle\") pod \"service-ca-9c57cc56f-wg8jg\" (UID: \"b5e11824-a189-4e46-9548-641ef4dbe7fe\") " pod="openshift-service-ca/service-ca-9c57cc56f-wg8jg" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.032153 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsk6x\" (UniqueName: \"kubernetes.io/projected/dca95b5e-6308-41f2-a01d-9cb5c15b8607-kube-api-access-fsk6x\") pod \"catalog-operator-68c6474976-dsg9n\" (UID: \"dca95b5e-6308-41f2-a01d-9cb5c15b8607\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dsg9n" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.032176 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw7cv\" (UniqueName: \"kubernetes.io/projected/f80324fe-a4f1-44ba-a946-acade1dcb4e0-kube-api-access-tw7cv\") pod \"multus-admission-controller-857f4d67dd-dtnh6\" (UID: \"f80324fe-a4f1-44ba-a946-acade1dcb4e0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dtnh6" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.032220 4669 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9f1ba963-c886-46de-9428-a18cc4b13eeb-metrics-tls\") pod \"ingress-operator-5b745b69d9-f8wcf\" (UID: \"9f1ba963-c886-46de-9428-a18cc4b13eeb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f8wcf" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.032241 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9f1ba963-c886-46de-9428-a18cc4b13eeb-bound-sa-token\") pod \"ingress-operator-5b745b69d9-f8wcf\" (UID: \"9f1ba963-c886-46de-9428-a18cc4b13eeb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f8wcf" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.032262 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8374c82a-8339-42ff-b216-9ffc87a4d710-config\") pod \"etcd-operator-b45778765-4nbtf\" (UID: \"8374c82a-8339-42ff-b216-9ffc87a4d710\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4nbtf" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.032285 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8f55\" (UniqueName: \"kubernetes.io/projected/6364f6d6-6f56-43a4-af6c-9417865f326e-kube-api-access-p8f55\") pod \"csi-hostpathplugin-tqhcx\" (UID: \"6364f6d6-6f56-43a4-af6c-9417865f326e\") " pod="hostpath-provisioner/csi-hostpathplugin-tqhcx" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.032316 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/dca95b5e-6308-41f2-a01d-9cb5c15b8607-profile-collector-cert\") pod \"catalog-operator-68c6474976-dsg9n\" (UID: \"dca95b5e-6308-41f2-a01d-9cb5c15b8607\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dsg9n" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.032348 4669 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f80324fe-a4f1-44ba-a946-acade1dcb4e0-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-dtnh6\" (UID: \"f80324fe-a4f1-44ba-a946-acade1dcb4e0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dtnh6" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.032378 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spk62\" (UniqueName: \"kubernetes.io/projected/a78d1dcb-341a-40b0-a96d-9ee5af65a2fe-kube-api-access-spk62\") pod \"collect-profiles-29332605-xxrs4\" (UID: \"a78d1dcb-341a-40b0-a96d-9ee5af65a2fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332605-xxrs4" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.032405 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e2fa8fe7-d057-4386-9121-f4c838bd1a76-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-tsh5h\" (UID: \"e2fa8fe7-d057-4386-9121-f4c838bd1a76\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tsh5h" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.032428 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ce159705-9661-4510-a5b8-9e7ac58e524c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4knb8\" (UID: \"ce159705-9661-4510-a5b8-9e7ac58e524c\") " pod="openshift-marketplace/marketplace-operator-79b997595-4knb8" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.032481 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fh69r\" (UniqueName: \"kubernetes.io/projected/ce159705-9661-4510-a5b8-9e7ac58e524c-kube-api-access-fh69r\") pod 
\"marketplace-operator-79b997595-4knb8\" (UID: \"ce159705-9661-4510-a5b8-9e7ac58e524c\") " pod="openshift-marketplace/marketplace-operator-79b997595-4knb8" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.032544 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d62a0f2d-49ef-4291-a14d-940a6215347c-tmpfs\") pod \"packageserver-d55dfcdfc-r6qfk\" (UID: \"d62a0f2d-49ef-4291-a14d-940a6215347c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r6qfk" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.032753 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8374c82a-8339-42ff-b216-9ffc87a4d710-etcd-ca\") pod \"etcd-operator-b45778765-4nbtf\" (UID: \"8374c82a-8339-42ff-b216-9ffc87a4d710\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4nbtf" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.033115 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9f1ba963-c886-46de-9428-a18cc4b13eeb-trusted-ca\") pod \"ingress-operator-5b745b69d9-f8wcf\" (UID: \"9f1ba963-c886-46de-9428-a18cc4b13eeb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f8wcf" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.033465 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6364f6d6-6f56-43a4-af6c-9417865f326e-socket-dir\") pod \"csi-hostpathplugin-tqhcx\" (UID: \"6364f6d6-6f56-43a4-af6c-9417865f326e\") " pod="hostpath-provisioner/csi-hostpathplugin-tqhcx" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.033746 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86c26994-b813-4c6e-af79-4c35425bc260-config\") pod \"service-ca-operator-777779d784-wn4pf\" (UID: 
\"86c26994-b813-4c6e-af79-4c35425bc260\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wn4pf" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.034204 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ce159705-9661-4510-a5b8-9e7ac58e524c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4knb8\" (UID: \"ce159705-9661-4510-a5b8-9e7ac58e524c\") " pod="openshift-marketplace/marketplace-operator-79b997595-4knb8" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.034569 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/dca95b5e-6308-41f2-a01d-9cb5c15b8607-srv-cert\") pod \"catalog-operator-68c6474976-dsg9n\" (UID: \"dca95b5e-6308-41f2-a01d-9cb5c15b8607\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dsg9n" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.034578 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8374c82a-8339-42ff-b216-9ffc87a4d710-etcd-client\") pod \"etcd-operator-b45778765-4nbtf\" (UID: \"8374c82a-8339-42ff-b216-9ffc87a4d710\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4nbtf" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.035019 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7nd8\" (UniqueName: \"kubernetes.io/projected/3434800b-823a-4050-863e-1d2240a29709-kube-api-access-d7nd8\") pod \"cluster-image-registry-operator-dc59b4c8b-2k2t4\" (UID: \"3434800b-823a-4050-863e-1d2240a29709\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2k2t4" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.036340 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/e2fa8fe7-d057-4386-9121-f4c838bd1a76-proxy-tls\") pod \"machine-config-controller-84d6567774-tsh5h\" (UID: \"e2fa8fe7-d057-4386-9121-f4c838bd1a76\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tsh5h" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.036539 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a78d1dcb-341a-40b0-a96d-9ee5af65a2fe-config-volume\") pod \"collect-profiles-29332605-xxrs4\" (UID: \"a78d1dcb-341a-40b0-a96d-9ee5af65a2fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332605-xxrs4" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.036584 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9bcf6b7f-1936-4261-a1ee-907951cb68f3-profile-collector-cert\") pod \"olm-operator-6b444d44fb-449s9\" (UID: \"9bcf6b7f-1936-4261-a1ee-907951cb68f3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-449s9" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.036913 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/318f04b7-d7a9-4e8f-9d0f-03699e258019-cert\") pod \"ingress-canary-kc49q\" (UID: \"318f04b7-d7a9-4e8f-9d0f-03699e258019\") " pod="openshift-ingress-canary/ingress-canary-kc49q" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.037048 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/6364f6d6-6f56-43a4-af6c-9417865f326e-csi-data-dir\") pod \"csi-hostpathplugin-tqhcx\" (UID: \"6364f6d6-6f56-43a4-af6c-9417865f326e\") " pod="hostpath-provisioner/csi-hostpathplugin-tqhcx" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.036927 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/a78d1dcb-341a-40b0-a96d-9ee5af65a2fe-secret-volume\") pod \"collect-profiles-29332605-xxrs4\" (UID: \"a78d1dcb-341a-40b0-a96d-9ee5af65a2fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332605-xxrs4" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.037626 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d0da4600-abf4-4f3e-8299-a269b29ca44a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-9vmlr\" (UID: \"d0da4600-abf4-4f3e-8299-a269b29ca44a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9vmlr" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.037677 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9f1ba963-c886-46de-9428-a18cc4b13eeb-metrics-tls\") pod \"ingress-operator-5b745b69d9-f8wcf\" (UID: \"9f1ba963-c886-46de-9428-a18cc4b13eeb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f8wcf" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.038219 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e2fa8fe7-d057-4386-9121-f4c838bd1a76-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-tsh5h\" (UID: \"e2fa8fe7-d057-4386-9121-f4c838bd1a76\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tsh5h" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.039740 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8374c82a-8339-42ff-b216-9ffc87a4d710-config\") pod \"etcd-operator-b45778765-4nbtf\" (UID: \"8374c82a-8339-42ff-b216-9ffc87a4d710\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4nbtf" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 
20:46:54.040131 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/dca95b5e-6308-41f2-a01d-9cb5c15b8607-profile-collector-cert\") pod \"catalog-operator-68c6474976-dsg9n\" (UID: \"dca95b5e-6308-41f2-a01d-9cb5c15b8607\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dsg9n" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.040144 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/4bbf025a-0f57-480a-80ec-4e21b1ef4e69-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-4485l\" (UID: \"4bbf025a-0f57-480a-80ec-4e21b1ef4e69\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4485l" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.040499 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86c26994-b813-4c6e-af79-4c35425bc260-serving-cert\") pod \"service-ca-operator-777779d784-wn4pf\" (UID: \"86c26994-b813-4c6e-af79-4c35425bc260\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wn4pf" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.040590 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ce159705-9661-4510-a5b8-9e7ac58e524c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4knb8\" (UID: \"ce159705-9661-4510-a5b8-9e7ac58e524c\") " pod="openshift-marketplace/marketplace-operator-79b997595-4knb8" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.040612 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f80324fe-a4f1-44ba-a946-acade1dcb4e0-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-dtnh6\" (UID: 
\"f80324fe-a4f1-44ba-a946-acade1dcb4e0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dtnh6" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.040799 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.040567 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/fabcc99d-46ca-4172-918e-b5038c4a001a-certs\") pod \"machine-config-server-9fcxn\" (UID: \"fabcc99d-46ca-4172-918e-b5038c4a001a\") " pod="openshift-machine-config-operator/machine-config-server-9fcxn" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.041238 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9bcf6b7f-1936-4261-a1ee-907951cb68f3-srv-cert\") pod \"olm-operator-6b444d44fb-449s9\" (UID: \"9bcf6b7f-1936-4261-a1ee-907951cb68f3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-449s9" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.041760 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b5e11824-a189-4e46-9548-641ef4dbe7fe-signing-cabundle\") pod \"service-ca-9c57cc56f-wg8jg\" (UID: \"b5e11824-a189-4e46-9548-641ef4dbe7fe\") " pod="openshift-service-ca/service-ca-9c57cc56f-wg8jg" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.042042 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d62a0f2d-49ef-4291-a14d-940a6215347c-apiservice-cert\") pod \"packageserver-d55dfcdfc-r6qfk\" (UID: \"d62a0f2d-49ef-4291-a14d-940a6215347c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r6qfk" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.042607 4669 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d62a0f2d-49ef-4291-a14d-940a6215347c-webhook-cert\") pod \"packageserver-d55dfcdfc-r6qfk\" (UID: \"d62a0f2d-49ef-4291-a14d-940a6215347c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r6qfk" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.042820 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8374c82a-8339-42ff-b216-9ffc87a4d710-serving-cert\") pod \"etcd-operator-b45778765-4nbtf\" (UID: \"8374c82a-8339-42ff-b216-9ffc87a4d710\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4nbtf" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.042917 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/fabcc99d-46ca-4172-918e-b5038c4a001a-node-bootstrap-token\") pod \"machine-config-server-9fcxn\" (UID: \"fabcc99d-46ca-4172-918e-b5038c4a001a\") " pod="openshift-machine-config-operator/machine-config-server-9fcxn" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.043137 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b5e11824-a189-4e46-9548-641ef4dbe7fe-signing-key\") pod \"service-ca-9c57cc56f-wg8jg\" (UID: \"b5e11824-a189-4e46-9548-641ef4dbe7fe\") " pod="openshift-service-ca/service-ca-9c57cc56f-wg8jg" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.073582 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8krmk\" (UniqueName: \"kubernetes.io/projected/4adb69f9-4f03-46c0-bd24-fec72c2b1fd9-kube-api-access-8krmk\") pod \"console-f9d7485db-4m5k5\" (UID: \"4adb69f9-4f03-46c0-bd24-fec72c2b1fd9\") " pod="openshift-console/console-f9d7485db-4m5k5" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.091111 4669 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-ldsmv\" (UniqueName: \"kubernetes.io/projected/3b104989-912e-43b1-a295-2ea8eb157e77-kube-api-access-ldsmv\") pod \"downloads-7954f5f757-z99h8\" (UID: \"3b104989-912e-43b1-a295-2ea8eb157e77\") " pod="openshift-console/downloads-7954f5f757-z99h8" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.095783 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4rzvg" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.111494 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-4m5k5" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.114907 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6625254f-fe58-44a5-b467-9ed7544a7902-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-m54gb\" (UID: \"6625254f-fe58-44a5-b467-9ed7544a7902\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m54gb" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.133117 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.133312 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8rt7\" (UniqueName: \"kubernetes.io/projected/533e765f-918a-456e-8803-dd3e8495f20f-kube-api-access-k8rt7\") pod \"machine-approver-56656f9798-xjlqv\" (UID: \"533e765f-918a-456e-8803-dd3e8495f20f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xjlqv" Oct 08 20:46:54 crc kubenswrapper[4669]: E1008 
20:46:54.133488 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:46:54.633446562 +0000 UTC m=+134.326257275 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.133938 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:46:54 crc kubenswrapper[4669]: E1008 20:46:54.134230 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:46:54.634218591 +0000 UTC m=+134.327029264 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.136250 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-z99h8" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.152997 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pxgj\" (UniqueName: \"kubernetes.io/projected/1d1c512c-76d2-4775-980b-47205a8e55e2-kube-api-access-8pxgj\") pod \"console-operator-58897d9998-ps5hq\" (UID: \"1d1c512c-76d2-4775-980b-47205a8e55e2\") " pod="openshift-console-operator/console-operator-58897d9998-ps5hq" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.155238 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2k2t4" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.161730 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.163284 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-25b8x" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.174850 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r2ncb" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.177129 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4xg9t" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.179265 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.185890 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.199928 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.218996 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.235647 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:46:54 crc kubenswrapper[4669]: E1008 20:46:54.235692 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:46:54.735650111 +0000 UTC m=+134.428460784 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.236321 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:46:54 crc kubenswrapper[4669]: E1008 20:46:54.236664 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:46:54.736649414 +0000 UTC m=+134.429460097 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.238752 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.244076 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m54gb" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.259680 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.294327 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4rzvg"] Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.324457 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/64c20dfc-09aa-4096-b7c1-7233d0a18a17-bound-sa-token\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.338746 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" 
(UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:46:54 crc kubenswrapper[4669]: E1008 20:46:54.339236 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:46:54.839221331 +0000 UTC m=+134.532032004 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.351411 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqr97\" (UniqueName: \"kubernetes.io/projected/3b21de9b-333f-49d6-84b7-a616e217a26d-kube-api-access-fqr97\") pod \"authentication-operator-69f744f599-zh2xc\" (UID: \"3b21de9b-333f-49d6-84b7-a616e217a26d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zh2xc" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.356652 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-4m5k5"] Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.363789 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4bvp\" (UniqueName: \"kubernetes.io/projected/efd51828-51ba-4723-bd28-306e48ce1a54-kube-api-access-k4bvp\") pod \"apiserver-7bbb656c7d-mv26p\" (UID: \"efd51828-51ba-4723-bd28-306e48ce1a54\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mv26p" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.375708 4669 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkn2p\" (UniqueName: \"kubernetes.io/projected/64c20dfc-09aa-4096-b7c1-7233d0a18a17-kube-api-access-rkn2p\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.375947 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xjlqv" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.387898 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-25b8x"] Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.395675 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2pxl\" (UniqueName: \"kubernetes.io/projected/0858a203-42e8-4108-a0bd-48ba190bf420-kube-api-access-z2pxl\") pod \"controller-manager-879f6c89f-dtqw9\" (UID: \"0858a203-42e8-4108-a0bd-48ba190bf420\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtqw9" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.414505 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wzxn\" (UniqueName: \"kubernetes.io/projected/d2628b9d-5e3f-412f-b3c7-f24e6a208577-kube-api-access-7wzxn\") pod \"dns-default-sgf44\" (UID: \"d2628b9d-5e3f-412f-b3c7-f24e6a208577\") " pod="openshift-dns/dns-default-sgf44" Oct 08 20:46:54 crc kubenswrapper[4669]: W1008 20:46:54.420126 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod533e765f_918a_456e_8803_dd3e8495f20f.slice/crio-1a6cd9b36e14161fa34a2b7bc40f4b602dbf0348f12dbb7955bc092e56d5744c WatchSource:0}: Error finding container 1a6cd9b36e14161fa34a2b7bc40f4b602dbf0348f12dbb7955bc092e56d5744c: Status 404 returned error can't find the 
container with id 1a6cd9b36e14161fa34a2b7bc40f4b602dbf0348f12dbb7955bc092e56d5744c Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.440712 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:46:54 crc kubenswrapper[4669]: E1008 20:46:54.441137 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:46:54.941122872 +0000 UTC m=+134.633933555 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.446825 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-ps5hq" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.456271 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ee97e8a2-071a-48f9-a13f-d66fb60bbb55-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lgt96\" (UID: \"ee97e8a2-071a-48f9-a13f-d66fb60bbb55\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lgt96" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.463190 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8xnc\" (UniqueName: \"kubernetes.io/projected/21147ccd-e3af-44e0-a7fd-2931e731cc53-kube-api-access-t8xnc\") pod \"machine-config-operator-74547568cd-67zzr\" (UID: \"21147ccd-e3af-44e0-a7fd-2931e731cc53\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-67zzr" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.464664 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-sgf44" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.472932 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s292x\" (UniqueName: \"kubernetes.io/projected/c6254786-5db9-423f-88ed-f42edefc70a8-kube-api-access-s292x\") pod \"dns-operator-744455d44c-xndc2\" (UID: \"c6254786-5db9-423f-88ed-f42edefc70a8\") " pod="openshift-dns-operator/dns-operator-744455d44c-xndc2" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.495910 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3ab0571a-8beb-446a-9f96-b9bc226d5b22-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-7b2xv\" (UID: \"3ab0571a-8beb-446a-9f96-b9bc226d5b22\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7b2xv" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.516955 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsstc\" (UniqueName: \"kubernetes.io/projected/7a558b91-227c-4e11-aa99-4406e545a2ea-kube-api-access-lsstc\") pod \"openshift-controller-manager-operator-756b6f6bc6-sv8nt\" (UID: \"7a558b91-227c-4e11-aa99-4406e545a2ea\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sv8nt" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.538969 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8tjh\" (UniqueName: \"kubernetes.io/projected/e45ff91e-f07d-489b-b0a0-8e815bdf41c3-kube-api-access-s8tjh\") pod \"machine-api-operator-5694c8668f-rtm6r\" (UID: \"e45ff91e-f07d-489b-b0a0-8e815bdf41c3\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rtm6r" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.554093 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mv26p" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.554190 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:46:54 crc kubenswrapper[4669]: E1008 20:46:54.554623 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:46:55.054600964 +0000 UTC m=+134.747411637 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.558167 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6m6zm\" (UniqueName: \"kubernetes.io/projected/672eeff6-8079-4f9a-a61c-40094de694be-kube-api-access-6m6zm\") pod \"router-default-5444994796-db8rh\" (UID: \"672eeff6-8079-4f9a-a61c-40094de694be\") " pod="openshift-ingress/router-default-5444994796-db8rh" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.564849 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-db8rh" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.576974 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7b2xv" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.579379 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9tqq\" (UniqueName: \"kubernetes.io/projected/96a22bbb-2450-42e2-8c27-8246a89fbb6e-kube-api-access-q9tqq\") pod \"openshift-config-operator-7777fb866f-74lcm\" (UID: \"96a22bbb-2450-42e2-8c27-8246a89fbb6e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-74lcm" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.583781 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4xg9t"] Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.583790 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-rtm6r" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.590455 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sv8nt" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.593137 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blrlq\" (UniqueName: \"kubernetes.io/projected/9b77bb83-431f-4431-93cc-04be87ab96dc-kube-api-access-blrlq\") pod \"kube-storage-version-migrator-operator-b67b599dd-lwdg7\" (UID: \"9b77bb83-431f-4431-93cc-04be87ab96dc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lwdg7" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.602953 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-dtqw9" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.622695 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t224m\" (UniqueName: \"kubernetes.io/projected/8374c82a-8339-42ff-b216-9ffc87a4d710-kube-api-access-t224m\") pod \"etcd-operator-b45778765-4nbtf\" (UID: \"8374c82a-8339-42ff-b216-9ffc87a4d710\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4nbtf" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.634044 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m54gb"] Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.647517 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmck4\" (UniqueName: \"kubernetes.io/projected/318f04b7-d7a9-4e8f-9d0f-03699e258019-kube-api-access-bmck4\") pod \"ingress-canary-kc49q\" (UID: \"318f04b7-d7a9-4e8f-9d0f-03699e258019\") " pod="openshift-ingress-canary/ingress-canary-kc49q" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.647740 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-zh2xc" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.655630 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:46:54 crc kubenswrapper[4669]: E1008 20:46:54.655977 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:46:55.155964103 +0000 UTC m=+134.848774776 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.661340 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65wxt\" (UniqueName: \"kubernetes.io/projected/e2fa8fe7-d057-4386-9121-f4c838bd1a76-kube-api-access-65wxt\") pod \"machine-config-controller-84d6567774-tsh5h\" (UID: \"e2fa8fe7-d057-4386-9121-f4c838bd1a76\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tsh5h" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.661425 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lwdg7" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.668156 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lgt96" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.674931 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tsh5h" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.677165 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-sgf44"] Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.680733 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghbg9\" (UniqueName: \"kubernetes.io/projected/86c26994-b813-4c6e-af79-4c35425bc260-kube-api-access-ghbg9\") pod \"service-ca-operator-777779d784-wn4pf\" (UID: \"86c26994-b813-4c6e-af79-4c35425bc260\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wn4pf" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.693832 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6l6tw"] Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.702042 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-xndc2" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.703765 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc5pc\" (UniqueName: \"kubernetes.io/projected/9bcf6b7f-1936-4261-a1ee-907951cb68f3-kube-api-access-fc5pc\") pod \"olm-operator-6b444d44fb-449s9\" (UID: \"9bcf6b7f-1936-4261-a1ee-907951cb68f3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-449s9" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.705646 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-z99h8"] Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.706967 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2k2t4"] Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.716369 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-67zzr" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.720637 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-74lcm" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.729722 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jbs5\" (UniqueName: \"kubernetes.io/projected/b5e11824-a189-4e46-9548-641ef4dbe7fe-kube-api-access-9jbs5\") pod \"service-ca-9c57cc56f-wg8jg\" (UID: \"b5e11824-a189-4e46-9548-641ef4dbe7fe\") " pod="openshift-service-ca/service-ca-9c57cc56f-wg8jg" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.735371 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wn4pf" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.738741 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sghwc\" (UniqueName: \"kubernetes.io/projected/d0da4600-abf4-4f3e-8299-a269b29ca44a-kube-api-access-sghwc\") pod \"control-plane-machine-set-operator-78cbb6b69f-9vmlr\" (UID: \"d0da4600-abf4-4f3e-8299-a269b29ca44a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9vmlr" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.751242 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kc49q" Oct 08 20:46:54 crc kubenswrapper[4669]: W1008 20:46:54.753661 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5289f930_ba47_4745_8ab7_784863dc110e.slice/crio-1b13e590978d685b4a57d0fa51003545e7f72a614240ecc6ba07cee91c4d21c7 WatchSource:0}: Error finding container 1b13e590978d685b4a57d0fa51003545e7f72a614240ecc6ba07cee91c4d21c7: Status 404 returned error can't find the container with id 1b13e590978d685b4a57d0fa51003545e7f72a614240ecc6ba07cee91c4d21c7 Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.757459 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:46:54 crc kubenswrapper[4669]: E1008 20:46:54.757943 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-08 20:46:55.257927315 +0000 UTC m=+134.950737988 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.759465 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hcsl\" (UniqueName: \"kubernetes.io/projected/4bbf025a-0f57-480a-80ec-4e21b1ef4e69-kube-api-access-6hcsl\") pod \"package-server-manager-789f6589d5-4485l\" (UID: \"4bbf025a-0f57-480a-80ec-4e21b1ef4e69\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4485l" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.762828 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-ps5hq"] Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.781360 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh69r\" (UniqueName: \"kubernetes.io/projected/ce159705-9661-4510-a5b8-9e7ac58e524c-kube-api-access-fh69r\") pod \"marketplace-operator-79b997595-4knb8\" (UID: \"ce159705-9661-4510-a5b8-9e7ac58e524c\") " pod="openshift-marketplace/marketplace-operator-79b997595-4knb8" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.785712 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r2ncb"] Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.795685 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrdtr\" (UniqueName: 
\"kubernetes.io/projected/9f1ba963-c886-46de-9428-a18cc4b13eeb-kube-api-access-xrdtr\") pod \"ingress-operator-5b745b69d9-f8wcf\" (UID: \"9f1ba963-c886-46de-9428-a18cc4b13eeb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f8wcf" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.806938 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-mv26p"] Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.822733 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92lf5\" (UniqueName: \"kubernetes.io/projected/fabcc99d-46ca-4172-918e-b5038c4a001a-kube-api-access-92lf5\") pod \"machine-config-server-9fcxn\" (UID: \"fabcc99d-46ca-4172-918e-b5038c4a001a\") " pod="openshift-machine-config-operator/machine-config-server-9fcxn" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.837425 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsk6x\" (UniqueName: \"kubernetes.io/projected/dca95b5e-6308-41f2-a01d-9cb5c15b8607-kube-api-access-fsk6x\") pod \"catalog-operator-68c6474976-dsg9n\" (UID: \"dca95b5e-6308-41f2-a01d-9cb5c15b8607\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dsg9n" Oct 08 20:46:54 crc kubenswrapper[4669]: W1008 20:46:54.850304 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d1c512c_76d2_4775_980b_47205a8e55e2.slice/crio-b1ff0765bb555f021fd80c707556ce2f48a27a7a852b3676a1f5856ef6c6964e WatchSource:0}: Error finding container b1ff0765bb555f021fd80c707556ce2f48a27a7a852b3676a1f5856ef6c6964e: Status 404 returned error can't find the container with id b1ff0765bb555f021fd80c707556ce2f48a27a7a852b3676a1f5856ef6c6964e Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.856621 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-4nbtf" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.856855 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8f55\" (UniqueName: \"kubernetes.io/projected/6364f6d6-6f56-43a4-af6c-9417865f326e-kube-api-access-p8f55\") pod \"csi-hostpathplugin-tqhcx\" (UID: \"6364f6d6-6f56-43a4-af6c-9417865f326e\") " pod="hostpath-provisioner/csi-hostpathplugin-tqhcx" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.858997 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:46:54 crc kubenswrapper[4669]: E1008 20:46:54.859337 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:46:55.359322625 +0000 UTC m=+135.052133298 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.870094 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4485l" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.875347 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw7cv\" (UniqueName: \"kubernetes.io/projected/f80324fe-a4f1-44ba-a946-acade1dcb4e0-kube-api-access-tw7cv\") pod \"multus-admission-controller-857f4d67dd-dtnh6\" (UID: \"f80324fe-a4f1-44ba-a946-acade1dcb4e0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dtnh6" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.896894 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-449s9" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.905899 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dsg9n" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.907695 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spk62\" (UniqueName: \"kubernetes.io/projected/a78d1dcb-341a-40b0-a96d-9ee5af65a2fe-kube-api-access-spk62\") pod \"collect-profiles-29332605-xxrs4\" (UID: \"a78d1dcb-341a-40b0-a96d-9ee5af65a2fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332605-xxrs4" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.915871 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4knb8" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.916484 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9f1ba963-c886-46de-9428-a18cc4b13eeb-bound-sa-token\") pod \"ingress-operator-5b745b69d9-f8wcf\" (UID: \"9f1ba963-c886-46de-9428-a18cc4b13eeb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f8wcf" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.934481 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r2hq\" (UniqueName: \"kubernetes.io/projected/5dd3b6ce-7492-4273-8d1b-86dc10a03404-kube-api-access-7r2hq\") pod \"migrator-59844c95c7-r7zkf\" (UID: \"5dd3b6ce-7492-4273-8d1b-86dc10a03404\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-r7zkf" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.936864 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332605-xxrs4" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.952547 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sv8nt"] Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.954183 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-tqhcx" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.955856 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7cd4\" (UniqueName: \"kubernetes.io/projected/d62a0f2d-49ef-4291-a14d-940a6215347c-kube-api-access-d7cd4\") pod \"packageserver-d55dfcdfc-r6qfk\" (UID: \"d62a0f2d-49ef-4291-a14d-940a6215347c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r6qfk" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.961349 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:46:54 crc kubenswrapper[4669]: E1008 20:46:54.961506 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:46:55.461480412 +0000 UTC m=+135.154291085 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.961644 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:46:54 crc kubenswrapper[4669]: E1008 20:46:54.962599 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:46:55.462581457 +0000 UTC m=+135.155392130 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.967696 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-zh2xc"] Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.982835 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-r7zkf" Oct 08 20:46:54 crc kubenswrapper[4669]: I1008 20:46:54.994699 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-wg8jg" Oct 08 20:46:55 crc kubenswrapper[4669]: I1008 20:46:55.008831 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-dtnh6" Oct 08 20:46:55 crc kubenswrapper[4669]: I1008 20:46:55.026111 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9vmlr" Oct 08 20:46:55 crc kubenswrapper[4669]: I1008 20:46:55.027413 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-rtm6r"] Oct 08 20:46:55 crc kubenswrapper[4669]: I1008 20:46:55.032926 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4rzvg" event={"ID":"01878aca-a23f-4add-bedf-94cc3b75b481","Type":"ContainerStarted","Data":"a8bf932bbfe22df28b4711bb97feedd530a1b86bd27c8d3c1c1458ffdcd6fddb"} Oct 08 20:46:55 crc kubenswrapper[4669]: I1008 20:46:55.032966 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4rzvg" event={"ID":"01878aca-a23f-4add-bedf-94cc3b75b481","Type":"ContainerStarted","Data":"b6fe71c590372b5ff3f16378560c6ec2d4e61d3a8a13019df3d624f0db32c00d"} Oct 08 20:46:55 crc kubenswrapper[4669]: I1008 20:46:55.037026 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xjlqv" event={"ID":"533e765f-918a-456e-8803-dd3e8495f20f","Type":"ContainerStarted","Data":"1a6cd9b36e14161fa34a2b7bc40f4b602dbf0348f12dbb7955bc092e56d5744c"} Oct 08 20:46:55 crc kubenswrapper[4669]: I1008 20:46:55.042811 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f8wcf" Oct 08 20:46:55 crc kubenswrapper[4669]: I1008 20:46:55.048146 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-4m5k5" event={"ID":"4adb69f9-4f03-46c0-bd24-fec72c2b1fd9","Type":"ContainerStarted","Data":"4c312135f9d7149d696c5d46d27f12498d71672f5d5fdc44eea3ba35a3284272"} Oct 08 20:46:55 crc kubenswrapper[4669]: I1008 20:46:55.048198 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-4m5k5" event={"ID":"4adb69f9-4f03-46c0-bd24-fec72c2b1fd9","Type":"ContainerStarted","Data":"876acf8b4b48298878be1a417e99d2ff1ad43768c47dbd52266802959c086994"} Oct 08 20:46:55 crc kubenswrapper[4669]: I1008 20:46:55.049447 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dtqw9"] Oct 08 20:46:55 crc kubenswrapper[4669]: I1008 20:46:55.050918 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7b2xv"] Oct 08 20:46:55 crc kubenswrapper[4669]: I1008 20:46:55.051775 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-db8rh" event={"ID":"672eeff6-8079-4f9a-a61c-40094de694be","Type":"ContainerStarted","Data":"04715b653b5dba00c52b871e73e67697d3f08211290e4e6fb7b23230352f8d7b"} Oct 08 20:46:55 crc kubenswrapper[4669]: I1008 20:46:55.062783 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m54gb" event={"ID":"6625254f-fe58-44a5-b467-9ed7544a7902","Type":"ContainerStarted","Data":"eaa9e20d882f75da2ea2155527a2fca102c79165b0bcb850af4a77f0c6b0ffbd"} Oct 08 20:46:55 crc kubenswrapper[4669]: I1008 20:46:55.064461 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-25b8x" 
event={"ID":"e8b346cb-e447-4473-a94d-a66882c3af6f","Type":"ContainerStarted","Data":"14eb61b0b0362a0e529608aff1bec596ccc1fb423fb5325d1e2ff048dadd5669"} Oct 08 20:46:55 crc kubenswrapper[4669]: I1008 20:46:55.065353 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2k2t4" event={"ID":"3434800b-823a-4050-863e-1d2240a29709","Type":"ContainerStarted","Data":"3e752a03d4f6f57c3b8a3299fc47458fb5d8501168c1f685953a4f9d20bfc547"} Oct 08 20:46:55 crc kubenswrapper[4669]: I1008 20:46:55.068498 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:46:55 crc kubenswrapper[4669]: E1008 20:46:55.068857 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:46:55.568838531 +0000 UTC m=+135.261649194 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:55 crc kubenswrapper[4669]: I1008 20:46:55.069050 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:46:55 crc kubenswrapper[4669]: I1008 20:46:55.069543 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4xg9t" event={"ID":"2f8ab351-ab50-4289-8d0f-aae3ade74644","Type":"ContainerStarted","Data":"9385c8489f9a56f4bcc4d4297cc60610f9749cb657ec6388b553392f3215dc46"} Oct 08 20:46:55 crc kubenswrapper[4669]: I1008 20:46:55.069573 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4xg9t" event={"ID":"2f8ab351-ab50-4289-8d0f-aae3ade74644","Type":"ContainerStarted","Data":"48c0792bc5aa5f2942c84f69ac8e7fd1856442b2ccda97ef080ce73dcc2cda33"} Oct 08 20:46:55 crc kubenswrapper[4669]: E1008 20:46:55.069697 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:46:55.569681461 +0000 UTC m=+135.262492134 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:55 crc kubenswrapper[4669]: I1008 20:46:55.069798 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4xg9t" Oct 08 20:46:55 crc kubenswrapper[4669]: I1008 20:46:55.071362 4669 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-4xg9t container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Oct 08 20:46:55 crc kubenswrapper[4669]: I1008 20:46:55.071408 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4xg9t" podUID="2f8ab351-ab50-4289-8d0f-aae3ade74644" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" Oct 08 20:46:55 crc kubenswrapper[4669]: I1008 20:46:55.075180 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-z99h8" event={"ID":"3b104989-912e-43b1-a295-2ea8eb157e77","Type":"ContainerStarted","Data":"595a75aa59bd3fcd634c3b56a17aed8204cf36b9acb0bd293b4b85e6ce470206"} Oct 08 20:46:55 crc kubenswrapper[4669]: I1008 20:46:55.075857 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-9fcxn" Oct 08 20:46:55 crc kubenswrapper[4669]: I1008 20:46:55.078575 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-ps5hq" event={"ID":"1d1c512c-76d2-4775-980b-47205a8e55e2","Type":"ContainerStarted","Data":"b1ff0765bb555f021fd80c707556ce2f48a27a7a852b3676a1f5856ef6c6964e"} Oct 08 20:46:55 crc kubenswrapper[4669]: I1008 20:46:55.083713 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-sgf44" event={"ID":"d2628b9d-5e3f-412f-b3c7-f24e6a208577","Type":"ContainerStarted","Data":"5df04179983344457253a2f7659cef00f28b76f538ca5b2efd0f6145d188c9da"} Oct 08 20:46:55 crc kubenswrapper[4669]: I1008 20:46:55.095453 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw" event={"ID":"5289f930-ba47-4745-8ab7-784863dc110e","Type":"ContainerStarted","Data":"1b13e590978d685b4a57d0fa51003545e7f72a614240ecc6ba07cee91c4d21c7"} Oct 08 20:46:55 crc kubenswrapper[4669]: I1008 20:46:55.097914 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mv26p" event={"ID":"efd51828-51ba-4723-bd28-306e48ce1a54","Type":"ContainerStarted","Data":"2a58affbcee8b1bbd9e3658007e808c64c36ee197c45223ebab40ce54d6d6c2a"} Oct 08 20:46:55 crc kubenswrapper[4669]: I1008 20:46:55.169605 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:46:55 crc kubenswrapper[4669]: E1008 20:46:55.169801 4669 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:46:55.669780379 +0000 UTC m=+135.362591052 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:55 crc kubenswrapper[4669]: I1008 20:46:55.170150 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:46:55 crc kubenswrapper[4669]: E1008 20:46:55.172457 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:46:55.672443562 +0000 UTC m=+135.365254235 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:55 crc kubenswrapper[4669]: W1008 20:46:55.208282 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode45ff91e_f07d_489b_b0a0_8e815bdf41c3.slice/crio-9dc6a9f9ed417c807754320bcadf6ba04c4044ac85637824dfc55d1916c778f6 WatchSource:0}: Error finding container 9dc6a9f9ed417c807754320bcadf6ba04c4044ac85637824dfc55d1916c778f6: Status 404 returned error can't find the container with id 9dc6a9f9ed417c807754320bcadf6ba04c4044ac85637824dfc55d1916c778f6 Oct 08 20:46:55 crc kubenswrapper[4669]: I1008 20:46:55.216019 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lgt96"] Oct 08 20:46:55 crc kubenswrapper[4669]: I1008 20:46:55.220608 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r6qfk" Oct 08 20:46:55 crc kubenswrapper[4669]: I1008 20:46:55.271608 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:46:55 crc kubenswrapper[4669]: E1008 20:46:55.272080 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:46:55.77206535 +0000 UTC m=+135.464876023 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:55 crc kubenswrapper[4669]: I1008 20:46:55.301655 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lwdg7"] Oct 08 20:46:55 crc kubenswrapper[4669]: I1008 20:46:55.374748 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:46:55 crc kubenswrapper[4669]: E1008 20:46:55.375810 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:46:55.875144129 +0000 UTC m=+135.567954792 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:55 crc kubenswrapper[4669]: I1008 20:46:55.385979 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-tsh5h"] Oct 08 20:46:55 crc kubenswrapper[4669]: I1008 20:46:55.431938 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-449s9"] Oct 08 20:46:55 crc kubenswrapper[4669]: I1008 20:46:55.458213 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-wn4pf"] Oct 08 20:46:55 crc kubenswrapper[4669]: I1008 20:46:55.466801 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-xndc2"] Oct 08 20:46:55 crc kubenswrapper[4669]: W1008 20:46:55.470610 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b77bb83_431f_4431_93cc_04be87ab96dc.slice/crio-2f2bccc9e223f0d85e96c916f489e815cd2a1252eaa69a2ac62d49bdc7593430 WatchSource:0}: Error finding container 
2f2bccc9e223f0d85e96c916f489e815cd2a1252eaa69a2ac62d49bdc7593430: Status 404 returned error can't find the container with id 2f2bccc9e223f0d85e96c916f489e815cd2a1252eaa69a2ac62d49bdc7593430 Oct 08 20:46:55 crc kubenswrapper[4669]: I1008 20:46:55.475935 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:46:55 crc kubenswrapper[4669]: E1008 20:46:55.476310 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:46:55.976288763 +0000 UTC m=+135.669099436 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:55 crc kubenswrapper[4669]: I1008 20:46:55.508060 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-67zzr"] Oct 08 20:46:55 crc kubenswrapper[4669]: I1008 20:46:55.521824 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-74lcm"] Oct 08 20:46:55 crc kubenswrapper[4669]: I1008 20:46:55.529321 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-kc49q"] 
Oct 08 20:46:55 crc kubenswrapper[4669]: W1008 20:46:55.548377 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6254786_5db9_423f_88ed_f42edefc70a8.slice/crio-950b92e7aff722cd9390982178f6526ad2c05b869515d2dde8e5e16f5b985931 WatchSource:0}: Error finding container 950b92e7aff722cd9390982178f6526ad2c05b869515d2dde8e5e16f5b985931: Status 404 returned error can't find the container with id 950b92e7aff722cd9390982178f6526ad2c05b869515d2dde8e5e16f5b985931 Oct 08 20:46:55 crc kubenswrapper[4669]: I1008 20:46:55.578469 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:46:55 crc kubenswrapper[4669]: E1008 20:46:55.578853 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:46:56.078837628 +0000 UTC m=+135.771648361 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:55 crc kubenswrapper[4669]: I1008 20:46:55.614369 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4rzvg" podStartSLOduration=113.614349659 podStartE2EDuration="1m53.614349659s" podCreationTimestamp="2025-10-08 20:45:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:46:55.569437859 +0000 UTC m=+135.262248532" watchObservedRunningTime="2025-10-08 20:46:55.614349659 +0000 UTC m=+135.307160342" Oct 08 20:46:55 crc kubenswrapper[4669]: I1008 20:46:55.679434 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:46:55 crc kubenswrapper[4669]: E1008 20:46:55.679831 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:46:56.179810878 +0000 UTC m=+135.872621551 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:55 crc kubenswrapper[4669]: I1008 20:46:55.781419 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:46:55 crc kubenswrapper[4669]: E1008 20:46:55.781861 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:46:56.281847853 +0000 UTC m=+135.974658526 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:55 crc kubenswrapper[4669]: I1008 20:46:55.832034 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4485l"] Oct 08 20:46:55 crc kubenswrapper[4669]: I1008 20:46:55.883788 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:46:55 crc kubenswrapper[4669]: E1008 20:46:55.885132 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:46:56.385101026 +0000 UTC m=+136.077911699 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:55 crc kubenswrapper[4669]: I1008 20:46:55.985475 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:46:55 crc kubenswrapper[4669]: E1008 20:46:55.985743 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:46:56.485732338 +0000 UTC m=+136.178543011 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.002800 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-4nbtf"] Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.011655 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-tqhcx"] Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.068881 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dsg9n"] Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.083052 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332605-xxrs4"] Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.087453 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:46:56 crc kubenswrapper[4669]: E1008 20:46:56.087953 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:46:56.587934665 +0000 UTC m=+136.280745348 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:56 crc kubenswrapper[4669]: W1008 20:46:56.096305 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8374c82a_8339_42ff_b216_9ffc87a4d710.slice/crio-5cbdc2cad76a89c45eb9beb31764c4473c85a78fa3456520c225104becf2564b WatchSource:0}: Error finding container 5cbdc2cad76a89c45eb9beb31764c4473c85a78fa3456520c225104becf2564b: Status 404 returned error can't find the container with id 5cbdc2cad76a89c45eb9beb31764c4473c85a78fa3456520c225104becf2564b Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.163629 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xjlqv" event={"ID":"533e765f-918a-456e-8803-dd3e8495f20f","Type":"ContainerStarted","Data":"042d4d4c3e483104bfa0f10068be88ac5cc80e2e46401383420331b165f5f05a"} Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.167808 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-z99h8" event={"ID":"3b104989-912e-43b1-a295-2ea8eb157e77","Type":"ContainerStarted","Data":"b66e354ee30e936a386e3eb924a9fef6ed86602dc429ea193098cd8a5aa463f6"} Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.168112 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4knb8"] Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.168660 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-console/downloads-7954f5f757-z99h8" Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.176234 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4xg9t" podStartSLOduration=113.176219419 podStartE2EDuration="1m53.176219419s" podCreationTimestamp="2025-10-08 20:45:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:46:56.173882864 +0000 UTC m=+135.866693537" watchObservedRunningTime="2025-10-08 20:46:56.176219419 +0000 UTC m=+135.869030092" Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.177890 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-zh2xc" event={"ID":"3b21de9b-333f-49d6-84b7-a616e217a26d","Type":"ContainerStarted","Data":"5fe99c28e23261b6f3da0339c4a8083759088e3a4753b13d8bfa01cecd17ecf7"} Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.177951 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-zh2xc" event={"ID":"3b21de9b-333f-49d6-84b7-a616e217a26d","Type":"ContainerStarted","Data":"536800e720b438931b9f82948d062ec16fa56252dd238f9536d36b15f663b79f"} Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.185388 4669 generic.go:334] "Generic (PLEG): container finished" podID="e8b346cb-e447-4473-a94d-a66882c3af6f" containerID="58d6167c77085cd6ec775fbd63a9880ecb344cd55f35e910ef731731228c9d05" exitCode=0 Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.185471 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-25b8x" event={"ID":"e8b346cb-e447-4473-a94d-a66882c3af6f","Type":"ContainerDied","Data":"58d6167c77085cd6ec775fbd63a9880ecb344cd55f35e910ef731731228c9d05"} Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 
20:46:56.194084 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:46:56 crc kubenswrapper[4669]: E1008 20:46:56.196194 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:46:56.696178385 +0000 UTC m=+136.388989058 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.198074 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-kc49q" event={"ID":"318f04b7-d7a9-4e8f-9d0f-03699e258019","Type":"ContainerStarted","Data":"4662c7615c80211f4e877c1605d89a2a9abe8ea90f5eeedd14457d4b9b39614b"} Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.200662 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-r7zkf"] Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.201587 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-rtm6r" 
event={"ID":"e45ff91e-f07d-489b-b0a0-8e815bdf41c3","Type":"ContainerStarted","Data":"9dc6a9f9ed417c807754320bcadf6ba04c4044ac85637824dfc55d1916c778f6"} Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.205316 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-dtqw9" event={"ID":"0858a203-42e8-4108-a0bd-48ba190bf420","Type":"ContainerStarted","Data":"a100fb722eaa96761f6d3ec375efe2257b0f6fa9715c17d4be90cb48925c92ea"} Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.209245 4669 patch_prober.go:28] interesting pod/downloads-7954f5f757-z99h8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.209424 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-z99h8" podUID="3b104989-912e-43b1-a295-2ea8eb157e77" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.215027 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7b2xv" event={"ID":"3ab0571a-8beb-446a-9f96-b9bc226d5b22","Type":"ContainerStarted","Data":"d94b40d2d28c4f898b1bd2341842aca5ebe0b6cbdd7c1fdccb1b4e1afcaed16c"} Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.217786 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-74lcm" event={"ID":"96a22bbb-2450-42e2-8c27-8246a89fbb6e","Type":"ContainerStarted","Data":"8bc154f01314c499930188d96b37c928cbd49311d594277116d392dd8c97fa80"} Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.229696 4669 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tsh5h" event={"ID":"e2fa8fe7-d057-4386-9121-f4c838bd1a76","Type":"ContainerStarted","Data":"8d60e76469f9943a292a01d1399d38b088604e6bbe3fc85de96aae3fde6367c4"} Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.244162 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lgt96" event={"ID":"ee97e8a2-071a-48f9-a13f-d66fb60bbb55","Type":"ContainerStarted","Data":"b77fac4e1b4b4431fe7e74a93647a5c9cbb23c78cc284be6a3a58c8b6bccea04"} Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.259219 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4485l" event={"ID":"4bbf025a-0f57-480a-80ec-4e21b1ef4e69","Type":"ContainerStarted","Data":"59640273ff8a783190c4b1a58a571ecbe73281a308223c5cf4040c1a377e39f9"} Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.260849 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wn4pf" event={"ID":"86c26994-b813-4c6e-af79-4c35425bc260","Type":"ContainerStarted","Data":"4edbc44b695a3e83e7a83bbedad4c62b2585b861f06fe5223dbad0f051c133bf"} Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.270169 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-67zzr" event={"ID":"21147ccd-e3af-44e0-a7fd-2931e731cc53","Type":"ContainerStarted","Data":"79d8b193ae3c28d2a07c09f8c504c98f25a5b84dd7beeb7879d3be763afb8c90"} Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.299064 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-4m5k5" podStartSLOduration=114.299046638 podStartE2EDuration="1m54.299046638s" podCreationTimestamp="2025-10-08 20:45:02 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:46:56.296071729 +0000 UTC m=+135.988882412" watchObservedRunningTime="2025-10-08 20:46:56.299046638 +0000 UTC m=+135.991857311" Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.299301 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-wg8jg"] Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.299807 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:46:56 crc kubenswrapper[4669]: E1008 20:46:56.300381 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:46:56.800341259 +0000 UTC m=+136.493151932 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.321982 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2k2t4" event={"ID":"3434800b-823a-4050-863e-1d2240a29709","Type":"ContainerStarted","Data":"dca001d6d6e40e3ab24a6c74defdbf19500c72b6cf231958db746dad623c3b52"} Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.333485 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lwdg7" event={"ID":"9b77bb83-431f-4431-93cc-04be87ab96dc","Type":"ContainerStarted","Data":"2f2bccc9e223f0d85e96c916f489e815cd2a1252eaa69a2ac62d49bdc7593430"} Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.345407 4669 generic.go:334] "Generic (PLEG): container finished" podID="efd51828-51ba-4723-bd28-306e48ce1a54" containerID="d801bcaa26de6d38b6468c36c64e46f9ffe3b2d0f27d48d666a7368cc6007a14" exitCode=0 Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.345588 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mv26p" event={"ID":"efd51828-51ba-4723-bd28-306e48ce1a54","Type":"ContainerDied","Data":"d801bcaa26de6d38b6468c36c64e46f9ffe3b2d0f27d48d666a7368cc6007a14"} Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.350595 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-dtnh6"] Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 
20:46:56.351927 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m54gb" event={"ID":"6625254f-fe58-44a5-b467-9ed7544a7902","Type":"ContainerStarted","Data":"2d004a5445e23c1cdafe50d8039b009bc5b551b4ff3d21b17bdfc1893150003b"} Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.363765 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-f8wcf"] Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.366718 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r6qfk"] Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.369150 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9vmlr"] Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.402268 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:46:56 crc kubenswrapper[4669]: W1008 20:46:56.403147 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5e11824_a189_4e46_9548_641ef4dbe7fe.slice/crio-4bfc310abec3ab41569d56e0b6ee17decf9f526a3bee63aed61490ff52efb5df WatchSource:0}: Error finding container 4bfc310abec3ab41569d56e0b6ee17decf9f526a3bee63aed61490ff52efb5df: Status 404 returned error can't find the container with id 4bfc310abec3ab41569d56e0b6ee17decf9f526a3bee63aed61490ff52efb5df Oct 08 20:46:56 crc kubenswrapper[4669]: E1008 20:46:56.408257 4669 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:46:56.9082233 +0000 UTC m=+136.601033973 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.430453 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-sgf44" event={"ID":"d2628b9d-5e3f-412f-b3c7-f24e6a208577","Type":"ContainerStarted","Data":"a4877eb1c71f383d76ac05c9891420bb5ee5517906ed55b16a965f838600670b"} Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.442445 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-4nbtf" event={"ID":"8374c82a-8339-42ff-b216-9ffc87a4d710","Type":"ContainerStarted","Data":"5cbdc2cad76a89c45eb9beb31764c4473c85a78fa3456520c225104becf2564b"} Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.446126 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r2ncb" event={"ID":"b0dc8a08-0e25-4dba-9e96-089c46deb679","Type":"ContainerStarted","Data":"6d7af1d0d744c5fce5e1e1b20cf11ecff189be2b75bba551070c3b208adb5d98"} Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.446172 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r2ncb" 
event={"ID":"b0dc8a08-0e25-4dba-9e96-089c46deb679","Type":"ContainerStarted","Data":"3e7211e1f01d333d37e6bef0890d177f04c6cf50d84b3c5698ab4fe2a58960ee"} Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.478462 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-xndc2" event={"ID":"c6254786-5db9-423f-88ed-f42edefc70a8","Type":"ContainerStarted","Data":"950b92e7aff722cd9390982178f6526ad2c05b869515d2dde8e5e16f5b985931"} Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.481032 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-db8rh" event={"ID":"672eeff6-8079-4f9a-a61c-40094de694be","Type":"ContainerStarted","Data":"ba2c268063d074fa3d48aff492fe3b3899e03d3829d7e866b5bc53dfe904e339"} Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.483247 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sv8nt" event={"ID":"7a558b91-227c-4e11-aa99-4406e545a2ea","Type":"ContainerStarted","Data":"53a96e1f50eebb911f3f690d0b136d9f334498233b26943a7067122b81138dc0"} Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.489926 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-ps5hq" event={"ID":"1d1c512c-76d2-4775-980b-47205a8e55e2","Type":"ContainerStarted","Data":"9205f1e49841bdbd03871c6d3375be5307e94cff383f7c4ab569f5d93ced1311"} Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.490355 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-ps5hq" Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.507513 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:46:56 crc kubenswrapper[4669]: E1008 20:46:56.507893 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:46:57.007872269 +0000 UTC m=+136.700682942 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.508044 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:46:56 crc kubenswrapper[4669]: E1008 20:46:56.509855 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:46:57.009846675 +0000 UTC m=+136.702657348 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.511370 4669 patch_prober.go:28] interesting pod/console-operator-58897d9998-ps5hq container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.511420 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-ps5hq" podUID="1d1c512c-76d2-4775-980b-47205a8e55e2" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.518309 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-9fcxn" event={"ID":"fabcc99d-46ca-4172-918e-b5038c4a001a","Type":"ContainerStarted","Data":"738a5327b53da9899e8b97dbea0eb722e84ab97e622c8271671483644c240bf4"} Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.535560 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-449s9" event={"ID":"9bcf6b7f-1936-4261-a1ee-907951cb68f3","Type":"ContainerStarted","Data":"e872f098d0ecf3029c3084a3586bde9f99dc7931c21d2682e0a49d2691e650ae"} Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.537458 4669 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-449s9" Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.541402 4669 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-449s9 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body= Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.541475 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-449s9" podUID="9bcf6b7f-1936-4261-a1ee-907951cb68f3" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.565747 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-db8rh" Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.570382 4669 patch_prober.go:28] interesting pod/router-default-5444994796-db8rh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 20:46:56 crc kubenswrapper[4669]: [-]has-synced failed: reason withheld Oct 08 20:46:56 crc kubenswrapper[4669]: [+]process-running ok Oct 08 20:46:56 crc kubenswrapper[4669]: healthz check failed Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.570680 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-db8rh" podUID="672eeff6-8079-4f9a-a61c-40094de694be" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.604500 4669 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4xg9t" Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.608731 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:46:56 crc kubenswrapper[4669]: E1008 20:46:56.610013 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:46:57.109991755 +0000 UTC m=+136.802802428 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:56 crc kubenswrapper[4669]: W1008 20:46:56.615328 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0da4600_abf4_4f3e_8299_a269b29ca44a.slice/crio-20edbb19518383def530de391a6b2455a2a68270d88f9ede686820b80d91df35 WatchSource:0}: Error finding container 20edbb19518383def530de391a6b2455a2a68270d88f9ede686820b80d91df35: Status 404 returned error can't find the container with id 20edbb19518383def530de391a6b2455a2a68270d88f9ede686820b80d91df35 Oct 08 20:46:56 crc kubenswrapper[4669]: W1008 20:46:56.618593 4669 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd62a0f2d_49ef_4291_a14d_940a6215347c.slice/crio-3e476465ae9633b78ee1b1cc9319e8e3ac1e9cb149e80974c78e7f8a51f3eb67 WatchSource:0}: Error finding container 3e476465ae9633b78ee1b1cc9319e8e3ac1e9cb149e80974c78e7f8a51f3eb67: Status 404 returned error can't find the container with id 3e476465ae9633b78ee1b1cc9319e8e3ac1e9cb149e80974c78e7f8a51f3eb67 Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.700727 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-zh2xc" podStartSLOduration=114.700709975 podStartE2EDuration="1m54.700709975s" podCreationTimestamp="2025-10-08 20:45:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:46:56.69790015 +0000 UTC m=+136.390710813" watchObservedRunningTime="2025-10-08 20:46:56.700709975 +0000 UTC m=+136.393520648" Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.710854 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:46:56 crc kubenswrapper[4669]: E1008 20:46:56.711434 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:46:57.211211601 +0000 UTC m=+136.904022274 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.738828 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-z99h8" podStartSLOduration=114.738807175 podStartE2EDuration="1m54.738807175s" podCreationTimestamp="2025-10-08 20:45:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:46:56.734400793 +0000 UTC m=+136.427211466" watchObservedRunningTime="2025-10-08 20:46:56.738807175 +0000 UTC m=+136.431617848" Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.775228 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2k2t4" podStartSLOduration=114.775169815 podStartE2EDuration="1m54.775169815s" podCreationTimestamp="2025-10-08 20:45:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:46:56.772744669 +0000 UTC m=+136.465555352" watchObservedRunningTime="2025-10-08 20:46:56.775169815 +0000 UTC m=+136.467980488" Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.812294 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:46:56 crc kubenswrapper[4669]: E1008 20:46:56.812969 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:46:57.312949258 +0000 UTC m=+137.005759931 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.823134 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m54gb" podStartSLOduration=113.823111696 podStartE2EDuration="1m53.823111696s" podCreationTimestamp="2025-10-08 20:45:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:46:56.820973626 +0000 UTC m=+136.513784299" watchObservedRunningTime="2025-10-08 20:46:56.823111696 +0000 UTC m=+136.515922379" Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.848559 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-9fcxn" podStartSLOduration=4.84854142 podStartE2EDuration="4.84854142s" podCreationTimestamp="2025-10-08 20:46:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:46:56.847745681 +0000 UTC 
m=+136.540556374" watchObservedRunningTime="2025-10-08 20:46:56.84854142 +0000 UTC m=+136.541352093" Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.900134 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-ps5hq" podStartSLOduration=114.900110665 podStartE2EDuration="1m54.900110665s" podCreationTimestamp="2025-10-08 20:45:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:46:56.89604982 +0000 UTC m=+136.588860503" watchObservedRunningTime="2025-10-08 20:46:56.900110665 +0000 UTC m=+136.592921338" Oct 08 20:46:56 crc kubenswrapper[4669]: I1008 20:46:56.914517 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:46:56 crc kubenswrapper[4669]: E1008 20:46:56.915074 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:46:57.415057914 +0000 UTC m=+137.107868597 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.015681 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:46:57 crc kubenswrapper[4669]: E1008 20:46:57.016013 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:46:57.515998683 +0000 UTC m=+137.208809356 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.016212 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:46:57 crc kubenswrapper[4669]: E1008 20:46:57.016444 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:46:57.516437664 +0000 UTC m=+137.209248337 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.018049 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-db8rh" podStartSLOduration=115.01802479 podStartE2EDuration="1m55.01802479s" podCreationTimestamp="2025-10-08 20:45:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:46:57.014250882 +0000 UTC m=+136.707061555" watchObservedRunningTime="2025-10-08 20:46:57.01802479 +0000 UTC m=+136.710835473" Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.118203 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:46:57 crc kubenswrapper[4669]: E1008 20:46:57.132168 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:46:57.632142767 +0000 UTC m=+137.324953440 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.132539 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:46:57 crc kubenswrapper[4669]: E1008 20:46:57.132987 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:46:57.632977666 +0000 UTC m=+137.325788339 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.145665 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sv8nt" podStartSLOduration=115.145642703 podStartE2EDuration="1m55.145642703s" podCreationTimestamp="2025-10-08 20:45:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:46:57.096567186 +0000 UTC m=+136.789377879" watchObservedRunningTime="2025-10-08 20:46:57.145642703 +0000 UTC m=+136.838453376" Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.158635 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-449s9" podStartSLOduration=114.158617385 podStartE2EDuration="1m54.158617385s" podCreationTimestamp="2025-10-08 20:45:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:46:57.146905682 +0000 UTC m=+136.839716355" watchObservedRunningTime="2025-10-08 20:46:57.158617385 +0000 UTC m=+136.851428058" Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.233481 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:46:57 crc kubenswrapper[4669]: E1008 20:46:57.233659 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:46:57.733639359 +0000 UTC m=+137.426450042 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.233870 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:46:57 crc kubenswrapper[4669]: E1008 20:46:57.234225 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:46:57.734214322 +0000 UTC m=+137.427024995 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.334636 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:46:57 crc kubenswrapper[4669]: E1008 20:46:57.334955 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:46:57.834931396 +0000 UTC m=+137.527742069 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.335366 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:46:57 crc kubenswrapper[4669]: E1008 20:46:57.335767 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:46:57.835759446 +0000 UTC m=+137.528570119 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.440062 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:46:57 crc kubenswrapper[4669]: E1008 20:46:57.440425 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:46:57.940304148 +0000 UTC m=+137.633114821 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.440575 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:46:57 crc kubenswrapper[4669]: E1008 20:46:57.440964 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:46:57.940942693 +0000 UTC m=+137.633753366 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.541735 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:46:57 crc kubenswrapper[4669]: E1008 20:46:57.541974 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:46:58.041945643 +0000 UTC m=+137.734756326 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.542194 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:46:57 crc kubenswrapper[4669]: E1008 20:46:57.542616 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:46:58.042599248 +0000 UTC m=+137.735409921 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.548594 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-sgf44" event={"ID":"d2628b9d-5e3f-412f-b3c7-f24e6a208577","Type":"ContainerStarted","Data":"505a4e9161ca824d43c0bc3ea26641ca3de5a9973ad91b7d528ae11d23ce9cda"} Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.548735 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-sgf44" Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.550284 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4knb8" event={"ID":"ce159705-9661-4510-a5b8-9e7ac58e524c","Type":"ContainerStarted","Data":"3accee55cab343bd8d2639c4c6a6b2e9090cb2929dd7919e734d1971d50e1b1c"} Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.550334 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4knb8" event={"ID":"ce159705-9661-4510-a5b8-9e7ac58e524c","Type":"ContainerStarted","Data":"5f57d67fc8e3324e6a2d250332cadc0e68f6c38def8a19601c6eba4cc087332f"} Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.550509 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-4knb8" Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.561903 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw" 
event={"ID":"5289f930-ba47-4745-8ab7-784863dc110e","Type":"ContainerStarted","Data":"b51f0a2c76400e6054e82a7c4a383c2e68f798c0a23adf1557b9d6205b559484"} Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.562057 4669 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-4knb8 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.562111 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-4knb8" podUID="ce159705-9661-4510-a5b8-9e7ac58e524c" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.562277 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw" Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.566047 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-4nbtf" event={"ID":"8374c82a-8339-42ff-b216-9ffc87a4d710","Type":"ContainerStarted","Data":"31c50d865a782f8311f255e0325cf9aa2fbba716fcf91d00ed3a78d3a905606f"} Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.566074 4669 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-6l6tw container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.14:6443/healthz\": dial tcp 10.217.0.14:6443: connect: connection refused" start-of-body= Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.566156 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw" 
podUID="5289f930-ba47-4745-8ab7-784863dc110e" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.14:6443/healthz\": dial tcp 10.217.0.14:6443: connect: connection refused" Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.571753 4669 patch_prober.go:28] interesting pod/router-default-5444994796-db8rh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 20:46:57 crc kubenswrapper[4669]: [-]has-synced failed: reason withheld Oct 08 20:46:57 crc kubenswrapper[4669]: [+]process-running ok Oct 08 20:46:57 crc kubenswrapper[4669]: healthz check failed Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.571791 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-db8rh" podUID="672eeff6-8079-4f9a-a61c-40094de694be" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.571952 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r2ncb" event={"ID":"b0dc8a08-0e25-4dba-9e96-089c46deb679","Type":"ContainerStarted","Data":"b7e4a65ad433ca824804555bf6f7f7cf8bfb622864a2570966da57a1f86463e7"} Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.579207 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-sgf44" podStartSLOduration=6.579189693 podStartE2EDuration="6.579189693s" podCreationTimestamp="2025-10-08 20:46:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:46:57.579148742 +0000 UTC m=+137.271959415" watchObservedRunningTime="2025-10-08 20:46:57.579189693 +0000 UTC m=+137.272000356" Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 
20:46:57.585507 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-xndc2" event={"ID":"c6254786-5db9-423f-88ed-f42edefc70a8","Type":"ContainerStarted","Data":"494a6e01fbabefabcac5cf53c2f3ae83dce968cdf8b778a12073209bfd36d879"} Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.589648 4669 generic.go:334] "Generic (PLEG): container finished" podID="96a22bbb-2450-42e2-8c27-8246a89fbb6e" containerID="9f3b48663d41f37f2b3b6bb9841fd66d1906420ee029a38850ee883d3db05200" exitCode=0 Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.589723 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-74lcm" event={"ID":"96a22bbb-2450-42e2-8c27-8246a89fbb6e","Type":"ContainerDied","Data":"9f3b48663d41f37f2b3b6bb9841fd66d1906420ee029a38850ee883d3db05200"} Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.594068 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-9fcxn" event={"ID":"fabcc99d-46ca-4172-918e-b5038c4a001a","Type":"ContainerStarted","Data":"d42ab81920d43c5a8c7a0863092121e6b5bd85e50b64713baeaaf77e21e78c6d"} Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.599382 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-4knb8" podStartSLOduration=114.599366786 podStartE2EDuration="1m54.599366786s" podCreationTimestamp="2025-10-08 20:45:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:46:57.596605851 +0000 UTC m=+137.289416524" watchObservedRunningTime="2025-10-08 20:46:57.599366786 +0000 UTC m=+137.292177459" Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.605833 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-449s9" event={"ID":"9bcf6b7f-1936-4261-a1ee-907951cb68f3","Type":"ContainerStarted","Data":"afb974411d8becdcf23f748d3e1d49c1e8363d90a1490e8747e0982429f1c311"} Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.606847 4669 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-449s9 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body= Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.606946 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-449s9" podUID="9bcf6b7f-1936-4261-a1ee-907951cb68f3" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.614023 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tqhcx" event={"ID":"6364f6d6-6f56-43a4-af6c-9417865f326e","Type":"ContainerStarted","Data":"bbe7b91ee129b9e67f7da752c3654f75ad18b4a1126ed0c06d83b8b3f975070e"} Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.617124 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw" podStartSLOduration=115.61710559 podStartE2EDuration="1m55.61710559s" podCreationTimestamp="2025-10-08 20:45:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:46:57.614451408 +0000 UTC m=+137.307262081" watchObservedRunningTime="2025-10-08 20:46:57.61710559 +0000 UTC m=+137.309916263" Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.618879 4669 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tsh5h" event={"ID":"e2fa8fe7-d057-4386-9121-f4c838bd1a76","Type":"ContainerStarted","Data":"ad5bb7c6731a11632a3721d1b184fb4b9def0d4c74a4587f8db9ec1a96599405"} Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.618932 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tsh5h" event={"ID":"e2fa8fe7-d057-4386-9121-f4c838bd1a76","Type":"ContainerStarted","Data":"87387f3c82818c8773d2249515706df5357c3d2c480dd5f20ca3b66df7175d01"} Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.626126 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f8wcf" event={"ID":"9f1ba963-c886-46de-9428-a18cc4b13eeb","Type":"ContainerStarted","Data":"b33b5732c88938b6f2d88ada7098074b1ae4a4ff7dcaef4a833d917d3902565e"} Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.626165 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f8wcf" event={"ID":"9f1ba963-c886-46de-9428-a18cc4b13eeb","Type":"ContainerStarted","Data":"77d20aaadffcaa003344123cea343523d8dfa087ecb20b6457ce4c0c5c755bc0"} Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.630884 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r2ncb" podStartSLOduration=115.630869672 podStartE2EDuration="1m55.630869672s" podCreationTimestamp="2025-10-08 20:45:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:46:57.629384626 +0000 UTC m=+137.322195299" watchObservedRunningTime="2025-10-08 20:46:57.630869672 +0000 UTC m=+137.323680345" Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.641820 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lgt96" event={"ID":"ee97e8a2-071a-48f9-a13f-d66fb60bbb55","Type":"ContainerStarted","Data":"8f15aab9710aff11db0b9963afe45aed3aaff18e2bd3972f282fe088cfad099a"} Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.643026 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:46:57 crc kubenswrapper[4669]: E1008 20:46:57.643158 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:46:58.143126838 +0000 UTC m=+137.835937521 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.643622 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.643634 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dsg9n" event={"ID":"dca95b5e-6308-41f2-a01d-9cb5c15b8607","Type":"ContainerStarted","Data":"07cbf75ac81c05abf74146b43317bbecf93f2956c5201c76729e50e4e83d1b10"} Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.643658 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dsg9n" event={"ID":"dca95b5e-6308-41f2-a01d-9cb5c15b8607","Type":"ContainerStarted","Data":"f0bfbd878991bfceb77620f96803707b0b90cc8ee36700503e123e406a1bd14c"} Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.644229 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dsg9n" Oct 08 20:46:57 crc kubenswrapper[4669]: E1008 20:46:57.645607 4669 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:46:58.145592905 +0000 UTC m=+137.838403578 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.654759 4669 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-dsg9n container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.18:8443/healthz\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.654804 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dsg9n" podUID="dca95b5e-6308-41f2-a01d-9cb5c15b8607" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.18:8443/healthz\": dial tcp 10.217.0.18:8443: connect: connection refused" Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.662795 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4485l" event={"ID":"4bbf025a-0f57-480a-80ec-4e21b1ef4e69","Type":"ContainerStarted","Data":"19a4841d1fe8b8eb2644d906541dd84e1638edfdca060c05de47c7fe6942771e"} Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.663580 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4485l" Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.665737 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-25b8x" event={"ID":"e8b346cb-e447-4473-a94d-a66882c3af6f","Type":"ContainerStarted","Data":"d5949bc84aa39869a346291460dd8d571ec60f369a8f2870457e2be6d1716e14"} Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.674392 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lwdg7" event={"ID":"9b77bb83-431f-4431-93cc-04be87ab96dc","Type":"ContainerStarted","Data":"c21345439a28dc761eb07bd4cb14d1905cd10f9a270d8fcf375b05c347c75189"} Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.688054 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-4nbtf" podStartSLOduration=115.688034567 podStartE2EDuration="1m55.688034567s" podCreationTimestamp="2025-10-08 20:45:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:46:57.654568345 +0000 UTC m=+137.347379028" watchObservedRunningTime="2025-10-08 20:46:57.688034567 +0000 UTC m=+137.380845240" Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.692787 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xjlqv" event={"ID":"533e765f-918a-456e-8803-dd3e8495f20f","Type":"ContainerStarted","Data":"56a61034388f41b72a277d2e9b722bb03d963aa5b6abd1280e222401f8bfcddd"} Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.706606 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332605-xxrs4" 
event={"ID":"a78d1dcb-341a-40b0-a96d-9ee5af65a2fe","Type":"ContainerStarted","Data":"1672a51d12b5dc977486af2465ec6ce43259309301ef95fcc5f579cc5ab4ac8e"} Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.706656 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332605-xxrs4" event={"ID":"a78d1dcb-341a-40b0-a96d-9ee5af65a2fe","Type":"ContainerStarted","Data":"565ac820321df6837a9ad9afdbebb752f36f36fc154228ae66337d6606e475a5"} Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.718803 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tsh5h" podStartSLOduration=114.718785266 podStartE2EDuration="1m54.718785266s" podCreationTimestamp="2025-10-08 20:45:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:46:57.687634588 +0000 UTC m=+137.380445261" watchObservedRunningTime="2025-10-08 20:46:57.718785266 +0000 UTC m=+137.411595939" Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.719879 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dsg9n" podStartSLOduration=114.719874152 podStartE2EDuration="1m54.719874152s" podCreationTimestamp="2025-10-08 20:45:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:46:57.717433584 +0000 UTC m=+137.410244257" watchObservedRunningTime="2025-10-08 20:46:57.719874152 +0000 UTC m=+137.412684825" Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.732854 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-wg8jg" 
event={"ID":"b5e11824-a189-4e46-9548-641ef4dbe7fe","Type":"ContainerStarted","Data":"efbe2ed7b2c6ef9f1b4f5d640ed2332db4cb31c9a4f5875c3c91a0ba369f4b40"} Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.732928 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-wg8jg" event={"ID":"b5e11824-a189-4e46-9548-641ef4dbe7fe","Type":"ContainerStarted","Data":"4bfc310abec3ab41569d56e0b6ee17decf9f526a3bee63aed61490ff52efb5df"} Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.745692 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:46:57 crc kubenswrapper[4669]: E1008 20:46:57.747196 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:46:58.247177319 +0000 UTC m=+137.939988002 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.766355 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-dtnh6" event={"ID":"f80324fe-a4f1-44ba-a946-acade1dcb4e0","Type":"ContainerStarted","Data":"93f66697738e1cb033209e3fd72093a03c06d7ebab3fb5d7b8d21b7843da8f0b"} Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.799405 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r6qfk" event={"ID":"d62a0f2d-49ef-4291-a14d-940a6215347c","Type":"ContainerStarted","Data":"e28fdc831756ec43a601e667318312d2422ebccfcd787df983633db434252f61"} Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.799475 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r6qfk" event={"ID":"d62a0f2d-49ef-4291-a14d-940a6215347c","Type":"ContainerStarted","Data":"3e476465ae9633b78ee1b1cc9319e8e3ac1e9cb149e80974c78e7f8a51f3eb67"} Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.799931 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r6qfk" Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.806660 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sv8nt" 
event={"ID":"7a558b91-227c-4e11-aa99-4406e545a2ea","Type":"ContainerStarted","Data":"538d57458fd527b5a9e19c2e05c1fc414b7ea4f8e467ed18779c9ac9c2cfa89d"} Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.825355 4669 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-r6qfk container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:5443/healthz\": dial tcp 10.217.0.19:5443: connect: connection refused" start-of-body= Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.825407 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r6qfk" podUID="d62a0f2d-49ef-4291-a14d-940a6215347c" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.19:5443/healthz\": dial tcp 10.217.0.19:5443: connect: connection refused" Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.827941 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wn4pf" event={"ID":"86c26994-b813-4c6e-af79-4c35425bc260","Type":"ContainerStarted","Data":"a86cebef31a55646aef863f2d7be86ebf3e5a1895a0d30654196ab47b56a4664"} Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.841281 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lgt96" podStartSLOduration=114.841256248 podStartE2EDuration="1m54.841256248s" podCreationTimestamp="2025-10-08 20:45:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:46:57.825919489 +0000 UTC m=+137.518730162" watchObservedRunningTime="2025-10-08 20:46:57.841256248 +0000 UTC m=+137.534066921" Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.848062 4669 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-service-ca/service-ca-9c57cc56f-wg8jg" podStartSLOduration=114.848045406 podStartE2EDuration="1m54.848045406s" podCreationTimestamp="2025-10-08 20:45:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:46:57.84477734 +0000 UTC m=+137.537588033" watchObservedRunningTime="2025-10-08 20:46:57.848045406 +0000 UTC m=+137.540856079" Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.848675 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:46:57 crc kubenswrapper[4669]: E1008 20:46:57.850843 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:46:58.350826981 +0000 UTC m=+138.043637664 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.870265 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29332605-xxrs4" podStartSLOduration=114.870244025 podStartE2EDuration="1m54.870244025s" podCreationTimestamp="2025-10-08 20:45:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:46:57.867698065 +0000 UTC m=+137.560508738" watchObservedRunningTime="2025-10-08 20:46:57.870244025 +0000 UTC m=+137.563054698" Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.874060 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9vmlr" event={"ID":"d0da4600-abf4-4f3e-8299-a269b29ca44a","Type":"ContainerStarted","Data":"83e0a14abb4cb3c3a0d25417141fa74c15717cccbf1da0957dae9c6e63bd94b6"} Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.874108 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9vmlr" event={"ID":"d0da4600-abf4-4f3e-8299-a269b29ca44a","Type":"ContainerStarted","Data":"20edbb19518383def530de391a6b2455a2a68270d88f9ede686820b80d91df35"} Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.880027 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-r7zkf" 
event={"ID":"5dd3b6ce-7492-4273-8d1b-86dc10a03404","Type":"ContainerStarted","Data":"3b6d34018b850a1648ad63ba2564c309f66b3316e0d3cca2a1939a7d72abdd1a"} Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.880071 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-r7zkf" event={"ID":"5dd3b6ce-7492-4273-8d1b-86dc10a03404","Type":"ContainerStarted","Data":"306543e66b24f703d993f07fba4deb04952c97110a3c159e0742b5f27ceeeb7f"} Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.891064 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lwdg7" podStartSLOduration=114.891044992 podStartE2EDuration="1m54.891044992s" podCreationTimestamp="2025-10-08 20:45:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:46:57.89016582 +0000 UTC m=+137.582976513" watchObservedRunningTime="2025-10-08 20:46:57.891044992 +0000 UTC m=+137.583855665" Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.893460 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mv26p" event={"ID":"efd51828-51ba-4723-bd28-306e48ce1a54","Type":"ContainerStarted","Data":"3c9da7f72dbbcca2e3659bf7861e546a397b3ea9a551ba6ddcee3b06ad98fffd"} Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.895861 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-kc49q" event={"ID":"318f04b7-d7a9-4e8f-9d0f-03699e258019","Type":"ContainerStarted","Data":"ed14d9ebb237e76baefa81a20a10c64efc6b140b9fba5605c116257092e46f44"} Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.906091 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-dtqw9" 
event={"ID":"0858a203-42e8-4108-a0bd-48ba190bf420","Type":"ContainerStarted","Data":"7560c91b54a48a9683aa74416d42255d4fcf428ec3d56f18577233efcf41e0cb"} Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.906561 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-dtqw9" Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.907665 4669 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-dtqw9 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.907698 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-dtqw9" podUID="0858a203-42e8-4108-a0bd-48ba190bf420" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.910686 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-rtm6r" event={"ID":"e45ff91e-f07d-489b-b0a0-8e815bdf41c3","Type":"ContainerStarted","Data":"45c1813a46021ee4a861f70f8f4daa88042596f4806a62be491cfe9ce82a54d3"} Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.910733 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-rtm6r" event={"ID":"e45ff91e-f07d-489b-b0a0-8e815bdf41c3","Type":"ContainerStarted","Data":"d1602d2644d8318603af977eecdf6c8e147968689f053db2d4b9643a65799526"} Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.914911 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r6qfk" 
podStartSLOduration=114.914895458 podStartE2EDuration="1m54.914895458s" podCreationTimestamp="2025-10-08 20:45:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:46:57.9119775 +0000 UTC m=+137.604788183" watchObservedRunningTime="2025-10-08 20:46:57.914895458 +0000 UTC m=+137.607706131" Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.915424 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7b2xv" event={"ID":"3ab0571a-8beb-446a-9f96-b9bc226d5b22","Type":"ContainerStarted","Data":"35704300eb809b0a940e20a30a0ad203b9a955814cdc3fc0b145f37165eb5381"} Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.924133 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-67zzr" event={"ID":"21147ccd-e3af-44e0-a7fd-2931e731cc53","Type":"ContainerStarted","Data":"a5b69caaa01dbd4084595fd2cfcf5ca91d90b0f058c721515e5226bc7997356c"} Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.924585 4669 patch_prober.go:28] interesting pod/downloads-7954f5f757-z99h8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.924616 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-z99h8" podUID="3b104989-912e-43b1-a295-2ea8eb157e77" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.937385 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4485l" podStartSLOduration=114.937365684 podStartE2EDuration="1m54.937365684s" podCreationTimestamp="2025-10-08 20:45:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:46:57.936875922 +0000 UTC m=+137.629686615" watchObservedRunningTime="2025-10-08 20:46:57.937365684 +0000 UTC m=+137.630176357" Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.950571 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:46:57 crc kubenswrapper[4669]: E1008 20:46:57.953798 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:46:58.453753517 +0000 UTC m=+138.146564190 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.967340 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xjlqv" podStartSLOduration=115.967319704 podStartE2EDuration="1m55.967319704s" podCreationTimestamp="2025-10-08 20:45:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:46:57.957571976 +0000 UTC m=+137.650382649" watchObservedRunningTime="2025-10-08 20:46:57.967319704 +0000 UTC m=+137.660130387" Oct 08 20:46:57 crc kubenswrapper[4669]: I1008 20:46:57.988656 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-rtm6r" podStartSLOduration=114.988635422 podStartE2EDuration="1m54.988635422s" podCreationTimestamp="2025-10-08 20:45:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:46:57.984240339 +0000 UTC m=+137.677051012" watchObservedRunningTime="2025-10-08 20:46:57.988635422 +0000 UTC m=+137.681446095" Oct 08 20:46:58 crc kubenswrapper[4669]: I1008 20:46:58.027689 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7b2xv" podStartSLOduration=115.027666424 podStartE2EDuration="1m55.027666424s" podCreationTimestamp="2025-10-08 20:45:03 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:46:58.00564452 +0000 UTC m=+137.698455193" watchObservedRunningTime="2025-10-08 20:46:58.027666424 +0000 UTC m=+137.720477097" Oct 08 20:46:58 crc kubenswrapper[4669]: I1008 20:46:58.054987 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9vmlr" podStartSLOduration=115.054953081 podStartE2EDuration="1m55.054953081s" podCreationTimestamp="2025-10-08 20:45:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:46:58.028730409 +0000 UTC m=+137.721541082" watchObservedRunningTime="2025-10-08 20:46:58.054953081 +0000 UTC m=+137.747763774" Oct 08 20:46:58 crc kubenswrapper[4669]: I1008 20:46:58.057104 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-dtqw9" podStartSLOduration=116.057088631 podStartE2EDuration="1m56.057088631s" podCreationTimestamp="2025-10-08 20:45:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:46:58.052569495 +0000 UTC m=+137.745380168" watchObservedRunningTime="2025-10-08 20:46:58.057088631 +0000 UTC m=+137.749899334" Oct 08 20:46:58 crc kubenswrapper[4669]: I1008 20:46:58.055331 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:46:58 crc kubenswrapper[4669]: E1008 20:46:58.055622 4669 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:46:58.555607546 +0000 UTC m=+138.248418219 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:58 crc kubenswrapper[4669]: I1008 20:46:58.072773 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-ps5hq" Oct 08 20:46:58 crc kubenswrapper[4669]: I1008 20:46:58.091870 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-67zzr" podStartSLOduration=115.091854634 podStartE2EDuration="1m55.091854634s" podCreationTimestamp="2025-10-08 20:45:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:46:58.090354629 +0000 UTC m=+137.783165302" watchObservedRunningTime="2025-10-08 20:46:58.091854634 +0000 UTC m=+137.784665307" Oct 08 20:46:58 crc kubenswrapper[4669]: I1008 20:46:58.092015 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-kc49q" podStartSLOduration=7.092010037 podStartE2EDuration="7.092010037s" podCreationTimestamp="2025-10-08 20:46:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-08 20:46:58.072421799 +0000 UTC m=+137.765232472" watchObservedRunningTime="2025-10-08 20:46:58.092010037 +0000 UTC m=+137.784820710" Oct 08 20:46:58 crc kubenswrapper[4669]: I1008 20:46:58.112562 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mv26p" podStartSLOduration=115.112517827 podStartE2EDuration="1m55.112517827s" podCreationTimestamp="2025-10-08 20:45:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:46:58.111678117 +0000 UTC m=+137.804488800" watchObservedRunningTime="2025-10-08 20:46:58.112517827 +0000 UTC m=+137.805328500" Oct 08 20:46:58 crc kubenswrapper[4669]: I1008 20:46:58.159490 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:46:58 crc kubenswrapper[4669]: E1008 20:46:58.159916 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:46:58.659902384 +0000 UTC m=+138.352713057 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:58 crc kubenswrapper[4669]: I1008 20:46:58.169986 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-r7zkf" podStartSLOduration=115.169965969 podStartE2EDuration="1m55.169965969s" podCreationTimestamp="2025-10-08 20:45:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:46:58.154721093 +0000 UTC m=+137.847531756" watchObservedRunningTime="2025-10-08 20:46:58.169965969 +0000 UTC m=+137.862776632" Oct 08 20:46:58 crc kubenswrapper[4669]: I1008 20:46:58.200911 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wn4pf" podStartSLOduration=115.200894452 podStartE2EDuration="1m55.200894452s" podCreationTimestamp="2025-10-08 20:45:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:46:58.172820066 +0000 UTC m=+137.865630739" watchObservedRunningTime="2025-10-08 20:46:58.200894452 +0000 UTC m=+137.893705125" Oct 08 20:46:58 crc kubenswrapper[4669]: I1008 20:46:58.262425 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: 
\"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:46:58 crc kubenswrapper[4669]: E1008 20:46:58.262886 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:46:58.76287078 +0000 UTC m=+138.455681453 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:58 crc kubenswrapper[4669]: I1008 20:46:58.364029 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:46:58 crc kubenswrapper[4669]: E1008 20:46:58.364209 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:46:58.864177268 +0000 UTC m=+138.556987941 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:58 crc kubenswrapper[4669]: I1008 20:46:58.364376 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:46:58 crc kubenswrapper[4669]: E1008 20:46:58.364760 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:46:58.864752661 +0000 UTC m=+138.557563334 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:58 crc kubenswrapper[4669]: I1008 20:46:58.465709 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:46:58 crc kubenswrapper[4669]: E1008 20:46:58.466110 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:46:58.966094939 +0000 UTC m=+138.658905612 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:58 crc kubenswrapper[4669]: I1008 20:46:58.567204 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:46:58 crc kubenswrapper[4669]: E1008 20:46:58.567674 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:46:59.067657722 +0000 UTC m=+138.760468395 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:58 crc kubenswrapper[4669]: I1008 20:46:58.569849 4669 patch_prober.go:28] interesting pod/router-default-5444994796-db8rh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 20:46:58 crc kubenswrapper[4669]: [-]has-synced failed: reason withheld Oct 08 20:46:58 crc kubenswrapper[4669]: [+]process-running ok Oct 08 20:46:58 crc kubenswrapper[4669]: healthz check failed Oct 08 20:46:58 crc kubenswrapper[4669]: I1008 20:46:58.569895 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-db8rh" podUID="672eeff6-8079-4f9a-a61c-40094de694be" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 20:46:58 crc kubenswrapper[4669]: I1008 20:46:58.668737 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:46:58 crc kubenswrapper[4669]: E1008 20:46:58.668894 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-08 20:46:59.168865747 +0000 UTC m=+138.861676420 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:58 crc kubenswrapper[4669]: I1008 20:46:58.669154 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:46:58 crc kubenswrapper[4669]: E1008 20:46:58.669653 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:46:59.169630976 +0000 UTC m=+138.862441649 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:58 crc kubenswrapper[4669]: I1008 20:46:58.770332 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:46:58 crc kubenswrapper[4669]: E1008 20:46:58.770585 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:46:59.270559185 +0000 UTC m=+138.963369858 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:58 crc kubenswrapper[4669]: I1008 20:46:58.770717 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:46:58 crc kubenswrapper[4669]: E1008 20:46:58.771064 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:46:59.271055876 +0000 UTC m=+138.963866549 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:58 crc kubenswrapper[4669]: I1008 20:46:58.871953 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:46:58 crc kubenswrapper[4669]: E1008 20:46:58.872192 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:46:59.372164629 +0000 UTC m=+139.064975302 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:58 crc kubenswrapper[4669]: I1008 20:46:58.872251 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:46:58 crc kubenswrapper[4669]: E1008 20:46:58.872579 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:46:59.372571308 +0000 UTC m=+139.065381981 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:58 crc kubenswrapper[4669]: I1008 20:46:58.928844 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4485l" event={"ID":"4bbf025a-0f57-480a-80ec-4e21b1ef4e69","Type":"ContainerStarted","Data":"87d92a423839be7a219b85c7fcf23eaed096ab62856f7771868c7e3aa9c38595"} Oct 08 20:46:58 crc kubenswrapper[4669]: I1008 20:46:58.931522 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-25b8x" event={"ID":"e8b346cb-e447-4473-a94d-a66882c3af6f","Type":"ContainerStarted","Data":"22526ef62c35ecc56f51d04daaec2b2486a3ce087912ae19d5302c97138651ab"} Oct 08 20:46:58 crc kubenswrapper[4669]: I1008 20:46:58.933511 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-74lcm" event={"ID":"96a22bbb-2450-42e2-8c27-8246a89fbb6e","Type":"ContainerStarted","Data":"11f30f13480e2d6a0f2319966431e1d977320b676745e34754137956d412a4bf"} Oct 08 20:46:58 crc kubenswrapper[4669]: I1008 20:46:58.933665 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-74lcm" Oct 08 20:46:58 crc kubenswrapper[4669]: I1008 20:46:58.935207 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-67zzr" 
event={"ID":"21147ccd-e3af-44e0-a7fd-2931e731cc53","Type":"ContainerStarted","Data":"2f8a7edb326412517b2867cf2575e18af99620703e088511446b62c883b665be"} Oct 08 20:46:58 crc kubenswrapper[4669]: I1008 20:46:58.936590 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-dtnh6" event={"ID":"f80324fe-a4f1-44ba-a946-acade1dcb4e0","Type":"ContainerStarted","Data":"4a0b9ffebbf0ef9a9bb87048277d712badf3fe7e5cd6557c9c29b024b5791476"} Oct 08 20:46:58 crc kubenswrapper[4669]: I1008 20:46:58.936635 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-dtnh6" event={"ID":"f80324fe-a4f1-44ba-a946-acade1dcb4e0","Type":"ContainerStarted","Data":"a7e4e8212523aca4e051fcceb90ca1488e7c0eab3ba065432353aea84015ad22"} Oct 08 20:46:58 crc kubenswrapper[4669]: I1008 20:46:58.937961 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-r7zkf" event={"ID":"5dd3b6ce-7492-4273-8d1b-86dc10a03404","Type":"ContainerStarted","Data":"b21d96ab4f5969f49bb2c2d7e7e706b81aede89dd7e1b152fe4f8b01a941cba3"} Oct 08 20:46:58 crc kubenswrapper[4669]: I1008 20:46:58.939219 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-xndc2" event={"ID":"c6254786-5db9-423f-88ed-f42edefc70a8","Type":"ContainerStarted","Data":"17cd4c14b9005a0d24604b87c9a91383ce991cec8bb3e889ef358e718a29866b"} Oct 08 20:46:58 crc kubenswrapper[4669]: I1008 20:46:58.940441 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f8wcf" event={"ID":"9f1ba963-c886-46de-9428-a18cc4b13eeb","Type":"ContainerStarted","Data":"8c18f59542331ad6b898740dc5a3731d0bae23e5770f6270cd8ef97e2bd3c131"} Oct 08 20:46:58 crc kubenswrapper[4669]: I1008 20:46:58.942184 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="hostpath-provisioner/csi-hostpathplugin-tqhcx" event={"ID":"6364f6d6-6f56-43a4-af6c-9417865f326e","Type":"ContainerStarted","Data":"4940247f13b03fe7f90633fc94c940f4f1415a6361bb0b958773a4a403ad5f1a"} Oct 08 20:46:58 crc kubenswrapper[4669]: I1008 20:46:58.952452 4669 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-4knb8 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Oct 08 20:46:58 crc kubenswrapper[4669]: I1008 20:46:58.952487 4669 patch_prober.go:28] interesting pod/downloads-7954f5f757-z99h8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Oct 08 20:46:58 crc kubenswrapper[4669]: I1008 20:46:58.952507 4669 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-dsg9n container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.18:8443/healthz\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Oct 08 20:46:58 crc kubenswrapper[4669]: I1008 20:46:58.952522 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-4knb8" podUID="ce159705-9661-4510-a5b8-9e7ac58e524c" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" Oct 08 20:46:58 crc kubenswrapper[4669]: I1008 20:46:58.952562 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-z99h8" podUID="3b104989-912e-43b1-a295-2ea8eb157e77" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: 
connection refused" Oct 08 20:46:58 crc kubenswrapper[4669]: I1008 20:46:58.952572 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dsg9n" podUID="dca95b5e-6308-41f2-a01d-9cb5c15b8607" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.18:8443/healthz\": dial tcp 10.217.0.18:8443: connect: connection refused" Oct 08 20:46:58 crc kubenswrapper[4669]: I1008 20:46:58.952457 4669 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-dtqw9 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Oct 08 20:46:58 crc kubenswrapper[4669]: I1008 20:46:58.952626 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-dtqw9" podUID="0858a203-42e8-4108-a0bd-48ba190bf420" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Oct 08 20:46:58 crc kubenswrapper[4669]: I1008 20:46:58.952643 4669 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-r6qfk container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:5443/healthz\": dial tcp 10.217.0.19:5443: connect: connection refused" start-of-body= Oct 08 20:46:58 crc kubenswrapper[4669]: I1008 20:46:58.952657 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r6qfk" podUID="d62a0f2d-49ef-4291-a14d-940a6215347c" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.19:5443/healthz\": dial tcp 10.217.0.19:5443: connect: connection refused" Oct 08 20:46:58 crc kubenswrapper[4669]: I1008 20:46:58.957864 
4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-25b8x" podStartSLOduration=116.957839931 podStartE2EDuration="1m56.957839931s" podCreationTimestamp="2025-10-08 20:45:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:46:58.955191449 +0000 UTC m=+138.648002132" watchObservedRunningTime="2025-10-08 20:46:58.957839931 +0000 UTC m=+138.650650604" Oct 08 20:46:58 crc kubenswrapper[4669]: I1008 20:46:58.967472 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw" Oct 08 20:46:58 crc kubenswrapper[4669]: I1008 20:46:58.972838 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:46:58 crc kubenswrapper[4669]: E1008 20:46:58.974171 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:46:59.474151301 +0000 UTC m=+139.166961974 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:58 crc kubenswrapper[4669]: I1008 20:46:58.987373 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-449s9" Oct 08 20:46:59 crc kubenswrapper[4669]: I1008 20:46:59.039119 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f8wcf" podStartSLOduration=117.03910098 podStartE2EDuration="1m57.03910098s" podCreationTimestamp="2025-10-08 20:45:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:46:59.037655856 +0000 UTC m=+138.730466549" watchObservedRunningTime="2025-10-08 20:46:59.03910098 +0000 UTC m=+138.731911653" Oct 08 20:46:59 crc kubenswrapper[4669]: I1008 20:46:59.039562 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-74lcm" podStartSLOduration=117.039556331 podStartE2EDuration="1m57.039556331s" podCreationTimestamp="2025-10-08 20:45:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:46:59.015006337 +0000 UTC m=+138.707817010" watchObservedRunningTime="2025-10-08 20:46:59.039556331 +0000 UTC m=+138.732367004" Oct 08 20:46:59 crc kubenswrapper[4669]: I1008 20:46:59.080517 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:46:59 crc kubenswrapper[4669]: E1008 20:46:59.080966 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:46:59.580949067 +0000 UTC m=+139.273759740 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:59 crc kubenswrapper[4669]: I1008 20:46:59.096497 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-dtnh6" podStartSLOduration=116.09647744 podStartE2EDuration="1m56.09647744s" podCreationTimestamp="2025-10-08 20:45:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:46:59.064550915 +0000 UTC m=+138.757361588" watchObservedRunningTime="2025-10-08 20:46:59.09647744 +0000 UTC m=+138.789288113" Oct 08 20:46:59 crc kubenswrapper[4669]: I1008 20:46:59.099095 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-xndc2" podStartSLOduration=117.099082711 podStartE2EDuration="1m57.099082711s" 
podCreationTimestamp="2025-10-08 20:45:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:46:59.095056678 +0000 UTC m=+138.787867351" watchObservedRunningTime="2025-10-08 20:46:59.099082711 +0000 UTC m=+138.791893384" Oct 08 20:46:59 crc kubenswrapper[4669]: I1008 20:46:59.165203 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-25b8x" Oct 08 20:46:59 crc kubenswrapper[4669]: I1008 20:46:59.165554 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-25b8x" Oct 08 20:46:59 crc kubenswrapper[4669]: I1008 20:46:59.167257 4669 patch_prober.go:28] interesting pod/apiserver-76f77b778f-25b8x container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.29:8443/livez\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body= Oct 08 20:46:59 crc kubenswrapper[4669]: I1008 20:46:59.167294 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-25b8x" podUID="e8b346cb-e447-4473-a94d-a66882c3af6f" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.29:8443/livez\": dial tcp 10.217.0.29:8443: connect: connection refused" Oct 08 20:46:59 crc kubenswrapper[4669]: I1008 20:46:59.181485 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:46:59 crc kubenswrapper[4669]: E1008 20:46:59.181973 4669 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:46:59.681956288 +0000 UTC m=+139.374766961 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:59 crc kubenswrapper[4669]: I1008 20:46:59.282801 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:46:59 crc kubenswrapper[4669]: E1008 20:46:59.283271 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:46:59.783258095 +0000 UTC m=+139.476068768 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:59 crc kubenswrapper[4669]: I1008 20:46:59.384082 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:46:59 crc kubenswrapper[4669]: E1008 20:46:59.384470 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:46:59.88445391 +0000 UTC m=+139.577264583 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:46:59 crc kubenswrapper[4669]: I1008 20:46:59.498060 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:46:59 crc kubenswrapper[4669]: E1008 20:46:59.498739 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:46:59.998499336 +0000 UTC m=+139.691310009 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 20:46:59 crc kubenswrapper[4669]: I1008 20:46:59.554688 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mv26p"
Oct 08 20:46:59 crc kubenswrapper[4669]: I1008 20:46:59.554766 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mv26p"
Oct 08 20:46:59 crc kubenswrapper[4669]: I1008 20:46:59.568880 4669 patch_prober.go:28] interesting pod/router-default-5444994796-db8rh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 08 20:46:59 crc kubenswrapper[4669]: [-]has-synced failed: reason withheld
Oct 08 20:46:59 crc kubenswrapper[4669]: [+]process-running ok
Oct 08 20:46:59 crc kubenswrapper[4669]: healthz check failed
Oct 08 20:46:59 crc kubenswrapper[4669]: I1008 20:46:59.568960 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-db8rh" podUID="672eeff6-8079-4f9a-a61c-40094de694be" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 08 20:46:59 crc kubenswrapper[4669]: I1008 20:46:59.599250 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 08 20:46:59 crc kubenswrapper[4669]: E1008 20:46:59.599849 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:47:00.099830983 +0000 UTC m=+139.792641656 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 20:46:59 crc kubenswrapper[4669]: I1008 20:46:59.700650 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n"
Oct 08 20:46:59 crc kubenswrapper[4669]: E1008 20:46:59.701177 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:47:00.201155191 +0000 UTC m=+139.893965944 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 20:46:59 crc kubenswrapper[4669]: I1008 20:46:59.801745 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 08 20:46:59 crc kubenswrapper[4669]: E1008 20:46:59.802137 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:47:00.30212127 +0000 UTC m=+139.994931933 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 20:46:59 crc kubenswrapper[4669]: I1008 20:46:59.903485 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n"
Oct 08 20:46:59 crc kubenswrapper[4669]: E1008 20:46:59.903799 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:47:00.403786586 +0000 UTC m=+140.096597259 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 20:46:59 crc kubenswrapper[4669]: I1008 20:46:59.949170 4669 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-4knb8 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body=
Oct 08 20:46:59 crc kubenswrapper[4669]: I1008 20:46:59.949224 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-4knb8" podUID="ce159705-9661-4510-a5b8-9e7ac58e524c" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused"
Oct 08 20:46:59 crc kubenswrapper[4669]: I1008 20:46:59.953428 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-dtqw9"
Oct 08 20:46:59 crc kubenswrapper[4669]: I1008 20:46:59.967310 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dsg9n"
Oct 08 20:47:00 crc kubenswrapper[4669]: I1008 20:47:00.004810 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 08 20:47:00 crc kubenswrapper[4669]: E1008 20:47:00.004983 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:47:00.50495642 +0000 UTC m=+140.197767093 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 20:47:00 crc kubenswrapper[4669]: I1008 20:47:00.107537 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n"
Oct 08 20:47:00 crc kubenswrapper[4669]: E1008 20:47:00.108083 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:47:00.60807196 +0000 UTC m=+140.300882633 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 20:47:00 crc kubenswrapper[4669]: I1008 20:47:00.208418 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 08 20:47:00 crc kubenswrapper[4669]: E1008 20:47:00.208710 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:47:00.708694541 +0000 UTC m=+140.401505214 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 20:47:00 crc kubenswrapper[4669]: I1008 20:47:00.309806 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n"
Oct 08 20:47:00 crc kubenswrapper[4669]: E1008 20:47:00.310171 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:47:00.810158512 +0000 UTC m=+140.502969185 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 20:47:00 crc kubenswrapper[4669]: I1008 20:47:00.410681 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 08 20:47:00 crc kubenswrapper[4669]: E1008 20:47:00.410934 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:47:00.910908056 +0000 UTC m=+140.603718729 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 20:47:00 crc kubenswrapper[4669]: I1008 20:47:00.411123 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n"
Oct 08 20:47:00 crc kubenswrapper[4669]: E1008 20:47:00.411548 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:47:00.91151771 +0000 UTC m=+140.604328393 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 20:47:00 crc kubenswrapper[4669]: I1008 20:47:00.512270 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 08 20:47:00 crc kubenswrapper[4669]: E1008 20:47:00.512378 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:47:01.012359217 +0000 UTC m=+140.705169890 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 20:47:00 crc kubenswrapper[4669]: I1008 20:47:00.512628 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n"
Oct 08 20:47:00 crc kubenswrapper[4669]: E1008 20:47:00.512914 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:47:01.01290702 +0000 UTC m=+140.705717693 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 20:47:00 crc kubenswrapper[4669]: I1008 20:47:00.567598 4669 patch_prober.go:28] interesting pod/router-default-5444994796-db8rh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 08 20:47:00 crc kubenswrapper[4669]: [-]has-synced failed: reason withheld
Oct 08 20:47:00 crc kubenswrapper[4669]: [+]process-running ok
Oct 08 20:47:00 crc kubenswrapper[4669]: healthz check failed
Oct 08 20:47:00 crc kubenswrapper[4669]: I1008 20:47:00.567689 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-db8rh" podUID="672eeff6-8079-4f9a-a61c-40094de694be" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 08 20:47:00 crc kubenswrapper[4669]: I1008 20:47:00.614557 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 08 20:47:00 crc kubenswrapper[4669]: E1008 20:47:00.614767 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:47:01.11473653 +0000 UTC m=+140.807547203 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 20:47:00 crc kubenswrapper[4669]: I1008 20:47:00.615021 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n"
Oct 08 20:47:00 crc kubenswrapper[4669]: E1008 20:47:00.615374 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:47:01.115354834 +0000 UTC m=+140.808165567 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 20:47:00 crc kubenswrapper[4669]: I1008 20:47:00.716015 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 08 20:47:00 crc kubenswrapper[4669]: E1008 20:47:00.716217 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:47:01.21618325 +0000 UTC m=+140.908993923 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 20:47:00 crc kubenswrapper[4669]: I1008 20:47:00.716281 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n"
Oct 08 20:47:00 crc kubenswrapper[4669]: E1008 20:47:00.716598 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:47:01.216586709 +0000 UTC m=+140.909397382 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 20:47:00 crc kubenswrapper[4669]: I1008 20:47:00.740606 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mv26p"
Oct 08 20:47:00 crc kubenswrapper[4669]: I1008 20:47:00.817393 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 08 20:47:00 crc kubenswrapper[4669]: E1008 20:47:00.817637 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:47:01.31761032 +0000 UTC m=+141.010420993 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 20:47:00 crc kubenswrapper[4669]: I1008 20:47:00.817749 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n"
Oct 08 20:47:00 crc kubenswrapper[4669]: E1008 20:47:00.818128 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:47:01.318113073 +0000 UTC m=+141.010923806 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 20:47:00 crc kubenswrapper[4669]: I1008 20:47:00.919520 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 08 20:47:00 crc kubenswrapper[4669]: E1008 20:47:00.919676 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:47:01.419646395 +0000 UTC m=+141.112457068 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 20:47:00 crc kubenswrapper[4669]: I1008 20:47:00.919800 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n"
Oct 08 20:47:00 crc kubenswrapper[4669]: E1008 20:47:00.920122 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:47:01.420115156 +0000 UTC m=+141.112925829 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 20:47:00 crc kubenswrapper[4669]: I1008 20:47:00.959388 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mv26p"
Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.020796 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 08 20:47:01 crc kubenswrapper[4669]: E1008 20:47:01.021072 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:47:01.521019884 +0000 UTC m=+141.213830567 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.021857 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n"
Oct 08 20:47:01 crc kubenswrapper[4669]: E1008 20:47:01.022224 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:47:01.522199461 +0000 UTC m=+141.215010264 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.094317 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.095729 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.120012 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.120503 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n"
Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.123185 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 08 20:47:01 crc kubenswrapper[4669]: E1008 20:47:01.123568 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:47:01.62355227 +0000 UTC m=+141.316362943 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.123869 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.189054 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8cmp8"]
Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.190005 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8cmp8"
Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.194413 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.206661 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8cmp8"]
Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.225522 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f4f39fae-0b71-4705-ba1d-712eff8ff21f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f4f39fae-0b71-4705-ba1d-712eff8ff21f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.225824 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n"
Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.225847 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f4f39fae-0b71-4705-ba1d-712eff8ff21f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f4f39fae-0b71-4705-ba1d-712eff8ff21f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 08 20:47:01 crc kubenswrapper[4669]: E1008 20:47:01.226212 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:47:01.726200468 +0000 UTC m=+141.419011141 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.316103 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-s7xfl"]
Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.317269 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s7xfl"
Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.319503 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.326596 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.326885 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j9fx\" (UniqueName: \"kubernetes.io/projected/2d3175d3-ec70-498c-a243-dc5ab9b1efac-kube-api-access-7j9fx\") pod \"community-operators-8cmp8\" (UID: \"2d3175d3-ec70-498c-a243-dc5ab9b1efac\") " pod="openshift-marketplace/community-operators-8cmp8"
Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.326919 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d3175d3-ec70-498c-a243-dc5ab9b1efac-catalog-content\") pod \"community-operators-8cmp8\" (UID: \"2d3175d3-ec70-498c-a243-dc5ab9b1efac\") " pod="openshift-marketplace/community-operators-8cmp8"
Oct 08 20:47:01 crc kubenswrapper[4669]: E1008 20:47:01.326991 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:47:01.826960543 +0000 UTC m=+141.519771206 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.327035 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f4f39fae-0b71-4705-ba1d-712eff8ff21f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f4f39fae-0b71-4705-ba1d-712eff8ff21f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.327096 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d3175d3-ec70-498c-a243-dc5ab9b1efac-utilities\") pod \"community-operators-8cmp8\" (UID: \"2d3175d3-ec70-498c-a243-dc5ab9b1efac\") "
pod="openshift-marketplace/community-operators-8cmp8" Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.327130 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.327129 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f4f39fae-0b71-4705-ba1d-712eff8ff21f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f4f39fae-0b71-4705-ba1d-712eff8ff21f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.327149 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f4f39fae-0b71-4705-ba1d-712eff8ff21f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f4f39fae-0b71-4705-ba1d-712eff8ff21f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 08 20:47:01 crc kubenswrapper[4669]: E1008 20:47:01.327504 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:47:01.827491955 +0000 UTC m=+141.520302628 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.336095 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s7xfl"] Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.411843 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f4f39fae-0b71-4705-ba1d-712eff8ff21f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f4f39fae-0b71-4705-ba1d-712eff8ff21f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.428368 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.428614 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7j9fx\" (UniqueName: \"kubernetes.io/projected/2d3175d3-ec70-498c-a243-dc5ab9b1efac-kube-api-access-7j9fx\") pod \"community-operators-8cmp8\" (UID: \"2d3175d3-ec70-498c-a243-dc5ab9b1efac\") " pod="openshift-marketplace/community-operators-8cmp8" Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.428643 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/2d3175d3-ec70-498c-a243-dc5ab9b1efac-catalog-content\") pod \"community-operators-8cmp8\" (UID: \"2d3175d3-ec70-498c-a243-dc5ab9b1efac\") " pod="openshift-marketplace/community-operators-8cmp8" Oct 08 20:47:01 crc kubenswrapper[4669]: E1008 20:47:01.428676 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:47:01.92864173 +0000 UTC m=+141.621452403 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.428735 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d3175d3-ec70-498c-a243-dc5ab9b1efac-utilities\") pod \"community-operators-8cmp8\" (UID: \"2d3175d3-ec70-498c-a243-dc5ab9b1efac\") " pod="openshift-marketplace/community-operators-8cmp8" Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.428794 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.428937 4669 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17f131f8-064e-407f-affb-af300e3a5867-catalog-content\") pod \"certified-operators-s7xfl\" (UID: \"17f131f8-064e-407f-affb-af300e3a5867\") " pod="openshift-marketplace/certified-operators-s7xfl" Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.429036 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17f131f8-064e-407f-affb-af300e3a5867-utilities\") pod \"certified-operators-s7xfl\" (UID: \"17f131f8-064e-407f-affb-af300e3a5867\") " pod="openshift-marketplace/certified-operators-s7xfl" Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.429088 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p97g9\" (UniqueName: \"kubernetes.io/projected/17f131f8-064e-407f-affb-af300e3a5867-kube-api-access-p97g9\") pod \"certified-operators-s7xfl\" (UID: \"17f131f8-064e-407f-affb-af300e3a5867\") " pod="openshift-marketplace/certified-operators-s7xfl" Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.429118 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d3175d3-ec70-498c-a243-dc5ab9b1efac-utilities\") pod \"community-operators-8cmp8\" (UID: \"2d3175d3-ec70-498c-a243-dc5ab9b1efac\") " pod="openshift-marketplace/community-operators-8cmp8" Oct 08 20:47:01 crc kubenswrapper[4669]: E1008 20:47:01.429254 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:47:01.929237733 +0000 UTC m=+141.622048396 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.429421 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d3175d3-ec70-498c-a243-dc5ab9b1efac-catalog-content\") pod \"community-operators-8cmp8\" (UID: \"2d3175d3-ec70-498c-a243-dc5ab9b1efac\") " pod="openshift-marketplace/community-operators-8cmp8" Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.430406 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.475657 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j9fx\" (UniqueName: \"kubernetes.io/projected/2d3175d3-ec70-498c-a243-dc5ab9b1efac-kube-api-access-7j9fx\") pod \"community-operators-8cmp8\" (UID: \"2d3175d3-ec70-498c-a243-dc5ab9b1efac\") " pod="openshift-marketplace/community-operators-8cmp8" Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.502326 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8cmp8" Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.519881 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4zv6v"] Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.520806 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4zv6v" Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.537627 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:47:01 crc kubenswrapper[4669]: E1008 20:47:01.537759 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:47:02.037732998 +0000 UTC m=+141.730543671 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.537870 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.537932 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/17f131f8-064e-407f-affb-af300e3a5867-catalog-content\") pod \"certified-operators-s7xfl\" (UID: \"17f131f8-064e-407f-affb-af300e3a5867\") " pod="openshift-marketplace/certified-operators-s7xfl" Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.537967 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17f131f8-064e-407f-affb-af300e3a5867-utilities\") pod \"certified-operators-s7xfl\" (UID: \"17f131f8-064e-407f-affb-af300e3a5867\") " pod="openshift-marketplace/certified-operators-s7xfl" Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.537998 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p97g9\" (UniqueName: \"kubernetes.io/projected/17f131f8-064e-407f-affb-af300e3a5867-kube-api-access-p97g9\") pod \"certified-operators-s7xfl\" (UID: \"17f131f8-064e-407f-affb-af300e3a5867\") " pod="openshift-marketplace/certified-operators-s7xfl" Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.538396 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17f131f8-064e-407f-affb-af300e3a5867-catalog-content\") pod \"certified-operators-s7xfl\" (UID: \"17f131f8-064e-407f-affb-af300e3a5867\") " pod="openshift-marketplace/certified-operators-s7xfl" Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.538636 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17f131f8-064e-407f-affb-af300e3a5867-utilities\") pod \"certified-operators-s7xfl\" (UID: \"17f131f8-064e-407f-affb-af300e3a5867\") " pod="openshift-marketplace/certified-operators-s7xfl" Oct 08 20:47:01 crc kubenswrapper[4669]: E1008 20:47:01.538772 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:47:02.038760713 +0000 UTC m=+141.731571476 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.551052 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4zv6v"] Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.571565 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-74lcm" Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.572793 4669 patch_prober.go:28] interesting pod/router-default-5444994796-db8rh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 20:47:01 crc kubenswrapper[4669]: [-]has-synced failed: reason withheld Oct 08 20:47:01 crc kubenswrapper[4669]: [+]process-running ok Oct 08 20:47:01 crc kubenswrapper[4669]: healthz check failed Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.572819 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-db8rh" podUID="672eeff6-8079-4f9a-a61c-40094de694be" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.577697 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p97g9\" (UniqueName: 
\"kubernetes.io/projected/17f131f8-064e-407f-affb-af300e3a5867-kube-api-access-p97g9\") pod \"certified-operators-s7xfl\" (UID: \"17f131f8-064e-407f-affb-af300e3a5867\") " pod="openshift-marketplace/certified-operators-s7xfl" Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.631800 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s7xfl" Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.641967 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.642419 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6c956ef-8cee-4729-aa58-d053415e8d1c-utilities\") pod \"community-operators-4zv6v\" (UID: \"e6c956ef-8cee-4729-aa58-d053415e8d1c\") " pod="openshift-marketplace/community-operators-4zv6v" Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.642499 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpkwj\" (UniqueName: \"kubernetes.io/projected/e6c956ef-8cee-4729-aa58-d053415e8d1c-kube-api-access-hpkwj\") pod \"community-operators-4zv6v\" (UID: \"e6c956ef-8cee-4729-aa58-d053415e8d1c\") " pod="openshift-marketplace/community-operators-4zv6v" Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.642641 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6c956ef-8cee-4729-aa58-d053415e8d1c-catalog-content\") pod \"community-operators-4zv6v\" (UID: 
\"e6c956ef-8cee-4729-aa58-d053415e8d1c\") " pod="openshift-marketplace/community-operators-4zv6v" Oct 08 20:47:01 crc kubenswrapper[4669]: E1008 20:47:01.642847 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:47:02.142833075 +0000 UTC m=+141.835643748 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.736056 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-t7lxk"] Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.737187 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t7lxk" Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.745884 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6c956ef-8cee-4729-aa58-d053415e8d1c-utilities\") pod \"community-operators-4zv6v\" (UID: \"e6c956ef-8cee-4729-aa58-d053415e8d1c\") " pod="openshift-marketplace/community-operators-4zv6v" Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.745910 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpkwj\" (UniqueName: \"kubernetes.io/projected/e6c956ef-8cee-4729-aa58-d053415e8d1c-kube-api-access-hpkwj\") pod \"community-operators-4zv6v\" (UID: \"e6c956ef-8cee-4729-aa58-d053415e8d1c\") " pod="openshift-marketplace/community-operators-4zv6v" Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.745943 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.745976 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6c956ef-8cee-4729-aa58-d053415e8d1c-catalog-content\") pod \"community-operators-4zv6v\" (UID: \"e6c956ef-8cee-4729-aa58-d053415e8d1c\") " pod="openshift-marketplace/community-operators-4zv6v" Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.746176 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t7lxk"] Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.746404 4669 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6c956ef-8cee-4729-aa58-d053415e8d1c-catalog-content\") pod \"community-operators-4zv6v\" (UID: \"e6c956ef-8cee-4729-aa58-d053415e8d1c\") " pod="openshift-marketplace/community-operators-4zv6v" Oct 08 20:47:01 crc kubenswrapper[4669]: E1008 20:47:01.746701 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:47:02.246688221 +0000 UTC m=+141.939498894 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.746871 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6c956ef-8cee-4729-aa58-d053415e8d1c-utilities\") pod \"community-operators-4zv6v\" (UID: \"e6c956ef-8cee-4729-aa58-d053415e8d1c\") " pod="openshift-marketplace/community-operators-4zv6v" Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.784141 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpkwj\" (UniqueName: \"kubernetes.io/projected/e6c956ef-8cee-4729-aa58-d053415e8d1c-kube-api-access-hpkwj\") pod \"community-operators-4zv6v\" (UID: \"e6c956ef-8cee-4729-aa58-d053415e8d1c\") " pod="openshift-marketplace/community-operators-4zv6v" Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.853467 4669 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.853742 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5920535-6f50-4fd4-a0c6-2dd4943ad7d7-utilities\") pod \"certified-operators-t7lxk\" (UID: \"d5920535-6f50-4fd4-a0c6-2dd4943ad7d7\") " pod="openshift-marketplace/certified-operators-t7lxk" Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.853776 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k8rq\" (UniqueName: \"kubernetes.io/projected/d5920535-6f50-4fd4-a0c6-2dd4943ad7d7-kube-api-access-9k8rq\") pod \"certified-operators-t7lxk\" (UID: \"d5920535-6f50-4fd4-a0c6-2dd4943ad7d7\") " pod="openshift-marketplace/certified-operators-t7lxk" Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.853863 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5920535-6f50-4fd4-a0c6-2dd4943ad7d7-catalog-content\") pod \"certified-operators-t7lxk\" (UID: \"d5920535-6f50-4fd4-a0c6-2dd4943ad7d7\") " pod="openshift-marketplace/certified-operators-t7lxk" Oct 08 20:47:01 crc kubenswrapper[4669]: E1008 20:47:01.853948 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:47:02.353933557 +0000 UTC m=+142.046744230 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.856099 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4zv6v" Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.954638 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9k8rq\" (UniqueName: \"kubernetes.io/projected/d5920535-6f50-4fd4-a0c6-2dd4943ad7d7-kube-api-access-9k8rq\") pod \"certified-operators-t7lxk\" (UID: \"d5920535-6f50-4fd4-a0c6-2dd4943ad7d7\") " pod="openshift-marketplace/certified-operators-t7lxk" Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.954944 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.954970 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5920535-6f50-4fd4-a0c6-2dd4943ad7d7-catalog-content\") pod \"certified-operators-t7lxk\" (UID: \"d5920535-6f50-4fd4-a0c6-2dd4943ad7d7\") " pod="openshift-marketplace/certified-operators-t7lxk" Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.955004 4669 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5920535-6f50-4fd4-a0c6-2dd4943ad7d7-utilities\") pod \"certified-operators-t7lxk\" (UID: \"d5920535-6f50-4fd4-a0c6-2dd4943ad7d7\") " pod="openshift-marketplace/certified-operators-t7lxk" Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.955374 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5920535-6f50-4fd4-a0c6-2dd4943ad7d7-utilities\") pod \"certified-operators-t7lxk\" (UID: \"d5920535-6f50-4fd4-a0c6-2dd4943ad7d7\") " pod="openshift-marketplace/certified-operators-t7lxk" Oct 08 20:47:01 crc kubenswrapper[4669]: E1008 20:47:01.955586 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:47:02.455569503 +0000 UTC m=+142.148380176 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.955638 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5920535-6f50-4fd4-a0c6-2dd4943ad7d7-catalog-content\") pod \"certified-operators-t7lxk\" (UID: \"d5920535-6f50-4fd4-a0c6-2dd4943ad7d7\") " pod="openshift-marketplace/certified-operators-t7lxk" Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.979050 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tqhcx" event={"ID":"6364f6d6-6f56-43a4-af6c-9417865f326e","Type":"ContainerStarted","Data":"4951276360a8914456381fa6e6e12e7f340ff68e8409335c20fa1f51e1412fac"} Oct 08 20:47:01 crc kubenswrapper[4669]: I1008 20:47:01.985506 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9k8rq\" (UniqueName: \"kubernetes.io/projected/d5920535-6f50-4fd4-a0c6-2dd4943ad7d7-kube-api-access-9k8rq\") pod \"certified-operators-t7lxk\" (UID: \"d5920535-6f50-4fd4-a0c6-2dd4943ad7d7\") " pod="openshift-marketplace/certified-operators-t7lxk" Oct 08 20:47:02 crc kubenswrapper[4669]: I1008 20:47:02.019351 4669 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 08 20:47:02 crc kubenswrapper[4669]: I1008 20:47:02.058080 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:47:02 crc kubenswrapper[4669]: E1008 20:47:02.059083 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:47:02.559067292 +0000 UTC m=+142.251877965 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:47:02 crc kubenswrapper[4669]: I1008 20:47:02.076003 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t7lxk" Oct 08 20:47:02 crc kubenswrapper[4669]: I1008 20:47:02.135725 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8cmp8"] Oct 08 20:47:02 crc kubenswrapper[4669]: I1008 20:47:02.139210 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 08 20:47:02 crc kubenswrapper[4669]: W1008 20:47:02.153699 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d3175d3_ec70_498c_a243_dc5ab9b1efac.slice/crio-a24215c3fcdcc35ac363d3057f54533a9fd3216d37c65d71f8d4cb09bb3fe1e4 WatchSource:0}: Error finding container a24215c3fcdcc35ac363d3057f54533a9fd3216d37c65d71f8d4cb09bb3fe1e4: Status 404 returned error can't find the container with id a24215c3fcdcc35ac363d3057f54533a9fd3216d37c65d71f8d4cb09bb3fe1e4 Oct 08 20:47:02 crc kubenswrapper[4669]: I1008 20:47:02.175591 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:47:02 crc kubenswrapper[4669]: E1008 20:47:02.175920 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:47:02.675908901 +0000 UTC m=+142.368719564 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:47:02 crc kubenswrapper[4669]: I1008 20:47:02.240973 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s7xfl"] Oct 08 20:47:02 crc kubenswrapper[4669]: I1008 20:47:02.276872 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:47:02 crc kubenswrapper[4669]: E1008 20:47:02.277166 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:47:02.777150427 +0000 UTC m=+142.469961100 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:47:02 crc kubenswrapper[4669]: I1008 20:47:02.379403 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:47:02 crc kubenswrapper[4669]: E1008 20:47:02.380063 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 20:47:02.880049022 +0000 UTC m=+142.572859695 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm28n" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:47:02 crc kubenswrapper[4669]: I1008 20:47:02.412829 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4zv6v"] Oct 08 20:47:02 crc kubenswrapper[4669]: I1008 20:47:02.481080 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:47:02 crc kubenswrapper[4669]: E1008 20:47:02.481491 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 20:47:02.981472062 +0000 UTC m=+142.674282735 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 20:47:02 crc kubenswrapper[4669]: I1008 20:47:02.489621 4669 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-08T20:47:02.019368033Z","Handler":null,"Name":""} Oct 08 20:47:02 crc kubenswrapper[4669]: I1008 20:47:02.490563 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t7lxk"] Oct 08 20:47:02 crc kubenswrapper[4669]: I1008 20:47:02.492492 4669 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 08 20:47:02 crc kubenswrapper[4669]: I1008 20:47:02.492543 4669 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 08 20:47:02 crc kubenswrapper[4669]: I1008 20:47:02.580611 4669 patch_prober.go:28] interesting pod/router-default-5444994796-db8rh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 20:47:02 crc kubenswrapper[4669]: [-]has-synced failed: reason withheld Oct 08 20:47:02 crc kubenswrapper[4669]: [+]process-running ok Oct 08 20:47:02 crc kubenswrapper[4669]: healthz check failed Oct 08 20:47:02 crc kubenswrapper[4669]: I1008 20:47:02.580927 4669 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-db8rh" podUID="672eeff6-8079-4f9a-a61c-40094de694be" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 20:47:02 crc kubenswrapper[4669]: I1008 20:47:02.582548 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:47:02 crc kubenswrapper[4669]: I1008 20:47:02.589136 4669 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 08 20:47:02 crc kubenswrapper[4669]: I1008 20:47:02.589174 4669 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:47:02 crc kubenswrapper[4669]: I1008 20:47:02.631435 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm28n\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:47:02 crc kubenswrapper[4669]: 
I1008 20:47:02.683140 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 20:47:02 crc kubenswrapper[4669]: I1008 20:47:02.692817 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 08 20:47:02 crc kubenswrapper[4669]: I1008 20:47:02.933271 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:47:02 crc kubenswrapper[4669]: I1008 20:47:02.985981 4669 generic.go:334] "Generic (PLEG): container finished" podID="e6c956ef-8cee-4729-aa58-d053415e8d1c" containerID="9ef9d0c4820aa4a376b73d40843db9ce5269bef4071703b0ec49a7edc03c0542" exitCode=0 Oct 08 20:47:02 crc kubenswrapper[4669]: I1008 20:47:02.986587 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4zv6v" event={"ID":"e6c956ef-8cee-4729-aa58-d053415e8d1c","Type":"ContainerDied","Data":"9ef9d0c4820aa4a376b73d40843db9ce5269bef4071703b0ec49a7edc03c0542"} Oct 08 20:47:02 crc kubenswrapper[4669]: I1008 20:47:02.987000 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4zv6v" event={"ID":"e6c956ef-8cee-4729-aa58-d053415e8d1c","Type":"ContainerStarted","Data":"ccf0e6a9bfa12d3804d6517b4af9d03df21aa510167a52987ad72c36141b6c5d"} Oct 08 20:47:02 crc 
kubenswrapper[4669]: I1008 20:47:02.988561 4669 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 20:47:02 crc kubenswrapper[4669]: I1008 20:47:02.992296 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tqhcx" event={"ID":"6364f6d6-6f56-43a4-af6c-9417865f326e","Type":"ContainerStarted","Data":"b4a0a7891e125f7a4383e91d71a4e590067c0f94ed348ec783f553c20a332fa8"} Oct 08 20:47:02 crc kubenswrapper[4669]: I1008 20:47:02.992335 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tqhcx" event={"ID":"6364f6d6-6f56-43a4-af6c-9417865f326e","Type":"ContainerStarted","Data":"da8dc675dfa1ebe61e57300b3f3cfa126b28aba2d15b6854b7c4de6602afdb86"} Oct 08 20:47:02 crc kubenswrapper[4669]: I1008 20:47:02.996190 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f4f39fae-0b71-4705-ba1d-712eff8ff21f","Type":"ContainerStarted","Data":"ec263d24e052b037be578e9d18c93c150cfdb2b5527ea0b393924a0d76f54589"} Oct 08 20:47:02 crc kubenswrapper[4669]: I1008 20:47:02.996258 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f4f39fae-0b71-4705-ba1d-712eff8ff21f","Type":"ContainerStarted","Data":"c3c7c0f8e74241457368a8438fb0975d2dcbc6159e9ad3574ce932c0bdb0ce06"} Oct 08 20:47:02 crc kubenswrapper[4669]: I1008 20:47:02.997849 4669 generic.go:334] "Generic (PLEG): container finished" podID="d5920535-6f50-4fd4-a0c6-2dd4943ad7d7" containerID="b8ce296cae82787a988960c1900587f01c2a94a10c8024f3ab146185db7fff5b" exitCode=0 Oct 08 20:47:02 crc kubenswrapper[4669]: I1008 20:47:02.997967 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t7lxk" 
event={"ID":"d5920535-6f50-4fd4-a0c6-2dd4943ad7d7","Type":"ContainerDied","Data":"b8ce296cae82787a988960c1900587f01c2a94a10c8024f3ab146185db7fff5b"} Oct 08 20:47:02 crc kubenswrapper[4669]: I1008 20:47:02.998043 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t7lxk" event={"ID":"d5920535-6f50-4fd4-a0c6-2dd4943ad7d7","Type":"ContainerStarted","Data":"1e41deff895cc5d481a34e7e17bc20101ffc56d3652271b8659f79e74f8505a0"} Oct 08 20:47:03 crc kubenswrapper[4669]: I1008 20:47:03.015782 4669 generic.go:334] "Generic (PLEG): container finished" podID="2d3175d3-ec70-498c-a243-dc5ab9b1efac" containerID="7d2323fcc94e21d620ea1f1a9854eef4509f78a423e8caa5b88a12e1aa619c25" exitCode=0 Oct 08 20:47:03 crc kubenswrapper[4669]: I1008 20:47:03.015874 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8cmp8" event={"ID":"2d3175d3-ec70-498c-a243-dc5ab9b1efac","Type":"ContainerDied","Data":"7d2323fcc94e21d620ea1f1a9854eef4509f78a423e8caa5b88a12e1aa619c25"} Oct 08 20:47:03 crc kubenswrapper[4669]: I1008 20:47:03.015906 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8cmp8" event={"ID":"2d3175d3-ec70-498c-a243-dc5ab9b1efac","Type":"ContainerStarted","Data":"a24215c3fcdcc35ac363d3057f54533a9fd3216d37c65d71f8d4cb09bb3fe1e4"} Oct 08 20:47:03 crc kubenswrapper[4669]: I1008 20:47:03.027573 4669 generic.go:334] "Generic (PLEG): container finished" podID="17f131f8-064e-407f-affb-af300e3a5867" containerID="4a601df470453269dd05a466589a55149e00cec3de52927062290c449202f9e5" exitCode=0 Oct 08 20:47:03 crc kubenswrapper[4669]: I1008 20:47:03.027625 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s7xfl" event={"ID":"17f131f8-064e-407f-affb-af300e3a5867","Type":"ContainerDied","Data":"4a601df470453269dd05a466589a55149e00cec3de52927062290c449202f9e5"} Oct 08 20:47:03 crc kubenswrapper[4669]: I1008 
20:47:03.027683 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s7xfl" event={"ID":"17f131f8-064e-407f-affb-af300e3a5867","Type":"ContainerStarted","Data":"42fbabe0be401b59607ab124f8c49f0ec5db852c10f8f7468bae57edb9e15a93"} Oct 08 20:47:03 crc kubenswrapper[4669]: I1008 20:47:03.034830 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.034795893 podStartE2EDuration="2.034795893s" podCreationTimestamp="2025-10-08 20:47:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:47:03.023431937 +0000 UTC m=+142.716242630" watchObservedRunningTime="2025-10-08 20:47:03.034795893 +0000 UTC m=+142.727606606" Oct 08 20:47:03 crc kubenswrapper[4669]: I1008 20:47:03.055912 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-tqhcx" podStartSLOduration=12.055887915 podStartE2EDuration="12.055887915s" podCreationTimestamp="2025-10-08 20:46:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:47:03.051403271 +0000 UTC m=+142.744213944" watchObservedRunningTime="2025-10-08 20:47:03.055887915 +0000 UTC m=+142.748698588" Oct 08 20:47:03 crc kubenswrapper[4669]: I1008 20:47:03.208709 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mm28n"] Oct 08 20:47:03 crc kubenswrapper[4669]: I1008 20:47:03.247354 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 08 20:47:03 crc kubenswrapper[4669]: I1008 20:47:03.248059 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 08 20:47:03 crc kubenswrapper[4669]: I1008 20:47:03.252410 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 08 20:47:03 crc kubenswrapper[4669]: I1008 20:47:03.252687 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 08 20:47:03 crc kubenswrapper[4669]: I1008 20:47:03.258253 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 08 20:47:03 crc kubenswrapper[4669]: I1008 20:47:03.312639 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zfsm7"] Oct 08 20:47:03 crc kubenswrapper[4669]: I1008 20:47:03.313567 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zfsm7" Oct 08 20:47:03 crc kubenswrapper[4669]: I1008 20:47:03.316824 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 08 20:47:03 crc kubenswrapper[4669]: I1008 20:47:03.340404 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 08 20:47:03 crc kubenswrapper[4669]: I1008 20:47:03.341313 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zfsm7"] Oct 08 20:47:03 crc kubenswrapper[4669]: I1008 20:47:03.397739 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d-catalog-content\") pod \"redhat-marketplace-zfsm7\" (UID: \"52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d\") " 
pod="openshift-marketplace/redhat-marketplace-zfsm7" Oct 08 20:47:03 crc kubenswrapper[4669]: I1008 20:47:03.398116 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmb9t\" (UniqueName: \"kubernetes.io/projected/52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d-kube-api-access-lmb9t\") pod \"redhat-marketplace-zfsm7\" (UID: \"52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d\") " pod="openshift-marketplace/redhat-marketplace-zfsm7" Oct 08 20:47:03 crc kubenswrapper[4669]: I1008 20:47:03.398179 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d-utilities\") pod \"redhat-marketplace-zfsm7\" (UID: \"52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d\") " pod="openshift-marketplace/redhat-marketplace-zfsm7" Oct 08 20:47:03 crc kubenswrapper[4669]: I1008 20:47:03.398275 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/58d32a06-7489-43f3-8d30-8e47d19da27e-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"58d32a06-7489-43f3-8d30-8e47d19da27e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 08 20:47:03 crc kubenswrapper[4669]: I1008 20:47:03.398374 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/58d32a06-7489-43f3-8d30-8e47d19da27e-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"58d32a06-7489-43f3-8d30-8e47d19da27e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 08 20:47:03 crc kubenswrapper[4669]: I1008 20:47:03.499651 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/58d32a06-7489-43f3-8d30-8e47d19da27e-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: 
\"58d32a06-7489-43f3-8d30-8e47d19da27e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 08 20:47:03 crc kubenswrapper[4669]: I1008 20:47:03.499728 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d-catalog-content\") pod \"redhat-marketplace-zfsm7\" (UID: \"52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d\") " pod="openshift-marketplace/redhat-marketplace-zfsm7" Oct 08 20:47:03 crc kubenswrapper[4669]: I1008 20:47:03.499753 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmb9t\" (UniqueName: \"kubernetes.io/projected/52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d-kube-api-access-lmb9t\") pod \"redhat-marketplace-zfsm7\" (UID: \"52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d\") " pod="openshift-marketplace/redhat-marketplace-zfsm7" Oct 08 20:47:03 crc kubenswrapper[4669]: I1008 20:47:03.499796 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d-utilities\") pod \"redhat-marketplace-zfsm7\" (UID: \"52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d\") " pod="openshift-marketplace/redhat-marketplace-zfsm7" Oct 08 20:47:03 crc kubenswrapper[4669]: I1008 20:47:03.499839 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/58d32a06-7489-43f3-8d30-8e47d19da27e-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"58d32a06-7489-43f3-8d30-8e47d19da27e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 08 20:47:03 crc kubenswrapper[4669]: I1008 20:47:03.499940 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/58d32a06-7489-43f3-8d30-8e47d19da27e-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"58d32a06-7489-43f3-8d30-8e47d19da27e\") " 
pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 08 20:47:03 crc kubenswrapper[4669]: I1008 20:47:03.500447 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d-catalog-content\") pod \"redhat-marketplace-zfsm7\" (UID: \"52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d\") " pod="openshift-marketplace/redhat-marketplace-zfsm7" Oct 08 20:47:03 crc kubenswrapper[4669]: I1008 20:47:03.500588 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d-utilities\") pod \"redhat-marketplace-zfsm7\" (UID: \"52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d\") " pod="openshift-marketplace/redhat-marketplace-zfsm7" Oct 08 20:47:03 crc kubenswrapper[4669]: I1008 20:47:03.519222 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/58d32a06-7489-43f3-8d30-8e47d19da27e-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"58d32a06-7489-43f3-8d30-8e47d19da27e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 08 20:47:03 crc kubenswrapper[4669]: I1008 20:47:03.519352 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmb9t\" (UniqueName: \"kubernetes.io/projected/52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d-kube-api-access-lmb9t\") pod \"redhat-marketplace-zfsm7\" (UID: \"52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d\") " pod="openshift-marketplace/redhat-marketplace-zfsm7" Oct 08 20:47:03 crc kubenswrapper[4669]: I1008 20:47:03.568425 4669 patch_prober.go:28] interesting pod/router-default-5444994796-db8rh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 20:47:03 crc kubenswrapper[4669]: [-]has-synced failed: reason withheld Oct 08 
20:47:03 crc kubenswrapper[4669]: [+]process-running ok Oct 08 20:47:03 crc kubenswrapper[4669]: healthz check failed Oct 08 20:47:03 crc kubenswrapper[4669]: I1008 20:47:03.568491 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-db8rh" podUID="672eeff6-8079-4f9a-a61c-40094de694be" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 20:47:03 crc kubenswrapper[4669]: I1008 20:47:03.586340 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 08 20:47:03 crc kubenswrapper[4669]: I1008 20:47:03.633855 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zfsm7" Oct 08 20:47:03 crc kubenswrapper[4669]: I1008 20:47:03.710952 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-f4nl8"] Oct 08 20:47:03 crc kubenswrapper[4669]: I1008 20:47:03.712943 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f4nl8" Oct 08 20:47:03 crc kubenswrapper[4669]: I1008 20:47:03.723227 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f4nl8"] Oct 08 20:47:03 crc kubenswrapper[4669]: I1008 20:47:03.803341 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxh64\" (UniqueName: \"kubernetes.io/projected/ad4dc180-c88e-4650-8676-0a65909d8abb-kube-api-access-dxh64\") pod \"redhat-marketplace-f4nl8\" (UID: \"ad4dc180-c88e-4650-8676-0a65909d8abb\") " pod="openshift-marketplace/redhat-marketplace-f4nl8" Oct 08 20:47:03 crc kubenswrapper[4669]: I1008 20:47:03.803432 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad4dc180-c88e-4650-8676-0a65909d8abb-catalog-content\") pod \"redhat-marketplace-f4nl8\" (UID: \"ad4dc180-c88e-4650-8676-0a65909d8abb\") " pod="openshift-marketplace/redhat-marketplace-f4nl8" Oct 08 20:47:03 crc kubenswrapper[4669]: I1008 20:47:03.803483 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad4dc180-c88e-4650-8676-0a65909d8abb-utilities\") pod \"redhat-marketplace-f4nl8\" (UID: \"ad4dc180-c88e-4650-8676-0a65909d8abb\") " pod="openshift-marketplace/redhat-marketplace-f4nl8" Oct 08 20:47:03 crc kubenswrapper[4669]: I1008 20:47:03.862575 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zfsm7"] Oct 08 20:47:03 crc kubenswrapper[4669]: W1008 20:47:03.868304 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52e7ead4_34f5_4b4a_8a18_bde6bd3cd62d.slice/crio-6a7162526074761d3fb59e8d1fb9f7f6f71eef715ee393fb3f1c446ae1a0f262 WatchSource:0}: Error 
finding container 6a7162526074761d3fb59e8d1fb9f7f6f71eef715ee393fb3f1c446ae1a0f262: Status 404 returned error can't find the container with id 6a7162526074761d3fb59e8d1fb9f7f6f71eef715ee393fb3f1c446ae1a0f262 Oct 08 20:47:03 crc kubenswrapper[4669]: I1008 20:47:03.905273 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxh64\" (UniqueName: \"kubernetes.io/projected/ad4dc180-c88e-4650-8676-0a65909d8abb-kube-api-access-dxh64\") pod \"redhat-marketplace-f4nl8\" (UID: \"ad4dc180-c88e-4650-8676-0a65909d8abb\") " pod="openshift-marketplace/redhat-marketplace-f4nl8" Oct 08 20:47:03 crc kubenswrapper[4669]: I1008 20:47:03.905373 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad4dc180-c88e-4650-8676-0a65909d8abb-catalog-content\") pod \"redhat-marketplace-f4nl8\" (UID: \"ad4dc180-c88e-4650-8676-0a65909d8abb\") " pod="openshift-marketplace/redhat-marketplace-f4nl8" Oct 08 20:47:03 crc kubenswrapper[4669]: I1008 20:47:03.905438 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad4dc180-c88e-4650-8676-0a65909d8abb-utilities\") pod \"redhat-marketplace-f4nl8\" (UID: \"ad4dc180-c88e-4650-8676-0a65909d8abb\") " pod="openshift-marketplace/redhat-marketplace-f4nl8" Oct 08 20:47:03 crc kubenswrapper[4669]: I1008 20:47:03.906050 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad4dc180-c88e-4650-8676-0a65909d8abb-catalog-content\") pod \"redhat-marketplace-f4nl8\" (UID: \"ad4dc180-c88e-4650-8676-0a65909d8abb\") " pod="openshift-marketplace/redhat-marketplace-f4nl8" Oct 08 20:47:03 crc kubenswrapper[4669]: I1008 20:47:03.906116 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/ad4dc180-c88e-4650-8676-0a65909d8abb-utilities\") pod \"redhat-marketplace-f4nl8\" (UID: \"ad4dc180-c88e-4650-8676-0a65909d8abb\") " pod="openshift-marketplace/redhat-marketplace-f4nl8" Oct 08 20:47:03 crc kubenswrapper[4669]: I1008 20:47:03.927001 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxh64\" (UniqueName: \"kubernetes.io/projected/ad4dc180-c88e-4650-8676-0a65909d8abb-kube-api-access-dxh64\") pod \"redhat-marketplace-f4nl8\" (UID: \"ad4dc180-c88e-4650-8676-0a65909d8abb\") " pod="openshift-marketplace/redhat-marketplace-f4nl8" Oct 08 20:47:04 crc kubenswrapper[4669]: I1008 20:47:04.032877 4669 generic.go:334] "Generic (PLEG): container finished" podID="f4f39fae-0b71-4705-ba1d-712eff8ff21f" containerID="ec263d24e052b037be578e9d18c93c150cfdb2b5527ea0b393924a0d76f54589" exitCode=0 Oct 08 20:47:04 crc kubenswrapper[4669]: I1008 20:47:04.032944 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f4f39fae-0b71-4705-ba1d-712eff8ff21f","Type":"ContainerDied","Data":"ec263d24e052b037be578e9d18c93c150cfdb2b5527ea0b393924a0d76f54589"} Oct 08 20:47:04 crc kubenswrapper[4669]: I1008 20:47:04.034501 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 08 20:47:04 crc kubenswrapper[4669]: I1008 20:47:04.035929 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" event={"ID":"64c20dfc-09aa-4096-b7c1-7233d0a18a17","Type":"ContainerStarted","Data":"edcbee96cdb7309ae439366725b32d7ccf74d2ac6eef89b7bf260c961d23cac7"} Oct 08 20:47:04 crc kubenswrapper[4669]: I1008 20:47:04.035965 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" 
event={"ID":"64c20dfc-09aa-4096-b7c1-7233d0a18a17","Type":"ContainerStarted","Data":"2b3ccf8c73936ec642976dc5d9176921a56604297133d350444e27ea971c0c53"} Oct 08 20:47:04 crc kubenswrapper[4669]: I1008 20:47:04.036079 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:47:04 crc kubenswrapper[4669]: I1008 20:47:04.037294 4669 generic.go:334] "Generic (PLEG): container finished" podID="52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d" containerID="b0400053334477d368e38aaa1eb8dd85cabc7579f786b8ddd5728c7df80a0cb1" exitCode=0 Oct 08 20:47:04 crc kubenswrapper[4669]: I1008 20:47:04.037362 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zfsm7" event={"ID":"52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d","Type":"ContainerDied","Data":"b0400053334477d368e38aaa1eb8dd85cabc7579f786b8ddd5728c7df80a0cb1"} Oct 08 20:47:04 crc kubenswrapper[4669]: I1008 20:47:04.037390 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zfsm7" event={"ID":"52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d","Type":"ContainerStarted","Data":"6a7162526074761d3fb59e8d1fb9f7f6f71eef715ee393fb3f1c446ae1a0f262"} Oct 08 20:47:04 crc kubenswrapper[4669]: I1008 20:47:04.043073 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f4nl8" Oct 08 20:47:04 crc kubenswrapper[4669]: I1008 20:47:04.077790 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" podStartSLOduration=122.077766915 podStartE2EDuration="2m2.077766915s" podCreationTimestamp="2025-10-08 20:45:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:47:04.076281511 +0000 UTC m=+143.769092204" watchObservedRunningTime="2025-10-08 20:47:04.077766915 +0000 UTC m=+143.770577598" Oct 08 20:47:04 crc kubenswrapper[4669]: I1008 20:47:04.122079 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-4m5k5" Oct 08 20:47:04 crc kubenswrapper[4669]: I1008 20:47:04.125704 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-4m5k5" Oct 08 20:47:04 crc kubenswrapper[4669]: I1008 20:47:04.133448 4669 patch_prober.go:28] interesting pod/console-f9d7485db-4m5k5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.23:8443/health\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Oct 08 20:47:04 crc kubenswrapper[4669]: I1008 20:47:04.133516 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-4m5k5" podUID="4adb69f9-4f03-46c0-bd24-fec72c2b1fd9" containerName="console" probeResult="failure" output="Get \"https://10.217.0.23:8443/health\": dial tcp 10.217.0.23:8443: connect: connection refused" Oct 08 20:47:04 crc kubenswrapper[4669]: I1008 20:47:04.137390 4669 patch_prober.go:28] interesting pod/downloads-7954f5f757-z99h8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 
10.217.0.10:8080: connect: connection refused" start-of-body= Oct 08 20:47:04 crc kubenswrapper[4669]: I1008 20:47:04.137442 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-z99h8" podUID="3b104989-912e-43b1-a295-2ea8eb157e77" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Oct 08 20:47:04 crc kubenswrapper[4669]: I1008 20:47:04.137390 4669 patch_prober.go:28] interesting pod/downloads-7954f5f757-z99h8 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Oct 08 20:47:04 crc kubenswrapper[4669]: I1008 20:47:04.137501 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-z99h8" podUID="3b104989-912e-43b1-a295-2ea8eb157e77" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Oct 08 20:47:04 crc kubenswrapper[4669]: I1008 20:47:04.168992 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-25b8x" Oct 08 20:47:04 crc kubenswrapper[4669]: I1008 20:47:04.174985 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-25b8x" Oct 08 20:47:04 crc kubenswrapper[4669]: I1008 20:47:04.318717 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tb2vx"] Oct 08 20:47:04 crc kubenswrapper[4669]: I1008 20:47:04.332729 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tb2vx" Oct 08 20:47:04 crc kubenswrapper[4669]: I1008 20:47:04.335744 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 08 20:47:04 crc kubenswrapper[4669]: I1008 20:47:04.336252 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f4nl8"] Oct 08 20:47:04 crc kubenswrapper[4669]: I1008 20:47:04.345706 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tb2vx"] Oct 08 20:47:04 crc kubenswrapper[4669]: I1008 20:47:04.427775 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1037d4ad-c7a9-4e59-bc6e-9a26d6504430-catalog-content\") pod \"redhat-operators-tb2vx\" (UID: \"1037d4ad-c7a9-4e59-bc6e-9a26d6504430\") " pod="openshift-marketplace/redhat-operators-tb2vx" Oct 08 20:47:04 crc kubenswrapper[4669]: I1008 20:47:04.428144 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1037d4ad-c7a9-4e59-bc6e-9a26d6504430-utilities\") pod \"redhat-operators-tb2vx\" (UID: \"1037d4ad-c7a9-4e59-bc6e-9a26d6504430\") " pod="openshift-marketplace/redhat-operators-tb2vx" Oct 08 20:47:04 crc kubenswrapper[4669]: I1008 20:47:04.428231 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-868h6\" (UniqueName: \"kubernetes.io/projected/1037d4ad-c7a9-4e59-bc6e-9a26d6504430-kube-api-access-868h6\") pod \"redhat-operators-tb2vx\" (UID: \"1037d4ad-c7a9-4e59-bc6e-9a26d6504430\") " pod="openshift-marketplace/redhat-operators-tb2vx" Oct 08 20:47:04 crc kubenswrapper[4669]: I1008 20:47:04.529426 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-868h6\" (UniqueName: \"kubernetes.io/projected/1037d4ad-c7a9-4e59-bc6e-9a26d6504430-kube-api-access-868h6\") pod \"redhat-operators-tb2vx\" (UID: \"1037d4ad-c7a9-4e59-bc6e-9a26d6504430\") " pod="openshift-marketplace/redhat-operators-tb2vx" Oct 08 20:47:04 crc kubenswrapper[4669]: I1008 20:47:04.529576 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1037d4ad-c7a9-4e59-bc6e-9a26d6504430-catalog-content\") pod \"redhat-operators-tb2vx\" (UID: \"1037d4ad-c7a9-4e59-bc6e-9a26d6504430\") " pod="openshift-marketplace/redhat-operators-tb2vx" Oct 08 20:47:04 crc kubenswrapper[4669]: I1008 20:47:04.529626 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1037d4ad-c7a9-4e59-bc6e-9a26d6504430-utilities\") pod \"redhat-operators-tb2vx\" (UID: \"1037d4ad-c7a9-4e59-bc6e-9a26d6504430\") " pod="openshift-marketplace/redhat-operators-tb2vx" Oct 08 20:47:04 crc kubenswrapper[4669]: I1008 20:47:04.530062 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1037d4ad-c7a9-4e59-bc6e-9a26d6504430-catalog-content\") pod \"redhat-operators-tb2vx\" (UID: \"1037d4ad-c7a9-4e59-bc6e-9a26d6504430\") " pod="openshift-marketplace/redhat-operators-tb2vx" Oct 08 20:47:04 crc kubenswrapper[4669]: I1008 20:47:04.530115 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1037d4ad-c7a9-4e59-bc6e-9a26d6504430-utilities\") pod \"redhat-operators-tb2vx\" (UID: \"1037d4ad-c7a9-4e59-bc6e-9a26d6504430\") " pod="openshift-marketplace/redhat-operators-tb2vx" Oct 08 20:47:04 crc kubenswrapper[4669]: I1008 20:47:04.550087 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-868h6\" (UniqueName: 
\"kubernetes.io/projected/1037d4ad-c7a9-4e59-bc6e-9a26d6504430-kube-api-access-868h6\") pod \"redhat-operators-tb2vx\" (UID: \"1037d4ad-c7a9-4e59-bc6e-9a26d6504430\") " pod="openshift-marketplace/redhat-operators-tb2vx" Oct 08 20:47:04 crc kubenswrapper[4669]: I1008 20:47:04.565566 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-db8rh" Oct 08 20:47:04 crc kubenswrapper[4669]: I1008 20:47:04.575959 4669 patch_prober.go:28] interesting pod/router-default-5444994796-db8rh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 20:47:04 crc kubenswrapper[4669]: [-]has-synced failed: reason withheld Oct 08 20:47:04 crc kubenswrapper[4669]: [+]process-running ok Oct 08 20:47:04 crc kubenswrapper[4669]: healthz check failed Oct 08 20:47:04 crc kubenswrapper[4669]: I1008 20:47:04.577007 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-db8rh" podUID="672eeff6-8079-4f9a-a61c-40094de694be" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 20:47:04 crc kubenswrapper[4669]: I1008 20:47:04.661958 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tb2vx" Oct 08 20:47:04 crc kubenswrapper[4669]: I1008 20:47:04.707884 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-22z5d"] Oct 08 20:47:04 crc kubenswrapper[4669]: I1008 20:47:04.709274 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-22z5d" Oct 08 20:47:04 crc kubenswrapper[4669]: I1008 20:47:04.718395 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-22z5d"] Oct 08 20:47:04 crc kubenswrapper[4669]: I1008 20:47:04.839718 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/747cdaf9-bde5-4baa-8b58-1e1e9e882ddd-catalog-content\") pod \"redhat-operators-22z5d\" (UID: \"747cdaf9-bde5-4baa-8b58-1e1e9e882ddd\") " pod="openshift-marketplace/redhat-operators-22z5d" Oct 08 20:47:04 crc kubenswrapper[4669]: I1008 20:47:04.840033 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vmq5\" (UniqueName: \"kubernetes.io/projected/747cdaf9-bde5-4baa-8b58-1e1e9e882ddd-kube-api-access-8vmq5\") pod \"redhat-operators-22z5d\" (UID: \"747cdaf9-bde5-4baa-8b58-1e1e9e882ddd\") " pod="openshift-marketplace/redhat-operators-22z5d" Oct 08 20:47:04 crc kubenswrapper[4669]: I1008 20:47:04.840069 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/747cdaf9-bde5-4baa-8b58-1e1e9e882ddd-utilities\") pod \"redhat-operators-22z5d\" (UID: \"747cdaf9-bde5-4baa-8b58-1e1e9e882ddd\") " pod="openshift-marketplace/redhat-operators-22z5d" Oct 08 20:47:04 crc kubenswrapper[4669]: I1008 20:47:04.926434 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-4knb8" Oct 08 20:47:04 crc kubenswrapper[4669]: I1008 20:47:04.940814 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/747cdaf9-bde5-4baa-8b58-1e1e9e882ddd-catalog-content\") pod \"redhat-operators-22z5d\" (UID: 
\"747cdaf9-bde5-4baa-8b58-1e1e9e882ddd\") " pod="openshift-marketplace/redhat-operators-22z5d" Oct 08 20:47:04 crc kubenswrapper[4669]: I1008 20:47:04.940889 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vmq5\" (UniqueName: \"kubernetes.io/projected/747cdaf9-bde5-4baa-8b58-1e1e9e882ddd-kube-api-access-8vmq5\") pod \"redhat-operators-22z5d\" (UID: \"747cdaf9-bde5-4baa-8b58-1e1e9e882ddd\") " pod="openshift-marketplace/redhat-operators-22z5d" Oct 08 20:47:04 crc kubenswrapper[4669]: I1008 20:47:04.941329 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/747cdaf9-bde5-4baa-8b58-1e1e9e882ddd-catalog-content\") pod \"redhat-operators-22z5d\" (UID: \"747cdaf9-bde5-4baa-8b58-1e1e9e882ddd\") " pod="openshift-marketplace/redhat-operators-22z5d" Oct 08 20:47:04 crc kubenswrapper[4669]: I1008 20:47:04.940927 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/747cdaf9-bde5-4baa-8b58-1e1e9e882ddd-utilities\") pod \"redhat-operators-22z5d\" (UID: \"747cdaf9-bde5-4baa-8b58-1e1e9e882ddd\") " pod="openshift-marketplace/redhat-operators-22z5d" Oct 08 20:47:04 crc kubenswrapper[4669]: I1008 20:47:04.941866 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/747cdaf9-bde5-4baa-8b58-1e1e9e882ddd-utilities\") pod \"redhat-operators-22z5d\" (UID: \"747cdaf9-bde5-4baa-8b58-1e1e9e882ddd\") " pod="openshift-marketplace/redhat-operators-22z5d" Oct 08 20:47:04 crc kubenswrapper[4669]: I1008 20:47:04.958924 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vmq5\" (UniqueName: \"kubernetes.io/projected/747cdaf9-bde5-4baa-8b58-1e1e9e882ddd-kube-api-access-8vmq5\") pod \"redhat-operators-22z5d\" (UID: \"747cdaf9-bde5-4baa-8b58-1e1e9e882ddd\") " 
pod="openshift-marketplace/redhat-operators-22z5d" Oct 08 20:47:05 crc kubenswrapper[4669]: I1008 20:47:05.033753 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tb2vx"] Oct 08 20:47:05 crc kubenswrapper[4669]: W1008 20:47:05.043430 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1037d4ad_c7a9_4e59_bc6e_9a26d6504430.slice/crio-4d764fbad688b51545884f795f61f0747a16e969d3a37565fbef530e3d0d2f40 WatchSource:0}: Error finding container 4d764fbad688b51545884f795f61f0747a16e969d3a37565fbef530e3d0d2f40: Status 404 returned error can't find the container with id 4d764fbad688b51545884f795f61f0747a16e969d3a37565fbef530e3d0d2f40 Oct 08 20:47:05 crc kubenswrapper[4669]: I1008 20:47:05.051100 4669 generic.go:334] "Generic (PLEG): container finished" podID="ad4dc180-c88e-4650-8676-0a65909d8abb" containerID="fb7fb73b2cebc5215ed2255010f79e26cfd6447f00a334c240a2a0bab4708b55" exitCode=0 Oct 08 20:47:05 crc kubenswrapper[4669]: I1008 20:47:05.051319 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f4nl8" event={"ID":"ad4dc180-c88e-4650-8676-0a65909d8abb","Type":"ContainerDied","Data":"fb7fb73b2cebc5215ed2255010f79e26cfd6447f00a334c240a2a0bab4708b55"} Oct 08 20:47:05 crc kubenswrapper[4669]: I1008 20:47:05.051410 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f4nl8" event={"ID":"ad4dc180-c88e-4650-8676-0a65909d8abb","Type":"ContainerStarted","Data":"029a40de032fb83491d59044945cd6851ad94bc337391c07fa4d8d5d5dbe6942"} Oct 08 20:47:05 crc kubenswrapper[4669]: I1008 20:47:05.054255 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-22z5d" Oct 08 20:47:05 crc kubenswrapper[4669]: I1008 20:47:05.056909 4669 generic.go:334] "Generic (PLEG): container finished" podID="a78d1dcb-341a-40b0-a96d-9ee5af65a2fe" containerID="1672a51d12b5dc977486af2465ec6ce43259309301ef95fcc5f579cc5ab4ac8e" exitCode=0 Oct 08 20:47:05 crc kubenswrapper[4669]: I1008 20:47:05.056988 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332605-xxrs4" event={"ID":"a78d1dcb-341a-40b0-a96d-9ee5af65a2fe","Type":"ContainerDied","Data":"1672a51d12b5dc977486af2465ec6ce43259309301ef95fcc5f579cc5ab4ac8e"} Oct 08 20:47:05 crc kubenswrapper[4669]: I1008 20:47:05.067112 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"58d32a06-7489-43f3-8d30-8e47d19da27e","Type":"ContainerStarted","Data":"c27f4ae432b98653639932172aa90ad5ec763b38f16167081056e0c9cd2a9c08"} Oct 08 20:47:05 crc kubenswrapper[4669]: I1008 20:47:05.067165 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"58d32a06-7489-43f3-8d30-8e47d19da27e","Type":"ContainerStarted","Data":"efe6940f3a8bf93ed0341a3acba8e2b2a030ff3087b1541d6806b66f3020474a"} Oct 08 20:47:05 crc kubenswrapper[4669]: I1008 20:47:05.229460 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r6qfk" Oct 08 20:47:05 crc kubenswrapper[4669]: I1008 20:47:05.253177 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.2531517819999998 podStartE2EDuration="2.253151782s" podCreationTimestamp="2025-10-08 20:47:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 
20:47:05.098824416 +0000 UTC m=+144.791635089" watchObservedRunningTime="2025-10-08 20:47:05.253151782 +0000 UTC m=+144.945962455" Oct 08 20:47:05 crc kubenswrapper[4669]: I1008 20:47:05.448037 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-22z5d"] Oct 08 20:47:05 crc kubenswrapper[4669]: I1008 20:47:05.471730 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 08 20:47:05 crc kubenswrapper[4669]: W1008 20:47:05.541702 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod747cdaf9_bde5_4baa_8b58_1e1e9e882ddd.slice/crio-71a00f49d33b3edf908f1a0e64c861d8eae3005e2d02a049cf700f93eb35b848 WatchSource:0}: Error finding container 71a00f49d33b3edf908f1a0e64c861d8eae3005e2d02a049cf700f93eb35b848: Status 404 returned error can't find the container with id 71a00f49d33b3edf908f1a0e64c861d8eae3005e2d02a049cf700f93eb35b848 Oct 08 20:47:05 crc kubenswrapper[4669]: I1008 20:47:05.569078 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-db8rh" Oct 08 20:47:05 crc kubenswrapper[4669]: I1008 20:47:05.571215 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f4f39fae-0b71-4705-ba1d-712eff8ff21f-kube-api-access\") pod \"f4f39fae-0b71-4705-ba1d-712eff8ff21f\" (UID: \"f4f39fae-0b71-4705-ba1d-712eff8ff21f\") " Oct 08 20:47:05 crc kubenswrapper[4669]: I1008 20:47:05.571281 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f4f39fae-0b71-4705-ba1d-712eff8ff21f-kubelet-dir\") pod \"f4f39fae-0b71-4705-ba1d-712eff8ff21f\" (UID: \"f4f39fae-0b71-4705-ba1d-712eff8ff21f\") " Oct 08 20:47:05 crc kubenswrapper[4669]: I1008 
20:47:05.571385 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-db8rh" Oct 08 20:47:05 crc kubenswrapper[4669]: I1008 20:47:05.571475 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4f39fae-0b71-4705-ba1d-712eff8ff21f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "f4f39fae-0b71-4705-ba1d-712eff8ff21f" (UID: "f4f39fae-0b71-4705-ba1d-712eff8ff21f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 20:47:05 crc kubenswrapper[4669]: I1008 20:47:05.571612 4669 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f4f39fae-0b71-4705-ba1d-712eff8ff21f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 08 20:47:05 crc kubenswrapper[4669]: I1008 20:47:05.605788 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4f39fae-0b71-4705-ba1d-712eff8ff21f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f4f39fae-0b71-4705-ba1d-712eff8ff21f" (UID: "f4f39fae-0b71-4705-ba1d-712eff8ff21f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:47:05 crc kubenswrapper[4669]: I1008 20:47:05.673071 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f4f39fae-0b71-4705-ba1d-712eff8ff21f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 08 20:47:06 crc kubenswrapper[4669]: I1008 20:47:06.110397 4669 generic.go:334] "Generic (PLEG): container finished" podID="1037d4ad-c7a9-4e59-bc6e-9a26d6504430" containerID="38d53ad57489a731bdfaac889a252154f4a96ee9679c98c8fb7aebadaa151645" exitCode=0 Oct 08 20:47:06 crc kubenswrapper[4669]: I1008 20:47:06.110571 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tb2vx" event={"ID":"1037d4ad-c7a9-4e59-bc6e-9a26d6504430","Type":"ContainerDied","Data":"38d53ad57489a731bdfaac889a252154f4a96ee9679c98c8fb7aebadaa151645"} Oct 08 20:47:06 crc kubenswrapper[4669]: I1008 20:47:06.110628 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tb2vx" event={"ID":"1037d4ad-c7a9-4e59-bc6e-9a26d6504430","Type":"ContainerStarted","Data":"4d764fbad688b51545884f795f61f0747a16e969d3a37565fbef530e3d0d2f40"} Oct 08 20:47:06 crc kubenswrapper[4669]: I1008 20:47:06.114184 4669 generic.go:334] "Generic (PLEG): container finished" podID="747cdaf9-bde5-4baa-8b58-1e1e9e882ddd" containerID="8b7f915a5314f221a7112fbd5f0d88d02be21a21803cc8d04376011ef700df0b" exitCode=0 Oct 08 20:47:06 crc kubenswrapper[4669]: I1008 20:47:06.114353 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-22z5d" event={"ID":"747cdaf9-bde5-4baa-8b58-1e1e9e882ddd","Type":"ContainerDied","Data":"8b7f915a5314f221a7112fbd5f0d88d02be21a21803cc8d04376011ef700df0b"} Oct 08 20:47:06 crc kubenswrapper[4669]: I1008 20:47:06.114676 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-22z5d" 
event={"ID":"747cdaf9-bde5-4baa-8b58-1e1e9e882ddd","Type":"ContainerStarted","Data":"71a00f49d33b3edf908f1a0e64c861d8eae3005e2d02a049cf700f93eb35b848"} Oct 08 20:47:06 crc kubenswrapper[4669]: I1008 20:47:06.120265 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f4f39fae-0b71-4705-ba1d-712eff8ff21f","Type":"ContainerDied","Data":"c3c7c0f8e74241457368a8438fb0975d2dcbc6159e9ad3574ce932c0bdb0ce06"} Oct 08 20:47:06 crc kubenswrapper[4669]: I1008 20:47:06.120300 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3c7c0f8e74241457368a8438fb0975d2dcbc6159e9ad3574ce932c0bdb0ce06" Oct 08 20:47:06 crc kubenswrapper[4669]: I1008 20:47:06.120364 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 08 20:47:06 crc kubenswrapper[4669]: I1008 20:47:06.128705 4669 generic.go:334] "Generic (PLEG): container finished" podID="58d32a06-7489-43f3-8d30-8e47d19da27e" containerID="c27f4ae432b98653639932172aa90ad5ec763b38f16167081056e0c9cd2a9c08" exitCode=0 Oct 08 20:47:06 crc kubenswrapper[4669]: I1008 20:47:06.128776 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"58d32a06-7489-43f3-8d30-8e47d19da27e","Type":"ContainerDied","Data":"c27f4ae432b98653639932172aa90ad5ec763b38f16167081056e0c9cd2a9c08"} Oct 08 20:47:06 crc kubenswrapper[4669]: I1008 20:47:06.432297 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332605-xxrs4" Oct 08 20:47:06 crc kubenswrapper[4669]: I1008 20:47:06.468308 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-sgf44" Oct 08 20:47:06 crc kubenswrapper[4669]: I1008 20:47:06.597163 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a78d1dcb-341a-40b0-a96d-9ee5af65a2fe-secret-volume\") pod \"a78d1dcb-341a-40b0-a96d-9ee5af65a2fe\" (UID: \"a78d1dcb-341a-40b0-a96d-9ee5af65a2fe\") " Oct 08 20:47:06 crc kubenswrapper[4669]: I1008 20:47:06.597368 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a78d1dcb-341a-40b0-a96d-9ee5af65a2fe-config-volume\") pod \"a78d1dcb-341a-40b0-a96d-9ee5af65a2fe\" (UID: \"a78d1dcb-341a-40b0-a96d-9ee5af65a2fe\") " Oct 08 20:47:06 crc kubenswrapper[4669]: I1008 20:47:06.597551 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spk62\" (UniqueName: \"kubernetes.io/projected/a78d1dcb-341a-40b0-a96d-9ee5af65a2fe-kube-api-access-spk62\") pod \"a78d1dcb-341a-40b0-a96d-9ee5af65a2fe\" (UID: \"a78d1dcb-341a-40b0-a96d-9ee5af65a2fe\") " Oct 08 20:47:06 crc kubenswrapper[4669]: I1008 20:47:06.598109 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a78d1dcb-341a-40b0-a96d-9ee5af65a2fe-config-volume" (OuterVolumeSpecName: "config-volume") pod "a78d1dcb-341a-40b0-a96d-9ee5af65a2fe" (UID: "a78d1dcb-341a-40b0-a96d-9ee5af65a2fe"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:47:06 crc kubenswrapper[4669]: I1008 20:47:06.615871 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a78d1dcb-341a-40b0-a96d-9ee5af65a2fe-kube-api-access-spk62" (OuterVolumeSpecName: "kube-api-access-spk62") pod "a78d1dcb-341a-40b0-a96d-9ee5af65a2fe" (UID: "a78d1dcb-341a-40b0-a96d-9ee5af65a2fe"). InnerVolumeSpecName "kube-api-access-spk62". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:47:06 crc kubenswrapper[4669]: I1008 20:47:06.622090 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a78d1dcb-341a-40b0-a96d-9ee5af65a2fe-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a78d1dcb-341a-40b0-a96d-9ee5af65a2fe" (UID: "a78d1dcb-341a-40b0-a96d-9ee5af65a2fe"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:47:06 crc kubenswrapper[4669]: I1008 20:47:06.701398 4669 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a78d1dcb-341a-40b0-a96d-9ee5af65a2fe-config-volume\") on node \"crc\" DevicePath \"\"" Oct 08 20:47:06 crc kubenswrapper[4669]: I1008 20:47:06.701438 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spk62\" (UniqueName: \"kubernetes.io/projected/a78d1dcb-341a-40b0-a96d-9ee5af65a2fe-kube-api-access-spk62\") on node \"crc\" DevicePath \"\"" Oct 08 20:47:06 crc kubenswrapper[4669]: I1008 20:47:06.701454 4669 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a78d1dcb-341a-40b0-a96d-9ee5af65a2fe-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 08 20:47:07 crc kubenswrapper[4669]: I1008 20:47:07.158034 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332605-xxrs4" Oct 08 20:47:07 crc kubenswrapper[4669]: I1008 20:47:07.158111 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332605-xxrs4" event={"ID":"a78d1dcb-341a-40b0-a96d-9ee5af65a2fe","Type":"ContainerDied","Data":"565ac820321df6837a9ad9afdbebb752f36f36fc154228ae66337d6606e475a5"} Oct 08 20:47:07 crc kubenswrapper[4669]: I1008 20:47:07.158169 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="565ac820321df6837a9ad9afdbebb752f36f36fc154228ae66337d6606e475a5" Oct 08 20:47:07 crc kubenswrapper[4669]: I1008 20:47:07.434981 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 08 20:47:07 crc kubenswrapper[4669]: I1008 20:47:07.622285 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/58d32a06-7489-43f3-8d30-8e47d19da27e-kube-api-access\") pod \"58d32a06-7489-43f3-8d30-8e47d19da27e\" (UID: \"58d32a06-7489-43f3-8d30-8e47d19da27e\") " Oct 08 20:47:07 crc kubenswrapper[4669]: I1008 20:47:07.622388 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/58d32a06-7489-43f3-8d30-8e47d19da27e-kubelet-dir\") pod \"58d32a06-7489-43f3-8d30-8e47d19da27e\" (UID: \"58d32a06-7489-43f3-8d30-8e47d19da27e\") " Oct 08 20:47:07 crc kubenswrapper[4669]: I1008 20:47:07.622711 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/58d32a06-7489-43f3-8d30-8e47d19da27e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "58d32a06-7489-43f3-8d30-8e47d19da27e" (UID: "58d32a06-7489-43f3-8d30-8e47d19da27e"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 20:47:07 crc kubenswrapper[4669]: I1008 20:47:07.644088 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58d32a06-7489-43f3-8d30-8e47d19da27e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "58d32a06-7489-43f3-8d30-8e47d19da27e" (UID: "58d32a06-7489-43f3-8d30-8e47d19da27e"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:47:07 crc kubenswrapper[4669]: I1008 20:47:07.723861 4669 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/58d32a06-7489-43f3-8d30-8e47d19da27e-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 08 20:47:07 crc kubenswrapper[4669]: I1008 20:47:07.723958 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/58d32a06-7489-43f3-8d30-8e47d19da27e-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 08 20:47:08 crc kubenswrapper[4669]: I1008 20:47:08.179066 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 08 20:47:08 crc kubenswrapper[4669]: I1008 20:47:08.179053 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"58d32a06-7489-43f3-8d30-8e47d19da27e","Type":"ContainerDied","Data":"efe6940f3a8bf93ed0341a3acba8e2b2a030ff3087b1541d6806b66f3020474a"} Oct 08 20:47:08 crc kubenswrapper[4669]: I1008 20:47:08.179158 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="efe6940f3a8bf93ed0341a3acba8e2b2a030ff3087b1541d6806b66f3020474a" Oct 08 20:47:08 crc kubenswrapper[4669]: I1008 20:47:08.898713 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:47:10 crc kubenswrapper[4669]: I1008 20:47:10.264564 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:47:10 crc kubenswrapper[4669]: I1008 20:47:10.264627 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:47:10 crc kubenswrapper[4669]: I1008 20:47:10.266492 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" 
(UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:47:10 crc kubenswrapper[4669]: I1008 20:47:10.266576 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:47:10 crc kubenswrapper[4669]: I1008 20:47:10.283790 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:47:10 crc kubenswrapper[4669]: I1008 20:47:10.283887 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:47:10 crc kubenswrapper[4669]: I1008 20:47:10.284088 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:47:10 crc kubenswrapper[4669]: I1008 20:47:10.376695 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:47:10 crc kubenswrapper[4669]: I1008 20:47:10.394827 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 20:47:10 crc kubenswrapper[4669]: I1008 20:47:10.409090 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:47:10 crc kubenswrapper[4669]: I1008 20:47:10.417258 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 20:47:13 crc kubenswrapper[4669]: I1008 20:47:13.185704 4669 patch_prober.go:28] interesting pod/machine-config-daemon-hw2kf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 20:47:13 crc kubenswrapper[4669]: I1008 20:47:13.186070 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 20:47:14 crc kubenswrapper[4669]: I1008 20:47:14.119906 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-4m5k5" Oct 08 20:47:14 crc kubenswrapper[4669]: I1008 20:47:14.123840 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-console/console-f9d7485db-4m5k5" Oct 08 20:47:14 crc kubenswrapper[4669]: I1008 20:47:14.142313 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-z99h8" Oct 08 20:47:15 crc kubenswrapper[4669]: W1008 20:47:15.194410 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-5e20b384efbe87fadaee0b17d51bb4934fccc3aac74c6016ad5c9dbe03f7e2b8 WatchSource:0}: Error finding container 5e20b384efbe87fadaee0b17d51bb4934fccc3aac74c6016ad5c9dbe03f7e2b8: Status 404 returned error can't find the container with id 5e20b384efbe87fadaee0b17d51bb4934fccc3aac74c6016ad5c9dbe03f7e2b8 Oct 08 20:47:15 crc kubenswrapper[4669]: I1008 20:47:15.232686 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"ced8cb191abc1463ddc28533621cfd62b9d29938aa724c7a353db1264212a701"} Oct 08 20:47:15 crc kubenswrapper[4669]: I1008 20:47:15.233735 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"5e20b384efbe87fadaee0b17d51bb4934fccc3aac74c6016ad5c9dbe03f7e2b8"} Oct 08 20:47:16 crc kubenswrapper[4669]: I1008 20:47:16.240425 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"ceb6c56b4926ab2b8babc51f7d7e1e7ce893c2e8a593cded18ed3d5a71e8ad15"} Oct 08 20:47:16 crc kubenswrapper[4669]: I1008 20:47:16.242051 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"bb1b210072c643ca2d5386a3549fbed9e022b2dc5bf1fbdccc2e4e13ec3703fa"} Oct 08 20:47:16 crc kubenswrapper[4669]: I1008 20:47:16.242177 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"a0c42187ef8d07722bc709f4bc78ada44c24715bd8dd61a331543d8910601b8a"} Oct 08 20:47:16 crc kubenswrapper[4669]: I1008 20:47:16.242291 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:47:16 crc kubenswrapper[4669]: I1008 20:47:16.243839 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"e318dbdd8d1963bbc5c6ddfe9b17dd380e649370489b7aa67b3558151f369c78"} Oct 08 20:47:22 crc kubenswrapper[4669]: I1008 20:47:22.937970 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:47:25 crc kubenswrapper[4669]: I1008 20:47:25.803599 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f90eed21-8bc2-4723-b6be-a672669a36fb-metrics-certs\") pod \"network-metrics-daemon-ml9vv\" (UID: \"f90eed21-8bc2-4723-b6be-a672669a36fb\") " pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:47:25 crc kubenswrapper[4669]: I1008 20:47:25.811419 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f90eed21-8bc2-4723-b6be-a672669a36fb-metrics-certs\") pod \"network-metrics-daemon-ml9vv\" (UID: \"f90eed21-8bc2-4723-b6be-a672669a36fb\") " 
pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:47:25 crc kubenswrapper[4669]: I1008 20:47:25.986129 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ml9vv" Oct 08 20:47:31 crc kubenswrapper[4669]: E1008 20:47:31.013987 4669 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 08 20:47:31 crc kubenswrapper[4669]: E1008 20:47:31.014571 4669 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7j9fx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false
,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-8cmp8_openshift-marketplace(2d3175d3-ec70-498c-a243-dc5ab9b1efac): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 08 20:47:31 crc kubenswrapper[4669]: E1008 20:47:31.015837 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-8cmp8" podUID="2d3175d3-ec70-498c-a243-dc5ab9b1efac" Oct 08 20:47:31 crc kubenswrapper[4669]: E1008 20:47:31.673939 4669 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 08 20:47:31 crc kubenswrapper[4669]: E1008 20:47:31.674113 4669 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p97g9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-s7xfl_openshift-marketplace(17f131f8-064e-407f-affb-af300e3a5867): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 08 20:47:31 crc kubenswrapper[4669]: E1008 20:47:31.675274 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-s7xfl" podUID="17f131f8-064e-407f-affb-af300e3a5867" Oct 08 20:47:32 crc 
kubenswrapper[4669]: E1008 20:47:32.343991 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-s7xfl" podUID="17f131f8-064e-407f-affb-af300e3a5867" Oct 08 20:47:32 crc kubenswrapper[4669]: E1008 20:47:32.343974 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-8cmp8" podUID="2d3175d3-ec70-498c-a243-dc5ab9b1efac" Oct 08 20:47:32 crc kubenswrapper[4669]: E1008 20:47:32.494187 4669 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 08 20:47:32 crc kubenswrapper[4669]: E1008 20:47:32.494316 4669 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9k8rq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-t7lxk_openshift-marketplace(d5920535-6f50-4fd4-a0c6-2dd4943ad7d7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 08 20:47:32 crc kubenswrapper[4669]: E1008 20:47:32.495492 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-t7lxk" podUID="d5920535-6f50-4fd4-a0c6-2dd4943ad7d7" Oct 08 20:47:34 crc 
kubenswrapper[4669]: I1008 20:47:34.876834 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4485l" Oct 08 20:47:34 crc kubenswrapper[4669]: E1008 20:47:34.894175 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-t7lxk" podUID="d5920535-6f50-4fd4-a0c6-2dd4943ad7d7" Oct 08 20:47:34 crc kubenswrapper[4669]: E1008 20:47:34.985583 4669 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 08 20:47:34 crc kubenswrapper[4669]: E1008 20:47:34.986057 4669 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dxh64,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-f4nl8_openshift-marketplace(ad4dc180-c88e-4650-8676-0a65909d8abb): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 08 20:47:34 crc kubenswrapper[4669]: E1008 20:47:34.987496 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-f4nl8" podUID="ad4dc180-c88e-4650-8676-0a65909d8abb" Oct 08 20:47:34 crc 
kubenswrapper[4669]: E1008 20:47:34.998000 4669 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 08 20:47:34 crc kubenswrapper[4669]: E1008 20:47:34.998145 4669 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lmb9t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-zfsm7_openshift-marketplace(52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 08 20:47:34 crc kubenswrapper[4669]: E1008 20:47:34.999391 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-zfsm7" podUID="52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d" Oct 08 20:47:38 crc kubenswrapper[4669]: E1008 20:47:38.324365 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-zfsm7" podUID="52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d" Oct 08 20:47:38 crc kubenswrapper[4669]: E1008 20:47:38.324905 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-f4nl8" podUID="ad4dc180-c88e-4650-8676-0a65909d8abb" Oct 08 20:47:38 crc kubenswrapper[4669]: I1008 20:47:38.718025 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ml9vv"] Oct 08 20:47:38 crc kubenswrapper[4669]: W1008 20:47:38.727691 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf90eed21_8bc2_4723_b6be_a672669a36fb.slice/crio-a61e6a09ba7f416d4e76c0d9508560551a7a631b72b1f648f056a0ef18f41c61 WatchSource:0}: Error finding container 
a61e6a09ba7f416d4e76c0d9508560551a7a631b72b1f648f056a0ef18f41c61: Status 404 returned error can't find the container with id a61e6a09ba7f416d4e76c0d9508560551a7a631b72b1f648f056a0ef18f41c61 Oct 08 20:47:39 crc kubenswrapper[4669]: I1008 20:47:39.376334 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ml9vv" event={"ID":"f90eed21-8bc2-4723-b6be-a672669a36fb","Type":"ContainerStarted","Data":"84cd106d41eda6095ae15486da6d6969867a6e051581224b8cba4323e7d042ac"} Oct 08 20:47:39 crc kubenswrapper[4669]: I1008 20:47:39.376731 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ml9vv" event={"ID":"f90eed21-8bc2-4723-b6be-a672669a36fb","Type":"ContainerStarted","Data":"b5efa4cd2f8981d3c75a72b1b573b33d4c78fcef26f888963b2b5fdca1559ee4"} Oct 08 20:47:39 crc kubenswrapper[4669]: I1008 20:47:39.376749 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ml9vv" event={"ID":"f90eed21-8bc2-4723-b6be-a672669a36fb","Type":"ContainerStarted","Data":"a61e6a09ba7f416d4e76c0d9508560551a7a631b72b1f648f056a0ef18f41c61"} Oct 08 20:47:39 crc kubenswrapper[4669]: I1008 20:47:39.379612 4669 generic.go:334] "Generic (PLEG): container finished" podID="e6c956ef-8cee-4729-aa58-d053415e8d1c" containerID="33978c9845708dcdb4bb9939b8397d34957770a7a18ece3aec80731060c3ef76" exitCode=0 Oct 08 20:47:39 crc kubenswrapper[4669]: I1008 20:47:39.379710 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4zv6v" event={"ID":"e6c956ef-8cee-4729-aa58-d053415e8d1c","Type":"ContainerDied","Data":"33978c9845708dcdb4bb9939b8397d34957770a7a18ece3aec80731060c3ef76"} Oct 08 20:47:39 crc kubenswrapper[4669]: I1008 20:47:39.383378 4669 generic.go:334] "Generic (PLEG): container finished" podID="747cdaf9-bde5-4baa-8b58-1e1e9e882ddd" containerID="209c43ef2d843d10eaf9d817c33e26ae5d16e950392fdef21f73e37913c0b8a4" exitCode=0 Oct 08 
20:47:39 crc kubenswrapper[4669]: I1008 20:47:39.383497 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-22z5d" event={"ID":"747cdaf9-bde5-4baa-8b58-1e1e9e882ddd","Type":"ContainerDied","Data":"209c43ef2d843d10eaf9d817c33e26ae5d16e950392fdef21f73e37913c0b8a4"} Oct 08 20:47:39 crc kubenswrapper[4669]: I1008 20:47:39.385406 4669 generic.go:334] "Generic (PLEG): container finished" podID="1037d4ad-c7a9-4e59-bc6e-9a26d6504430" containerID="ea8fd041e60c8057a5d97fcfb13ad74ac4c47bd3f239ba7d61cea90adaa3e5d5" exitCode=0 Oct 08 20:47:39 crc kubenswrapper[4669]: I1008 20:47:39.385444 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tb2vx" event={"ID":"1037d4ad-c7a9-4e59-bc6e-9a26d6504430","Type":"ContainerDied","Data":"ea8fd041e60c8057a5d97fcfb13ad74ac4c47bd3f239ba7d61cea90adaa3e5d5"} Oct 08 20:47:39 crc kubenswrapper[4669]: I1008 20:47:39.402572 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-ml9vv" podStartSLOduration=157.40248502 podStartE2EDuration="2m37.40248502s" podCreationTimestamp="2025-10-08 20:45:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:47:39.39959054 +0000 UTC m=+179.092401233" watchObservedRunningTime="2025-10-08 20:47:39.40248502 +0000 UTC m=+179.095295703" Oct 08 20:47:40 crc kubenswrapper[4669]: I1008 20:47:40.396966 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4zv6v" event={"ID":"e6c956ef-8cee-4729-aa58-d053415e8d1c","Type":"ContainerStarted","Data":"47b556a28be66294c1ba2ecdb1946f3249770d000201f8e8243e7c5d01314fb8"} Oct 08 20:47:40 crc kubenswrapper[4669]: I1008 20:47:40.402869 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-22z5d" 
event={"ID":"747cdaf9-bde5-4baa-8b58-1e1e9e882ddd","Type":"ContainerStarted","Data":"04c584d30eeaf412d005a9516d617dd72422fa4a26d38ef53dc53dfac56ed19c"} Oct 08 20:47:40 crc kubenswrapper[4669]: I1008 20:47:40.428255 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4zv6v" podStartSLOduration=2.522524287 podStartE2EDuration="39.428232839s" podCreationTimestamp="2025-10-08 20:47:01 +0000 UTC" firstStartedPulling="2025-10-08 20:47:02.988188353 +0000 UTC m=+142.680999036" lastFinishedPulling="2025-10-08 20:47:39.893896905 +0000 UTC m=+179.586707588" observedRunningTime="2025-10-08 20:47:40.427453728 +0000 UTC m=+180.120264461" watchObservedRunningTime="2025-10-08 20:47:40.428232839 +0000 UTC m=+180.121043522" Oct 08 20:47:40 crc kubenswrapper[4669]: I1008 20:47:40.452211 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-22z5d" podStartSLOduration=2.725697846 podStartE2EDuration="36.452188603s" podCreationTimestamp="2025-10-08 20:47:04 +0000 UTC" firstStartedPulling="2025-10-08 20:47:06.116013246 +0000 UTC m=+145.808823919" lastFinishedPulling="2025-10-08 20:47:39.842503983 +0000 UTC m=+179.535314676" observedRunningTime="2025-10-08 20:47:40.447829502 +0000 UTC m=+180.140640185" watchObservedRunningTime="2025-10-08 20:47:40.452188603 +0000 UTC m=+180.144999296" Oct 08 20:47:41 crc kubenswrapper[4669]: I1008 20:47:41.421541 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tb2vx" event={"ID":"1037d4ad-c7a9-4e59-bc6e-9a26d6504430","Type":"ContainerStarted","Data":"159eb6815c91b4181411eefe739d71ce23912d96d70a697a00525305fef14042"} Oct 08 20:47:41 crc kubenswrapper[4669]: I1008 20:47:41.458306 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tb2vx" podStartSLOduration=3.319303461 podStartE2EDuration="37.458278427s" 
podCreationTimestamp="2025-10-08 20:47:04 +0000 UTC" firstStartedPulling="2025-10-08 20:47:06.113743483 +0000 UTC m=+145.806554156" lastFinishedPulling="2025-10-08 20:47:40.252718449 +0000 UTC m=+179.945529122" observedRunningTime="2025-10-08 20:47:41.449122594 +0000 UTC m=+181.141933267" watchObservedRunningTime="2025-10-08 20:47:41.458278427 +0000 UTC m=+181.151089120" Oct 08 20:47:41 crc kubenswrapper[4669]: I1008 20:47:41.857093 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4zv6v" Oct 08 20:47:41 crc kubenswrapper[4669]: I1008 20:47:41.857166 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4zv6v" Oct 08 20:47:42 crc kubenswrapper[4669]: I1008 20:47:42.979340 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-4zv6v" podUID="e6c956ef-8cee-4729-aa58-d053415e8d1c" containerName="registry-server" probeResult="failure" output=< Oct 08 20:47:42 crc kubenswrapper[4669]: timeout: failed to connect service ":50051" within 1s Oct 08 20:47:42 crc kubenswrapper[4669]: > Oct 08 20:47:43 crc kubenswrapper[4669]: I1008 20:47:43.185228 4669 patch_prober.go:28] interesting pod/machine-config-daemon-hw2kf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 20:47:43 crc kubenswrapper[4669]: I1008 20:47:43.185657 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 20:47:44 crc kubenswrapper[4669]: I1008 20:47:44.665312 4669 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tb2vx" Oct 08 20:47:44 crc kubenswrapper[4669]: I1008 20:47:44.665368 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tb2vx" Oct 08 20:47:45 crc kubenswrapper[4669]: I1008 20:47:45.054981 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-22z5d" Oct 08 20:47:45 crc kubenswrapper[4669]: I1008 20:47:45.055327 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-22z5d" Oct 08 20:47:45 crc kubenswrapper[4669]: I1008 20:47:45.443896 4669 generic.go:334] "Generic (PLEG): container finished" podID="2d3175d3-ec70-498c-a243-dc5ab9b1efac" containerID="9b25b2e0113229652e783512187d9dbc4de5b8a460cf602a8d127eabe6b3e5a7" exitCode=0 Oct 08 20:47:45 crc kubenswrapper[4669]: I1008 20:47:45.443946 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8cmp8" event={"ID":"2d3175d3-ec70-498c-a243-dc5ab9b1efac","Type":"ContainerDied","Data":"9b25b2e0113229652e783512187d9dbc4de5b8a460cf602a8d127eabe6b3e5a7"} Oct 08 20:47:45 crc kubenswrapper[4669]: I1008 20:47:45.446587 4669 generic.go:334] "Generic (PLEG): container finished" podID="17f131f8-064e-407f-affb-af300e3a5867" containerID="5dd3c683568b81cff55f6a1e9f05b632afbba33b1eae240969a5271768ee8f08" exitCode=0 Oct 08 20:47:45 crc kubenswrapper[4669]: I1008 20:47:45.446616 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s7xfl" event={"ID":"17f131f8-064e-407f-affb-af300e3a5867","Type":"ContainerDied","Data":"5dd3c683568b81cff55f6a1e9f05b632afbba33b1eae240969a5271768ee8f08"} Oct 08 20:47:45 crc kubenswrapper[4669]: I1008 20:47:45.705911 4669 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-tb2vx" podUID="1037d4ad-c7a9-4e59-bc6e-9a26d6504430" containerName="registry-server" probeResult="failure" output=< Oct 08 20:47:45 crc kubenswrapper[4669]: timeout: failed to connect service ":50051" within 1s Oct 08 20:47:45 crc kubenswrapper[4669]: > Oct 08 20:47:46 crc kubenswrapper[4669]: I1008 20:47:46.097633 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-22z5d" podUID="747cdaf9-bde5-4baa-8b58-1e1e9e882ddd" containerName="registry-server" probeResult="failure" output=< Oct 08 20:47:46 crc kubenswrapper[4669]: timeout: failed to connect service ":50051" within 1s Oct 08 20:47:46 crc kubenswrapper[4669]: > Oct 08 20:47:46 crc kubenswrapper[4669]: I1008 20:47:46.453613 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8cmp8" event={"ID":"2d3175d3-ec70-498c-a243-dc5ab9b1efac","Type":"ContainerStarted","Data":"7238142641a6ed40993721b66078a2ec814b4f7cb8fb30c182e18f8628f514cb"} Oct 08 20:47:46 crc kubenswrapper[4669]: I1008 20:47:46.456583 4669 generic.go:334] "Generic (PLEG): container finished" podID="d5920535-6f50-4fd4-a0c6-2dd4943ad7d7" containerID="54a95a16d76b3096fdf299e8e8eb69906f6d5be02cdb3d8404f4947af6931748" exitCode=0 Oct 08 20:47:46 crc kubenswrapper[4669]: I1008 20:47:46.456649 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t7lxk" event={"ID":"d5920535-6f50-4fd4-a0c6-2dd4943ad7d7","Type":"ContainerDied","Data":"54a95a16d76b3096fdf299e8e8eb69906f6d5be02cdb3d8404f4947af6931748"} Oct 08 20:47:46 crc kubenswrapper[4669]: I1008 20:47:46.461123 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s7xfl" event={"ID":"17f131f8-064e-407f-affb-af300e3a5867","Type":"ContainerStarted","Data":"8e46faafb5b538a65e2dedca8bb4062be2962d7309d8ab78fe68774a812cf587"} Oct 08 20:47:46 crc kubenswrapper[4669]: I1008 
20:47:46.497176 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8cmp8" podStartSLOduration=2.248249217 podStartE2EDuration="45.497159915s" podCreationTimestamp="2025-10-08 20:47:01 +0000 UTC" firstStartedPulling="2025-10-08 20:47:03.017846957 +0000 UTC m=+142.710657630" lastFinishedPulling="2025-10-08 20:47:46.266757655 +0000 UTC m=+185.959568328" observedRunningTime="2025-10-08 20:47:46.478498478 +0000 UTC m=+186.171309151" watchObservedRunningTime="2025-10-08 20:47:46.497159915 +0000 UTC m=+186.189970588" Oct 08 20:47:46 crc kubenswrapper[4669]: I1008 20:47:46.510073 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-s7xfl" podStartSLOduration=2.69302936 podStartE2EDuration="45.510056401s" podCreationTimestamp="2025-10-08 20:47:01 +0000 UTC" firstStartedPulling="2025-10-08 20:47:03.054675597 +0000 UTC m=+142.747486270" lastFinishedPulling="2025-10-08 20:47:45.871702638 +0000 UTC m=+185.564513311" observedRunningTime="2025-10-08 20:47:46.508607572 +0000 UTC m=+186.201418255" watchObservedRunningTime="2025-10-08 20:47:46.510056401 +0000 UTC m=+186.202867074" Oct 08 20:47:48 crc kubenswrapper[4669]: I1008 20:47:48.494523 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t7lxk" event={"ID":"d5920535-6f50-4fd4-a0c6-2dd4943ad7d7","Type":"ContainerStarted","Data":"325dee95d6c07d70478f6fc2415c330961d50d342119b1a601a96ad8db113267"} Oct 08 20:47:50 crc kubenswrapper[4669]: I1008 20:47:50.417599 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 20:47:50 crc kubenswrapper[4669]: I1008 20:47:50.435990 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-t7lxk" podStartSLOduration=5.030698238 podStartE2EDuration="49.435971956s" 
podCreationTimestamp="2025-10-08 20:47:01 +0000 UTC" firstStartedPulling="2025-10-08 20:47:03.014204872 +0000 UTC m=+142.707015545" lastFinishedPulling="2025-10-08 20:47:47.41947857 +0000 UTC m=+187.112289263" observedRunningTime="2025-10-08 20:47:48.532945877 +0000 UTC m=+188.225756600" watchObservedRunningTime="2025-10-08 20:47:50.435971956 +0000 UTC m=+190.128782639" Oct 08 20:47:51 crc kubenswrapper[4669]: I1008 20:47:51.502501 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8cmp8" Oct 08 20:47:51 crc kubenswrapper[4669]: I1008 20:47:51.502666 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8cmp8" Oct 08 20:47:51 crc kubenswrapper[4669]: I1008 20:47:51.606505 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8cmp8" Oct 08 20:47:51 crc kubenswrapper[4669]: I1008 20:47:51.633152 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-s7xfl" Oct 08 20:47:51 crc kubenswrapper[4669]: I1008 20:47:51.633265 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-s7xfl" Oct 08 20:47:51 crc kubenswrapper[4669]: I1008 20:47:51.654877 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8cmp8" Oct 08 20:47:51 crc kubenswrapper[4669]: I1008 20:47:51.677228 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-s7xfl" Oct 08 20:47:51 crc kubenswrapper[4669]: I1008 20:47:51.897664 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4zv6v" Oct 08 20:47:51 crc kubenswrapper[4669]: I1008 20:47:51.945630 4669 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4zv6v" Oct 08 20:47:52 crc kubenswrapper[4669]: I1008 20:47:52.077471 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-t7lxk" Oct 08 20:47:52 crc kubenswrapper[4669]: I1008 20:47:52.078516 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-t7lxk" Oct 08 20:47:52 crc kubenswrapper[4669]: I1008 20:47:52.135697 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-t7lxk" Oct 08 20:47:52 crc kubenswrapper[4669]: I1008 20:47:52.568877 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-s7xfl" Oct 08 20:47:52 crc kubenswrapper[4669]: I1008 20:47:52.578828 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-t7lxk" Oct 08 20:47:53 crc kubenswrapper[4669]: I1008 20:47:53.837375 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4zv6v"] Oct 08 20:47:53 crc kubenswrapper[4669]: I1008 20:47:53.838059 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4zv6v" podUID="e6c956ef-8cee-4729-aa58-d053415e8d1c" containerName="registry-server" containerID="cri-o://47b556a28be66294c1ba2ecdb1946f3249770d000201f8e8243e7c5d01314fb8" gracePeriod=2 Oct 08 20:47:54 crc kubenswrapper[4669]: I1008 20:47:54.703382 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tb2vx" Oct 08 20:47:54 crc kubenswrapper[4669]: I1008 20:47:54.741447 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tb2vx" Oct 08 20:47:55 crc kubenswrapper[4669]: 
I1008 20:47:55.113135 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-22z5d" Oct 08 20:47:55 crc kubenswrapper[4669]: I1008 20:47:55.174453 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-22z5d" Oct 08 20:47:55 crc kubenswrapper[4669]: I1008 20:47:55.455309 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4zv6v" Oct 08 20:47:55 crc kubenswrapper[4669]: I1008 20:47:55.550210 4669 generic.go:334] "Generic (PLEG): container finished" podID="e6c956ef-8cee-4729-aa58-d053415e8d1c" containerID="47b556a28be66294c1ba2ecdb1946f3249770d000201f8e8243e7c5d01314fb8" exitCode=0 Oct 08 20:47:55 crc kubenswrapper[4669]: I1008 20:47:55.550980 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4zv6v" Oct 08 20:47:55 crc kubenswrapper[4669]: I1008 20:47:55.551904 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4zv6v" event={"ID":"e6c956ef-8cee-4729-aa58-d053415e8d1c","Type":"ContainerDied","Data":"47b556a28be66294c1ba2ecdb1946f3249770d000201f8e8243e7c5d01314fb8"} Oct 08 20:47:55 crc kubenswrapper[4669]: I1008 20:47:55.551954 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4zv6v" event={"ID":"e6c956ef-8cee-4729-aa58-d053415e8d1c","Type":"ContainerDied","Data":"ccf0e6a9bfa12d3804d6517b4af9d03df21aa510167a52987ad72c36141b6c5d"} Oct 08 20:47:55 crc kubenswrapper[4669]: I1008 20:47:55.551976 4669 scope.go:117] "RemoveContainer" containerID="47b556a28be66294c1ba2ecdb1946f3249770d000201f8e8243e7c5d01314fb8" Oct 08 20:47:55 crc kubenswrapper[4669]: I1008 20:47:55.577144 4669 scope.go:117] "RemoveContainer" containerID="33978c9845708dcdb4bb9939b8397d34957770a7a18ece3aec80731060c3ef76" Oct 08 
20:47:55 crc kubenswrapper[4669]: I1008 20:47:55.599087 4669 scope.go:117] "RemoveContainer" containerID="9ef9d0c4820aa4a376b73d40843db9ce5269bef4071703b0ec49a7edc03c0542" Oct 08 20:47:55 crc kubenswrapper[4669]: I1008 20:47:55.623575 4669 scope.go:117] "RemoveContainer" containerID="47b556a28be66294c1ba2ecdb1946f3249770d000201f8e8243e7c5d01314fb8" Oct 08 20:47:55 crc kubenswrapper[4669]: E1008 20:47:55.624607 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47b556a28be66294c1ba2ecdb1946f3249770d000201f8e8243e7c5d01314fb8\": container with ID starting with 47b556a28be66294c1ba2ecdb1946f3249770d000201f8e8243e7c5d01314fb8 not found: ID does not exist" containerID="47b556a28be66294c1ba2ecdb1946f3249770d000201f8e8243e7c5d01314fb8" Oct 08 20:47:55 crc kubenswrapper[4669]: I1008 20:47:55.624646 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47b556a28be66294c1ba2ecdb1946f3249770d000201f8e8243e7c5d01314fb8"} err="failed to get container status \"47b556a28be66294c1ba2ecdb1946f3249770d000201f8e8243e7c5d01314fb8\": rpc error: code = NotFound desc = could not find container \"47b556a28be66294c1ba2ecdb1946f3249770d000201f8e8243e7c5d01314fb8\": container with ID starting with 47b556a28be66294c1ba2ecdb1946f3249770d000201f8e8243e7c5d01314fb8 not found: ID does not exist" Oct 08 20:47:55 crc kubenswrapper[4669]: I1008 20:47:55.624705 4669 scope.go:117] "RemoveContainer" containerID="33978c9845708dcdb4bb9939b8397d34957770a7a18ece3aec80731060c3ef76" Oct 08 20:47:55 crc kubenswrapper[4669]: I1008 20:47:55.625143 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpkwj\" (UniqueName: \"kubernetes.io/projected/e6c956ef-8cee-4729-aa58-d053415e8d1c-kube-api-access-hpkwj\") pod \"e6c956ef-8cee-4729-aa58-d053415e8d1c\" (UID: \"e6c956ef-8cee-4729-aa58-d053415e8d1c\") " Oct 08 20:47:55 crc kubenswrapper[4669]: 
I1008 20:47:55.625226 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6c956ef-8cee-4729-aa58-d053415e8d1c-catalog-content\") pod \"e6c956ef-8cee-4729-aa58-d053415e8d1c\" (UID: \"e6c956ef-8cee-4729-aa58-d053415e8d1c\") " Oct 08 20:47:55 crc kubenswrapper[4669]: I1008 20:47:55.625280 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6c956ef-8cee-4729-aa58-d053415e8d1c-utilities\") pod \"e6c956ef-8cee-4729-aa58-d053415e8d1c\" (UID: \"e6c956ef-8cee-4729-aa58-d053415e8d1c\") " Oct 08 20:47:55 crc kubenswrapper[4669]: I1008 20:47:55.626281 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6c956ef-8cee-4729-aa58-d053415e8d1c-utilities" (OuterVolumeSpecName: "utilities") pod "e6c956ef-8cee-4729-aa58-d053415e8d1c" (UID: "e6c956ef-8cee-4729-aa58-d053415e8d1c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 20:47:55 crc kubenswrapper[4669]: E1008 20:47:55.626877 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33978c9845708dcdb4bb9939b8397d34957770a7a18ece3aec80731060c3ef76\": container with ID starting with 33978c9845708dcdb4bb9939b8397d34957770a7a18ece3aec80731060c3ef76 not found: ID does not exist" containerID="33978c9845708dcdb4bb9939b8397d34957770a7a18ece3aec80731060c3ef76" Oct 08 20:47:55 crc kubenswrapper[4669]: I1008 20:47:55.626934 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33978c9845708dcdb4bb9939b8397d34957770a7a18ece3aec80731060c3ef76"} err="failed to get container status \"33978c9845708dcdb4bb9939b8397d34957770a7a18ece3aec80731060c3ef76\": rpc error: code = NotFound desc = could not find container \"33978c9845708dcdb4bb9939b8397d34957770a7a18ece3aec80731060c3ef76\": container with ID starting with 33978c9845708dcdb4bb9939b8397d34957770a7a18ece3aec80731060c3ef76 not found: ID does not exist" Oct 08 20:47:55 crc kubenswrapper[4669]: I1008 20:47:55.626973 4669 scope.go:117] "RemoveContainer" containerID="9ef9d0c4820aa4a376b73d40843db9ce5269bef4071703b0ec49a7edc03c0542" Oct 08 20:47:55 crc kubenswrapper[4669]: E1008 20:47:55.636733 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ef9d0c4820aa4a376b73d40843db9ce5269bef4071703b0ec49a7edc03c0542\": container with ID starting with 9ef9d0c4820aa4a376b73d40843db9ce5269bef4071703b0ec49a7edc03c0542 not found: ID does not exist" containerID="9ef9d0c4820aa4a376b73d40843db9ce5269bef4071703b0ec49a7edc03c0542" Oct 08 20:47:55 crc kubenswrapper[4669]: I1008 20:47:55.636802 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ef9d0c4820aa4a376b73d40843db9ce5269bef4071703b0ec49a7edc03c0542"} 
err="failed to get container status \"9ef9d0c4820aa4a376b73d40843db9ce5269bef4071703b0ec49a7edc03c0542\": rpc error: code = NotFound desc = could not find container \"9ef9d0c4820aa4a376b73d40843db9ce5269bef4071703b0ec49a7edc03c0542\": container with ID starting with 9ef9d0c4820aa4a376b73d40843db9ce5269bef4071703b0ec49a7edc03c0542 not found: ID does not exist" Oct 08 20:47:55 crc kubenswrapper[4669]: I1008 20:47:55.638322 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t7lxk"] Oct 08 20:47:55 crc kubenswrapper[4669]: I1008 20:47:55.638561 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-t7lxk" podUID="d5920535-6f50-4fd4-a0c6-2dd4943ad7d7" containerName="registry-server" containerID="cri-o://325dee95d6c07d70478f6fc2415c330961d50d342119b1a601a96ad8db113267" gracePeriod=2 Oct 08 20:47:55 crc kubenswrapper[4669]: I1008 20:47:55.643739 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6c956ef-8cee-4729-aa58-d053415e8d1c-kube-api-access-hpkwj" (OuterVolumeSpecName: "kube-api-access-hpkwj") pod "e6c956ef-8cee-4729-aa58-d053415e8d1c" (UID: "e6c956ef-8cee-4729-aa58-d053415e8d1c"). InnerVolumeSpecName "kube-api-access-hpkwj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:47:55 crc kubenswrapper[4669]: I1008 20:47:55.688041 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6c956ef-8cee-4729-aa58-d053415e8d1c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e6c956ef-8cee-4729-aa58-d053415e8d1c" (UID: "e6c956ef-8cee-4729-aa58-d053415e8d1c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 20:47:55 crc kubenswrapper[4669]: I1008 20:47:55.728062 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpkwj\" (UniqueName: \"kubernetes.io/projected/e6c956ef-8cee-4729-aa58-d053415e8d1c-kube-api-access-hpkwj\") on node \"crc\" DevicePath \"\"" Oct 08 20:47:55 crc kubenswrapper[4669]: I1008 20:47:55.728106 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6c956ef-8cee-4729-aa58-d053415e8d1c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 20:47:55 crc kubenswrapper[4669]: I1008 20:47:55.728120 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6c956ef-8cee-4729-aa58-d053415e8d1c-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 20:47:55 crc kubenswrapper[4669]: I1008 20:47:55.919214 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4zv6v"] Oct 08 20:47:55 crc kubenswrapper[4669]: I1008 20:47:55.922046 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4zv6v"] Oct 08 20:47:55 crc kubenswrapper[4669]: I1008 20:47:55.943759 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t7lxk" Oct 08 20:47:56 crc kubenswrapper[4669]: I1008 20:47:56.134201 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5920535-6f50-4fd4-a0c6-2dd4943ad7d7-catalog-content\") pod \"d5920535-6f50-4fd4-a0c6-2dd4943ad7d7\" (UID: \"d5920535-6f50-4fd4-a0c6-2dd4943ad7d7\") " Oct 08 20:47:56 crc kubenswrapper[4669]: I1008 20:47:56.134614 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5920535-6f50-4fd4-a0c6-2dd4943ad7d7-utilities\") pod \"d5920535-6f50-4fd4-a0c6-2dd4943ad7d7\" (UID: \"d5920535-6f50-4fd4-a0c6-2dd4943ad7d7\") " Oct 08 20:47:56 crc kubenswrapper[4669]: I1008 20:47:56.134761 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9k8rq\" (UniqueName: \"kubernetes.io/projected/d5920535-6f50-4fd4-a0c6-2dd4943ad7d7-kube-api-access-9k8rq\") pod \"d5920535-6f50-4fd4-a0c6-2dd4943ad7d7\" (UID: \"d5920535-6f50-4fd4-a0c6-2dd4943ad7d7\") " Oct 08 20:47:56 crc kubenswrapper[4669]: I1008 20:47:56.135731 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5920535-6f50-4fd4-a0c6-2dd4943ad7d7-utilities" (OuterVolumeSpecName: "utilities") pod "d5920535-6f50-4fd4-a0c6-2dd4943ad7d7" (UID: "d5920535-6f50-4fd4-a0c6-2dd4943ad7d7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 20:47:56 crc kubenswrapper[4669]: I1008 20:47:56.141099 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5920535-6f50-4fd4-a0c6-2dd4943ad7d7-kube-api-access-9k8rq" (OuterVolumeSpecName: "kube-api-access-9k8rq") pod "d5920535-6f50-4fd4-a0c6-2dd4943ad7d7" (UID: "d5920535-6f50-4fd4-a0c6-2dd4943ad7d7"). InnerVolumeSpecName "kube-api-access-9k8rq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:47:56 crc kubenswrapper[4669]: I1008 20:47:56.195752 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5920535-6f50-4fd4-a0c6-2dd4943ad7d7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d5920535-6f50-4fd4-a0c6-2dd4943ad7d7" (UID: "d5920535-6f50-4fd4-a0c6-2dd4943ad7d7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 20:47:56 crc kubenswrapper[4669]: I1008 20:47:56.235955 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5920535-6f50-4fd4-a0c6-2dd4943ad7d7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 20:47:56 crc kubenswrapper[4669]: I1008 20:47:56.235991 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5920535-6f50-4fd4-a0c6-2dd4943ad7d7-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 20:47:56 crc kubenswrapper[4669]: I1008 20:47:56.236004 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9k8rq\" (UniqueName: \"kubernetes.io/projected/d5920535-6f50-4fd4-a0c6-2dd4943ad7d7-kube-api-access-9k8rq\") on node \"crc\" DevicePath \"\"" Oct 08 20:47:56 crc kubenswrapper[4669]: I1008 20:47:56.569501 4669 generic.go:334] "Generic (PLEG): container finished" podID="d5920535-6f50-4fd4-a0c6-2dd4943ad7d7" containerID="325dee95d6c07d70478f6fc2415c330961d50d342119b1a601a96ad8db113267" exitCode=0 Oct 08 20:47:56 crc kubenswrapper[4669]: I1008 20:47:56.569635 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t7lxk" Oct 08 20:47:56 crc kubenswrapper[4669]: I1008 20:47:56.569628 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t7lxk" event={"ID":"d5920535-6f50-4fd4-a0c6-2dd4943ad7d7","Type":"ContainerDied","Data":"325dee95d6c07d70478f6fc2415c330961d50d342119b1a601a96ad8db113267"} Oct 08 20:47:56 crc kubenswrapper[4669]: I1008 20:47:56.569697 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t7lxk" event={"ID":"d5920535-6f50-4fd4-a0c6-2dd4943ad7d7","Type":"ContainerDied","Data":"1e41deff895cc5d481a34e7e17bc20101ffc56d3652271b8659f79e74f8505a0"} Oct 08 20:47:56 crc kubenswrapper[4669]: I1008 20:47:56.569725 4669 scope.go:117] "RemoveContainer" containerID="325dee95d6c07d70478f6fc2415c330961d50d342119b1a601a96ad8db113267" Oct 08 20:47:56 crc kubenswrapper[4669]: I1008 20:47:56.574088 4669 generic.go:334] "Generic (PLEG): container finished" podID="52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d" containerID="6b44741354f0c80b6a60f217cc9575726b13655bd75fecfc3bbe0fa8060e0345" exitCode=0 Oct 08 20:47:56 crc kubenswrapper[4669]: I1008 20:47:56.574167 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zfsm7" event={"ID":"52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d","Type":"ContainerDied","Data":"6b44741354f0c80b6a60f217cc9575726b13655bd75fecfc3bbe0fa8060e0345"} Oct 08 20:47:56 crc kubenswrapper[4669]: I1008 20:47:56.577088 4669 generic.go:334] "Generic (PLEG): container finished" podID="ad4dc180-c88e-4650-8676-0a65909d8abb" containerID="d5b0aa8510d2cdf8c92e2d5af221ae53355d2b7e30d2ef55fdcb31322ba76956" exitCode=0 Oct 08 20:47:56 crc kubenswrapper[4669]: I1008 20:47:56.577131 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f4nl8" 
event={"ID":"ad4dc180-c88e-4650-8676-0a65909d8abb","Type":"ContainerDied","Data":"d5b0aa8510d2cdf8c92e2d5af221ae53355d2b7e30d2ef55fdcb31322ba76956"} Oct 08 20:47:56 crc kubenswrapper[4669]: I1008 20:47:56.596935 4669 scope.go:117] "RemoveContainer" containerID="54a95a16d76b3096fdf299e8e8eb69906f6d5be02cdb3d8404f4947af6931748" Oct 08 20:47:56 crc kubenswrapper[4669]: I1008 20:47:56.630907 4669 scope.go:117] "RemoveContainer" containerID="b8ce296cae82787a988960c1900587f01c2a94a10c8024f3ab146185db7fff5b" Oct 08 20:47:56 crc kubenswrapper[4669]: I1008 20:47:56.635020 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t7lxk"] Oct 08 20:47:56 crc kubenswrapper[4669]: I1008 20:47:56.639086 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-t7lxk"] Oct 08 20:47:56 crc kubenswrapper[4669]: I1008 20:47:56.650746 4669 scope.go:117] "RemoveContainer" containerID="325dee95d6c07d70478f6fc2415c330961d50d342119b1a601a96ad8db113267" Oct 08 20:47:56 crc kubenswrapper[4669]: E1008 20:47:56.652893 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"325dee95d6c07d70478f6fc2415c330961d50d342119b1a601a96ad8db113267\": container with ID starting with 325dee95d6c07d70478f6fc2415c330961d50d342119b1a601a96ad8db113267 not found: ID does not exist" containerID="325dee95d6c07d70478f6fc2415c330961d50d342119b1a601a96ad8db113267" Oct 08 20:47:56 crc kubenswrapper[4669]: I1008 20:47:56.652927 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"325dee95d6c07d70478f6fc2415c330961d50d342119b1a601a96ad8db113267"} err="failed to get container status \"325dee95d6c07d70478f6fc2415c330961d50d342119b1a601a96ad8db113267\": rpc error: code = NotFound desc = could not find container \"325dee95d6c07d70478f6fc2415c330961d50d342119b1a601a96ad8db113267\": container with ID starting with 
325dee95d6c07d70478f6fc2415c330961d50d342119b1a601a96ad8db113267 not found: ID does not exist" Oct 08 20:47:56 crc kubenswrapper[4669]: I1008 20:47:56.652949 4669 scope.go:117] "RemoveContainer" containerID="54a95a16d76b3096fdf299e8e8eb69906f6d5be02cdb3d8404f4947af6931748" Oct 08 20:47:56 crc kubenswrapper[4669]: E1008 20:47:56.653227 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54a95a16d76b3096fdf299e8e8eb69906f6d5be02cdb3d8404f4947af6931748\": container with ID starting with 54a95a16d76b3096fdf299e8e8eb69906f6d5be02cdb3d8404f4947af6931748 not found: ID does not exist" containerID="54a95a16d76b3096fdf299e8e8eb69906f6d5be02cdb3d8404f4947af6931748" Oct 08 20:47:56 crc kubenswrapper[4669]: I1008 20:47:56.653248 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54a95a16d76b3096fdf299e8e8eb69906f6d5be02cdb3d8404f4947af6931748"} err="failed to get container status \"54a95a16d76b3096fdf299e8e8eb69906f6d5be02cdb3d8404f4947af6931748\": rpc error: code = NotFound desc = could not find container \"54a95a16d76b3096fdf299e8e8eb69906f6d5be02cdb3d8404f4947af6931748\": container with ID starting with 54a95a16d76b3096fdf299e8e8eb69906f6d5be02cdb3d8404f4947af6931748 not found: ID does not exist" Oct 08 20:47:56 crc kubenswrapper[4669]: I1008 20:47:56.653262 4669 scope.go:117] "RemoveContainer" containerID="b8ce296cae82787a988960c1900587f01c2a94a10c8024f3ab146185db7fff5b" Oct 08 20:47:56 crc kubenswrapper[4669]: E1008 20:47:56.653461 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8ce296cae82787a988960c1900587f01c2a94a10c8024f3ab146185db7fff5b\": container with ID starting with b8ce296cae82787a988960c1900587f01c2a94a10c8024f3ab146185db7fff5b not found: ID does not exist" containerID="b8ce296cae82787a988960c1900587f01c2a94a10c8024f3ab146185db7fff5b" Oct 08 20:47:56 crc 
kubenswrapper[4669]: I1008 20:47:56.653487 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8ce296cae82787a988960c1900587f01c2a94a10c8024f3ab146185db7fff5b"} err="failed to get container status \"b8ce296cae82787a988960c1900587f01c2a94a10c8024f3ab146185db7fff5b\": rpc error: code = NotFound desc = could not find container \"b8ce296cae82787a988960c1900587f01c2a94a10c8024f3ab146185db7fff5b\": container with ID starting with b8ce296cae82787a988960c1900587f01c2a94a10c8024f3ab146185db7fff5b not found: ID does not exist" Oct 08 20:47:57 crc kubenswrapper[4669]: I1008 20:47:57.338317 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5920535-6f50-4fd4-a0c6-2dd4943ad7d7" path="/var/lib/kubelet/pods/d5920535-6f50-4fd4-a0c6-2dd4943ad7d7/volumes" Oct 08 20:47:57 crc kubenswrapper[4669]: I1008 20:47:57.339728 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6c956ef-8cee-4729-aa58-d053415e8d1c" path="/var/lib/kubelet/pods/e6c956ef-8cee-4729-aa58-d053415e8d1c/volumes" Oct 08 20:47:57 crc kubenswrapper[4669]: I1008 20:47:57.583837 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zfsm7" event={"ID":"52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d","Type":"ContainerStarted","Data":"2c30766b7646690aaee1c35a5fdecd3a86d3887bbdbade6e11b7359d297578dd"} Oct 08 20:47:57 crc kubenswrapper[4669]: I1008 20:47:57.587105 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f4nl8" event={"ID":"ad4dc180-c88e-4650-8676-0a65909d8abb","Type":"ContainerStarted","Data":"cd0b74bd0d78e39dca0ff497f3219ae2eefba188ca3b89ce333b5cb6ea39bd01"} Oct 08 20:47:57 crc kubenswrapper[4669]: I1008 20:47:57.600606 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zfsm7" podStartSLOduration=1.579091852 podStartE2EDuration="54.60059236s" 
podCreationTimestamp="2025-10-08 20:47:03 +0000 UTC" firstStartedPulling="2025-10-08 20:47:04.038745323 +0000 UTC m=+143.731556006" lastFinishedPulling="2025-10-08 20:47:57.060245841 +0000 UTC m=+196.753056514" observedRunningTime="2025-10-08 20:47:57.599546111 +0000 UTC m=+197.292356794" watchObservedRunningTime="2025-10-08 20:47:57.60059236 +0000 UTC m=+197.293403033" Oct 08 20:47:57 crc kubenswrapper[4669]: I1008 20:47:57.623076 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-f4nl8" podStartSLOduration=2.646985432 podStartE2EDuration="54.623057405s" podCreationTimestamp="2025-10-08 20:47:03 +0000 UTC" firstStartedPulling="2025-10-08 20:47:05.053314082 +0000 UTC m=+144.746124755" lastFinishedPulling="2025-10-08 20:47:57.029386055 +0000 UTC m=+196.722196728" observedRunningTime="2025-10-08 20:47:57.619333814 +0000 UTC m=+197.312144487" watchObservedRunningTime="2025-10-08 20:47:57.623057405 +0000 UTC m=+197.315868078" Oct 08 20:47:58 crc kubenswrapper[4669]: I1008 20:47:58.035228 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-22z5d"] Oct 08 20:47:58 crc kubenswrapper[4669]: I1008 20:47:58.036577 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-22z5d" podUID="747cdaf9-bde5-4baa-8b58-1e1e9e882ddd" containerName="registry-server" containerID="cri-o://04c584d30eeaf412d005a9516d617dd72422fa4a26d38ef53dc53dfac56ed19c" gracePeriod=2 Oct 08 20:47:58 crc kubenswrapper[4669]: I1008 20:47:58.376118 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-22z5d" Oct 08 20:47:58 crc kubenswrapper[4669]: I1008 20:47:58.563763 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/747cdaf9-bde5-4baa-8b58-1e1e9e882ddd-utilities\") pod \"747cdaf9-bde5-4baa-8b58-1e1e9e882ddd\" (UID: \"747cdaf9-bde5-4baa-8b58-1e1e9e882ddd\") " Oct 08 20:47:58 crc kubenswrapper[4669]: I1008 20:47:58.563859 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vmq5\" (UniqueName: \"kubernetes.io/projected/747cdaf9-bde5-4baa-8b58-1e1e9e882ddd-kube-api-access-8vmq5\") pod \"747cdaf9-bde5-4baa-8b58-1e1e9e882ddd\" (UID: \"747cdaf9-bde5-4baa-8b58-1e1e9e882ddd\") " Oct 08 20:47:58 crc kubenswrapper[4669]: I1008 20:47:58.563895 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/747cdaf9-bde5-4baa-8b58-1e1e9e882ddd-catalog-content\") pod \"747cdaf9-bde5-4baa-8b58-1e1e9e882ddd\" (UID: \"747cdaf9-bde5-4baa-8b58-1e1e9e882ddd\") " Oct 08 20:47:58 crc kubenswrapper[4669]: I1008 20:47:58.564766 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/747cdaf9-bde5-4baa-8b58-1e1e9e882ddd-utilities" (OuterVolumeSpecName: "utilities") pod "747cdaf9-bde5-4baa-8b58-1e1e9e882ddd" (UID: "747cdaf9-bde5-4baa-8b58-1e1e9e882ddd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 20:47:58 crc kubenswrapper[4669]: I1008 20:47:58.570795 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/747cdaf9-bde5-4baa-8b58-1e1e9e882ddd-kube-api-access-8vmq5" (OuterVolumeSpecName: "kube-api-access-8vmq5") pod "747cdaf9-bde5-4baa-8b58-1e1e9e882ddd" (UID: "747cdaf9-bde5-4baa-8b58-1e1e9e882ddd"). InnerVolumeSpecName "kube-api-access-8vmq5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:47:58 crc kubenswrapper[4669]: I1008 20:47:58.593302 4669 generic.go:334] "Generic (PLEG): container finished" podID="747cdaf9-bde5-4baa-8b58-1e1e9e882ddd" containerID="04c584d30eeaf412d005a9516d617dd72422fa4a26d38ef53dc53dfac56ed19c" exitCode=0 Oct 08 20:47:58 crc kubenswrapper[4669]: I1008 20:47:58.593345 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-22z5d" event={"ID":"747cdaf9-bde5-4baa-8b58-1e1e9e882ddd","Type":"ContainerDied","Data":"04c584d30eeaf412d005a9516d617dd72422fa4a26d38ef53dc53dfac56ed19c"} Oct 08 20:47:58 crc kubenswrapper[4669]: I1008 20:47:58.593371 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-22z5d" event={"ID":"747cdaf9-bde5-4baa-8b58-1e1e9e882ddd","Type":"ContainerDied","Data":"71a00f49d33b3edf908f1a0e64c861d8eae3005e2d02a049cf700f93eb35b848"} Oct 08 20:47:58 crc kubenswrapper[4669]: I1008 20:47:58.593385 4669 scope.go:117] "RemoveContainer" containerID="04c584d30eeaf412d005a9516d617dd72422fa4a26d38ef53dc53dfac56ed19c" Oct 08 20:47:58 crc kubenswrapper[4669]: I1008 20:47:58.593380 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-22z5d" Oct 08 20:47:58 crc kubenswrapper[4669]: I1008 20:47:58.614918 4669 scope.go:117] "RemoveContainer" containerID="209c43ef2d843d10eaf9d817c33e26ae5d16e950392fdef21f73e37913c0b8a4" Oct 08 20:47:58 crc kubenswrapper[4669]: I1008 20:47:58.633777 4669 scope.go:117] "RemoveContainer" containerID="8b7f915a5314f221a7112fbd5f0d88d02be21a21803cc8d04376011ef700df0b" Oct 08 20:47:58 crc kubenswrapper[4669]: I1008 20:47:58.640955 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/747cdaf9-bde5-4baa-8b58-1e1e9e882ddd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "747cdaf9-bde5-4baa-8b58-1e1e9e882ddd" (UID: "747cdaf9-bde5-4baa-8b58-1e1e9e882ddd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 20:47:58 crc kubenswrapper[4669]: I1008 20:47:58.656481 4669 scope.go:117] "RemoveContainer" containerID="04c584d30eeaf412d005a9516d617dd72422fa4a26d38ef53dc53dfac56ed19c" Oct 08 20:47:58 crc kubenswrapper[4669]: E1008 20:47:58.657022 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04c584d30eeaf412d005a9516d617dd72422fa4a26d38ef53dc53dfac56ed19c\": container with ID starting with 04c584d30eeaf412d005a9516d617dd72422fa4a26d38ef53dc53dfac56ed19c not found: ID does not exist" containerID="04c584d30eeaf412d005a9516d617dd72422fa4a26d38ef53dc53dfac56ed19c" Oct 08 20:47:58 crc kubenswrapper[4669]: I1008 20:47:58.657058 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04c584d30eeaf412d005a9516d617dd72422fa4a26d38ef53dc53dfac56ed19c"} err="failed to get container status \"04c584d30eeaf412d005a9516d617dd72422fa4a26d38ef53dc53dfac56ed19c\": rpc error: code = NotFound desc = could not find container \"04c584d30eeaf412d005a9516d617dd72422fa4a26d38ef53dc53dfac56ed19c\": container 
with ID starting with 04c584d30eeaf412d005a9516d617dd72422fa4a26d38ef53dc53dfac56ed19c not found: ID does not exist" Oct 08 20:47:58 crc kubenswrapper[4669]: I1008 20:47:58.657086 4669 scope.go:117] "RemoveContainer" containerID="209c43ef2d843d10eaf9d817c33e26ae5d16e950392fdef21f73e37913c0b8a4" Oct 08 20:47:58 crc kubenswrapper[4669]: E1008 20:47:58.657550 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"209c43ef2d843d10eaf9d817c33e26ae5d16e950392fdef21f73e37913c0b8a4\": container with ID starting with 209c43ef2d843d10eaf9d817c33e26ae5d16e950392fdef21f73e37913c0b8a4 not found: ID does not exist" containerID="209c43ef2d843d10eaf9d817c33e26ae5d16e950392fdef21f73e37913c0b8a4" Oct 08 20:47:58 crc kubenswrapper[4669]: I1008 20:47:58.657596 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"209c43ef2d843d10eaf9d817c33e26ae5d16e950392fdef21f73e37913c0b8a4"} err="failed to get container status \"209c43ef2d843d10eaf9d817c33e26ae5d16e950392fdef21f73e37913c0b8a4\": rpc error: code = NotFound desc = could not find container \"209c43ef2d843d10eaf9d817c33e26ae5d16e950392fdef21f73e37913c0b8a4\": container with ID starting with 209c43ef2d843d10eaf9d817c33e26ae5d16e950392fdef21f73e37913c0b8a4 not found: ID does not exist" Oct 08 20:47:58 crc kubenswrapper[4669]: I1008 20:47:58.657625 4669 scope.go:117] "RemoveContainer" containerID="8b7f915a5314f221a7112fbd5f0d88d02be21a21803cc8d04376011ef700df0b" Oct 08 20:47:58 crc kubenswrapper[4669]: E1008 20:47:58.657920 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b7f915a5314f221a7112fbd5f0d88d02be21a21803cc8d04376011ef700df0b\": container with ID starting with 8b7f915a5314f221a7112fbd5f0d88d02be21a21803cc8d04376011ef700df0b not found: ID does not exist" containerID="8b7f915a5314f221a7112fbd5f0d88d02be21a21803cc8d04376011ef700df0b" 
Oct 08 20:47:58 crc kubenswrapper[4669]: I1008 20:47:58.657941 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b7f915a5314f221a7112fbd5f0d88d02be21a21803cc8d04376011ef700df0b"} err="failed to get container status \"8b7f915a5314f221a7112fbd5f0d88d02be21a21803cc8d04376011ef700df0b\": rpc error: code = NotFound desc = could not find container \"8b7f915a5314f221a7112fbd5f0d88d02be21a21803cc8d04376011ef700df0b\": container with ID starting with 8b7f915a5314f221a7112fbd5f0d88d02be21a21803cc8d04376011ef700df0b not found: ID does not exist" Oct 08 20:47:58 crc kubenswrapper[4669]: I1008 20:47:58.665520 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vmq5\" (UniqueName: \"kubernetes.io/projected/747cdaf9-bde5-4baa-8b58-1e1e9e882ddd-kube-api-access-8vmq5\") on node \"crc\" DevicePath \"\"" Oct 08 20:47:58 crc kubenswrapper[4669]: I1008 20:47:58.665580 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/747cdaf9-bde5-4baa-8b58-1e1e9e882ddd-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 20:47:58 crc kubenswrapper[4669]: I1008 20:47:58.665596 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/747cdaf9-bde5-4baa-8b58-1e1e9e882ddd-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 20:47:58 crc kubenswrapper[4669]: I1008 20:47:58.925920 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-22z5d"] Oct 08 20:47:58 crc kubenswrapper[4669]: I1008 20:47:58.928196 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-22z5d"] Oct 08 20:47:59 crc kubenswrapper[4669]: I1008 20:47:59.337756 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="747cdaf9-bde5-4baa-8b58-1e1e9e882ddd" path="/var/lib/kubelet/pods/747cdaf9-bde5-4baa-8b58-1e1e9e882ddd/volumes" Oct 08 
20:48:03 crc kubenswrapper[4669]: I1008 20:48:03.635229 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zfsm7" Oct 08 20:48:03 crc kubenswrapper[4669]: I1008 20:48:03.635814 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zfsm7" Oct 08 20:48:03 crc kubenswrapper[4669]: I1008 20:48:03.674133 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zfsm7" Oct 08 20:48:04 crc kubenswrapper[4669]: I1008 20:48:04.048725 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-f4nl8" Oct 08 20:48:04 crc kubenswrapper[4669]: I1008 20:48:04.048869 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-f4nl8" Oct 08 20:48:04 crc kubenswrapper[4669]: I1008 20:48:04.094037 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-f4nl8" Oct 08 20:48:04 crc kubenswrapper[4669]: I1008 20:48:04.658490 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-f4nl8" Oct 08 20:48:04 crc kubenswrapper[4669]: I1008 20:48:04.659006 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zfsm7" Oct 08 20:48:05 crc kubenswrapper[4669]: I1008 20:48:05.241749 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f4nl8"] Oct 08 20:48:06 crc kubenswrapper[4669]: I1008 20:48:06.633831 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-f4nl8" podUID="ad4dc180-c88e-4650-8676-0a65909d8abb" containerName="registry-server" 
containerID="cri-o://cd0b74bd0d78e39dca0ff497f3219ae2eefba188ca3b89ce333b5cb6ea39bd01" gracePeriod=2 Oct 08 20:48:06 crc kubenswrapper[4669]: I1008 20:48:06.982261 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f4nl8" Oct 08 20:48:07 crc kubenswrapper[4669]: I1008 20:48:07.075857 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad4dc180-c88e-4650-8676-0a65909d8abb-catalog-content\") pod \"ad4dc180-c88e-4650-8676-0a65909d8abb\" (UID: \"ad4dc180-c88e-4650-8676-0a65909d8abb\") " Oct 08 20:48:07 crc kubenswrapper[4669]: I1008 20:48:07.075909 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad4dc180-c88e-4650-8676-0a65909d8abb-utilities\") pod \"ad4dc180-c88e-4650-8676-0a65909d8abb\" (UID: \"ad4dc180-c88e-4650-8676-0a65909d8abb\") " Oct 08 20:48:07 crc kubenswrapper[4669]: I1008 20:48:07.075968 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxh64\" (UniqueName: \"kubernetes.io/projected/ad4dc180-c88e-4650-8676-0a65909d8abb-kube-api-access-dxh64\") pod \"ad4dc180-c88e-4650-8676-0a65909d8abb\" (UID: \"ad4dc180-c88e-4650-8676-0a65909d8abb\") " Oct 08 20:48:07 crc kubenswrapper[4669]: I1008 20:48:07.076800 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad4dc180-c88e-4650-8676-0a65909d8abb-utilities" (OuterVolumeSpecName: "utilities") pod "ad4dc180-c88e-4650-8676-0a65909d8abb" (UID: "ad4dc180-c88e-4650-8676-0a65909d8abb"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 20:48:07 crc kubenswrapper[4669]: I1008 20:48:07.080125 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad4dc180-c88e-4650-8676-0a65909d8abb-kube-api-access-dxh64" (OuterVolumeSpecName: "kube-api-access-dxh64") pod "ad4dc180-c88e-4650-8676-0a65909d8abb" (UID: "ad4dc180-c88e-4650-8676-0a65909d8abb"). InnerVolumeSpecName "kube-api-access-dxh64". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:48:07 crc kubenswrapper[4669]: I1008 20:48:07.087895 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad4dc180-c88e-4650-8676-0a65909d8abb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ad4dc180-c88e-4650-8676-0a65909d8abb" (UID: "ad4dc180-c88e-4650-8676-0a65909d8abb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 20:48:07 crc kubenswrapper[4669]: I1008 20:48:07.176851 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxh64\" (UniqueName: \"kubernetes.io/projected/ad4dc180-c88e-4650-8676-0a65909d8abb-kube-api-access-dxh64\") on node \"crc\" DevicePath \"\"" Oct 08 20:48:07 crc kubenswrapper[4669]: I1008 20:48:07.176892 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad4dc180-c88e-4650-8676-0a65909d8abb-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 20:48:07 crc kubenswrapper[4669]: I1008 20:48:07.176906 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad4dc180-c88e-4650-8676-0a65909d8abb-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 20:48:07 crc kubenswrapper[4669]: I1008 20:48:07.640819 4669 generic.go:334] "Generic (PLEG): container finished" podID="ad4dc180-c88e-4650-8676-0a65909d8abb" 
containerID="cd0b74bd0d78e39dca0ff497f3219ae2eefba188ca3b89ce333b5cb6ea39bd01" exitCode=0 Oct 08 20:48:07 crc kubenswrapper[4669]: I1008 20:48:07.640888 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f4nl8" Oct 08 20:48:07 crc kubenswrapper[4669]: I1008 20:48:07.640924 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f4nl8" event={"ID":"ad4dc180-c88e-4650-8676-0a65909d8abb","Type":"ContainerDied","Data":"cd0b74bd0d78e39dca0ff497f3219ae2eefba188ca3b89ce333b5cb6ea39bd01"} Oct 08 20:48:07 crc kubenswrapper[4669]: I1008 20:48:07.640993 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f4nl8" event={"ID":"ad4dc180-c88e-4650-8676-0a65909d8abb","Type":"ContainerDied","Data":"029a40de032fb83491d59044945cd6851ad94bc337391c07fa4d8d5d5dbe6942"} Oct 08 20:48:07 crc kubenswrapper[4669]: I1008 20:48:07.641018 4669 scope.go:117] "RemoveContainer" containerID="cd0b74bd0d78e39dca0ff497f3219ae2eefba188ca3b89ce333b5cb6ea39bd01" Oct 08 20:48:07 crc kubenswrapper[4669]: I1008 20:48:07.656968 4669 scope.go:117] "RemoveContainer" containerID="d5b0aa8510d2cdf8c92e2d5af221ae53355d2b7e30d2ef55fdcb31322ba76956" Oct 08 20:48:07 crc kubenswrapper[4669]: I1008 20:48:07.675618 4669 scope.go:117] "RemoveContainer" containerID="fb7fb73b2cebc5215ed2255010f79e26cfd6447f00a334c240a2a0bab4708b55" Oct 08 20:48:07 crc kubenswrapper[4669]: I1008 20:48:07.676218 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f4nl8"] Oct 08 20:48:07 crc kubenswrapper[4669]: I1008 20:48:07.679324 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-f4nl8"] Oct 08 20:48:07 crc kubenswrapper[4669]: I1008 20:48:07.692190 4669 scope.go:117] "RemoveContainer" containerID="cd0b74bd0d78e39dca0ff497f3219ae2eefba188ca3b89ce333b5cb6ea39bd01" Oct 08 
20:48:07 crc kubenswrapper[4669]: E1008 20:48:07.692666 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd0b74bd0d78e39dca0ff497f3219ae2eefba188ca3b89ce333b5cb6ea39bd01\": container with ID starting with cd0b74bd0d78e39dca0ff497f3219ae2eefba188ca3b89ce333b5cb6ea39bd01 not found: ID does not exist" containerID="cd0b74bd0d78e39dca0ff497f3219ae2eefba188ca3b89ce333b5cb6ea39bd01" Oct 08 20:48:07 crc kubenswrapper[4669]: I1008 20:48:07.692708 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd0b74bd0d78e39dca0ff497f3219ae2eefba188ca3b89ce333b5cb6ea39bd01"} err="failed to get container status \"cd0b74bd0d78e39dca0ff497f3219ae2eefba188ca3b89ce333b5cb6ea39bd01\": rpc error: code = NotFound desc = could not find container \"cd0b74bd0d78e39dca0ff497f3219ae2eefba188ca3b89ce333b5cb6ea39bd01\": container with ID starting with cd0b74bd0d78e39dca0ff497f3219ae2eefba188ca3b89ce333b5cb6ea39bd01 not found: ID does not exist" Oct 08 20:48:07 crc kubenswrapper[4669]: I1008 20:48:07.692737 4669 scope.go:117] "RemoveContainer" containerID="d5b0aa8510d2cdf8c92e2d5af221ae53355d2b7e30d2ef55fdcb31322ba76956" Oct 08 20:48:07 crc kubenswrapper[4669]: E1008 20:48:07.693109 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5b0aa8510d2cdf8c92e2d5af221ae53355d2b7e30d2ef55fdcb31322ba76956\": container with ID starting with d5b0aa8510d2cdf8c92e2d5af221ae53355d2b7e30d2ef55fdcb31322ba76956 not found: ID does not exist" containerID="d5b0aa8510d2cdf8c92e2d5af221ae53355d2b7e30d2ef55fdcb31322ba76956" Oct 08 20:48:07 crc kubenswrapper[4669]: I1008 20:48:07.693139 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5b0aa8510d2cdf8c92e2d5af221ae53355d2b7e30d2ef55fdcb31322ba76956"} err="failed to get container status 
\"d5b0aa8510d2cdf8c92e2d5af221ae53355d2b7e30d2ef55fdcb31322ba76956\": rpc error: code = NotFound desc = could not find container \"d5b0aa8510d2cdf8c92e2d5af221ae53355d2b7e30d2ef55fdcb31322ba76956\": container with ID starting with d5b0aa8510d2cdf8c92e2d5af221ae53355d2b7e30d2ef55fdcb31322ba76956 not found: ID does not exist" Oct 08 20:48:07 crc kubenswrapper[4669]: I1008 20:48:07.693153 4669 scope.go:117] "RemoveContainer" containerID="fb7fb73b2cebc5215ed2255010f79e26cfd6447f00a334c240a2a0bab4708b55" Oct 08 20:48:07 crc kubenswrapper[4669]: E1008 20:48:07.693416 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb7fb73b2cebc5215ed2255010f79e26cfd6447f00a334c240a2a0bab4708b55\": container with ID starting with fb7fb73b2cebc5215ed2255010f79e26cfd6447f00a334c240a2a0bab4708b55 not found: ID does not exist" containerID="fb7fb73b2cebc5215ed2255010f79e26cfd6447f00a334c240a2a0bab4708b55" Oct 08 20:48:07 crc kubenswrapper[4669]: I1008 20:48:07.693446 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb7fb73b2cebc5215ed2255010f79e26cfd6447f00a334c240a2a0bab4708b55"} err="failed to get container status \"fb7fb73b2cebc5215ed2255010f79e26cfd6447f00a334c240a2a0bab4708b55\": rpc error: code = NotFound desc = could not find container \"fb7fb73b2cebc5215ed2255010f79e26cfd6447f00a334c240a2a0bab4708b55\": container with ID starting with fb7fb73b2cebc5215ed2255010f79e26cfd6447f00a334c240a2a0bab4708b55 not found: ID does not exist" Oct 08 20:48:09 crc kubenswrapper[4669]: I1008 20:48:09.337261 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad4dc180-c88e-4650-8676-0a65909d8abb" path="/var/lib/kubelet/pods/ad4dc180-c88e-4650-8676-0a65909d8abb/volumes" Oct 08 20:48:10 crc kubenswrapper[4669]: I1008 20:48:10.853481 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-t9x9z"] Oct 08 
20:48:10 crc kubenswrapper[4669]: E1008 20:48:10.853952 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6c956ef-8cee-4729-aa58-d053415e8d1c" containerName="extract-content" Oct 08 20:48:10 crc kubenswrapper[4669]: I1008 20:48:10.853967 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6c956ef-8cee-4729-aa58-d053415e8d1c" containerName="extract-content" Oct 08 20:48:10 crc kubenswrapper[4669]: E1008 20:48:10.853984 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6c956ef-8cee-4729-aa58-d053415e8d1c" containerName="extract-utilities" Oct 08 20:48:10 crc kubenswrapper[4669]: I1008 20:48:10.853992 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6c956ef-8cee-4729-aa58-d053415e8d1c" containerName="extract-utilities" Oct 08 20:48:10 crc kubenswrapper[4669]: E1008 20:48:10.854005 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6c956ef-8cee-4729-aa58-d053415e8d1c" containerName="registry-server" Oct 08 20:48:10 crc kubenswrapper[4669]: I1008 20:48:10.854013 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6c956ef-8cee-4729-aa58-d053415e8d1c" containerName="registry-server" Oct 08 20:48:10 crc kubenswrapper[4669]: E1008 20:48:10.854027 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad4dc180-c88e-4650-8676-0a65909d8abb" containerName="extract-content" Oct 08 20:48:10 crc kubenswrapper[4669]: I1008 20:48:10.854035 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad4dc180-c88e-4650-8676-0a65909d8abb" containerName="extract-content" Oct 08 20:48:10 crc kubenswrapper[4669]: E1008 20:48:10.854047 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad4dc180-c88e-4650-8676-0a65909d8abb" containerName="registry-server" Oct 08 20:48:10 crc kubenswrapper[4669]: I1008 20:48:10.854055 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad4dc180-c88e-4650-8676-0a65909d8abb" containerName="registry-server" Oct 08 20:48:10 
crc kubenswrapper[4669]: E1008 20:48:10.854073 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4f39fae-0b71-4705-ba1d-712eff8ff21f" containerName="pruner" Oct 08 20:48:10 crc kubenswrapper[4669]: I1008 20:48:10.854081 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4f39fae-0b71-4705-ba1d-712eff8ff21f" containerName="pruner" Oct 08 20:48:10 crc kubenswrapper[4669]: E1008 20:48:10.854094 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5920535-6f50-4fd4-a0c6-2dd4943ad7d7" containerName="extract-content" Oct 08 20:48:10 crc kubenswrapper[4669]: I1008 20:48:10.854101 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5920535-6f50-4fd4-a0c6-2dd4943ad7d7" containerName="extract-content" Oct 08 20:48:10 crc kubenswrapper[4669]: E1008 20:48:10.854113 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="747cdaf9-bde5-4baa-8b58-1e1e9e882ddd" containerName="registry-server" Oct 08 20:48:10 crc kubenswrapper[4669]: I1008 20:48:10.854121 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="747cdaf9-bde5-4baa-8b58-1e1e9e882ddd" containerName="registry-server" Oct 08 20:48:10 crc kubenswrapper[4669]: E1008 20:48:10.854132 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a78d1dcb-341a-40b0-a96d-9ee5af65a2fe" containerName="collect-profiles" Oct 08 20:48:10 crc kubenswrapper[4669]: I1008 20:48:10.854140 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="a78d1dcb-341a-40b0-a96d-9ee5af65a2fe" containerName="collect-profiles" Oct 08 20:48:10 crc kubenswrapper[4669]: E1008 20:48:10.854150 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="747cdaf9-bde5-4baa-8b58-1e1e9e882ddd" containerName="extract-content" Oct 08 20:48:10 crc kubenswrapper[4669]: I1008 20:48:10.854158 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="747cdaf9-bde5-4baa-8b58-1e1e9e882ddd" containerName="extract-content" Oct 08 20:48:10 crc kubenswrapper[4669]: E1008 
20:48:10.854166 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5920535-6f50-4fd4-a0c6-2dd4943ad7d7" containerName="registry-server" Oct 08 20:48:10 crc kubenswrapper[4669]: I1008 20:48:10.854175 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5920535-6f50-4fd4-a0c6-2dd4943ad7d7" containerName="registry-server" Oct 08 20:48:10 crc kubenswrapper[4669]: E1008 20:48:10.854186 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5920535-6f50-4fd4-a0c6-2dd4943ad7d7" containerName="extract-utilities" Oct 08 20:48:10 crc kubenswrapper[4669]: I1008 20:48:10.854194 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5920535-6f50-4fd4-a0c6-2dd4943ad7d7" containerName="extract-utilities" Oct 08 20:48:10 crc kubenswrapper[4669]: E1008 20:48:10.854206 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad4dc180-c88e-4650-8676-0a65909d8abb" containerName="extract-utilities" Oct 08 20:48:10 crc kubenswrapper[4669]: I1008 20:48:10.854213 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad4dc180-c88e-4650-8676-0a65909d8abb" containerName="extract-utilities" Oct 08 20:48:10 crc kubenswrapper[4669]: E1008 20:48:10.854222 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58d32a06-7489-43f3-8d30-8e47d19da27e" containerName="pruner" Oct 08 20:48:10 crc kubenswrapper[4669]: I1008 20:48:10.854230 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="58d32a06-7489-43f3-8d30-8e47d19da27e" containerName="pruner" Oct 08 20:48:10 crc kubenswrapper[4669]: E1008 20:48:10.854240 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="747cdaf9-bde5-4baa-8b58-1e1e9e882ddd" containerName="extract-utilities" Oct 08 20:48:10 crc kubenswrapper[4669]: I1008 20:48:10.854248 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="747cdaf9-bde5-4baa-8b58-1e1e9e882ddd" containerName="extract-utilities" Oct 08 20:48:10 crc kubenswrapper[4669]: I1008 20:48:10.854368 4669 
memory_manager.go:354] "RemoveStaleState removing state" podUID="a78d1dcb-341a-40b0-a96d-9ee5af65a2fe" containerName="collect-profiles" Oct 08 20:48:10 crc kubenswrapper[4669]: I1008 20:48:10.854384 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad4dc180-c88e-4650-8676-0a65909d8abb" containerName="registry-server" Oct 08 20:48:10 crc kubenswrapper[4669]: I1008 20:48:10.854399 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5920535-6f50-4fd4-a0c6-2dd4943ad7d7" containerName="registry-server" Oct 08 20:48:10 crc kubenswrapper[4669]: I1008 20:48:10.854410 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6c956ef-8cee-4729-aa58-d053415e8d1c" containerName="registry-server" Oct 08 20:48:10 crc kubenswrapper[4669]: I1008 20:48:10.854418 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="58d32a06-7489-43f3-8d30-8e47d19da27e" containerName="pruner" Oct 08 20:48:10 crc kubenswrapper[4669]: I1008 20:48:10.854428 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="747cdaf9-bde5-4baa-8b58-1e1e9e882ddd" containerName="registry-server" Oct 08 20:48:10 crc kubenswrapper[4669]: I1008 20:48:10.854441 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4f39fae-0b71-4705-ba1d-712eff8ff21f" containerName="pruner" Oct 08 20:48:10 crc kubenswrapper[4669]: I1008 20:48:10.854920 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-t9x9z" Oct 08 20:48:10 crc kubenswrapper[4669]: I1008 20:48:10.872110 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-t9x9z"] Oct 08 20:48:11 crc kubenswrapper[4669]: I1008 20:48:11.017551 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-t9x9z\" (UID: \"bea5f5b3-ab35-4a4e-b9a7-50a9f6b6f5d7\") " pod="openshift-image-registry/image-registry-66df7c8f76-t9x9z" Oct 08 20:48:11 crc kubenswrapper[4669]: I1008 20:48:11.017614 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bea5f5b3-ab35-4a4e-b9a7-50a9f6b6f5d7-ca-trust-extracted\") pod \"image-registry-66df7c8f76-t9x9z\" (UID: \"bea5f5b3-ab35-4a4e-b9a7-50a9f6b6f5d7\") " pod="openshift-image-registry/image-registry-66df7c8f76-t9x9z" Oct 08 20:48:11 crc kubenswrapper[4669]: I1008 20:48:11.017637 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpkfn\" (UniqueName: \"kubernetes.io/projected/bea5f5b3-ab35-4a4e-b9a7-50a9f6b6f5d7-kube-api-access-lpkfn\") pod \"image-registry-66df7c8f76-t9x9z\" (UID: \"bea5f5b3-ab35-4a4e-b9a7-50a9f6b6f5d7\") " pod="openshift-image-registry/image-registry-66df7c8f76-t9x9z" Oct 08 20:48:11 crc kubenswrapper[4669]: I1008 20:48:11.017658 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bea5f5b3-ab35-4a4e-b9a7-50a9f6b6f5d7-registry-certificates\") pod \"image-registry-66df7c8f76-t9x9z\" (UID: \"bea5f5b3-ab35-4a4e-b9a7-50a9f6b6f5d7\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-t9x9z" Oct 08 20:48:11 crc kubenswrapper[4669]: I1008 20:48:11.017681 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bea5f5b3-ab35-4a4e-b9a7-50a9f6b6f5d7-registry-tls\") pod \"image-registry-66df7c8f76-t9x9z\" (UID: \"bea5f5b3-ab35-4a4e-b9a7-50a9f6b6f5d7\") " pod="openshift-image-registry/image-registry-66df7c8f76-t9x9z" Oct 08 20:48:11 crc kubenswrapper[4669]: I1008 20:48:11.017699 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bea5f5b3-ab35-4a4e-b9a7-50a9f6b6f5d7-trusted-ca\") pod \"image-registry-66df7c8f76-t9x9z\" (UID: \"bea5f5b3-ab35-4a4e-b9a7-50a9f6b6f5d7\") " pod="openshift-image-registry/image-registry-66df7c8f76-t9x9z" Oct 08 20:48:11 crc kubenswrapper[4669]: I1008 20:48:11.017748 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bea5f5b3-ab35-4a4e-b9a7-50a9f6b6f5d7-installation-pull-secrets\") pod \"image-registry-66df7c8f76-t9x9z\" (UID: \"bea5f5b3-ab35-4a4e-b9a7-50a9f6b6f5d7\") " pod="openshift-image-registry/image-registry-66df7c8f76-t9x9z" Oct 08 20:48:11 crc kubenswrapper[4669]: I1008 20:48:11.017769 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bea5f5b3-ab35-4a4e-b9a7-50a9f6b6f5d7-bound-sa-token\") pod \"image-registry-66df7c8f76-t9x9z\" (UID: \"bea5f5b3-ab35-4a4e-b9a7-50a9f6b6f5d7\") " pod="openshift-image-registry/image-registry-66df7c8f76-t9x9z" Oct 08 20:48:11 crc kubenswrapper[4669]: I1008 20:48:11.037774 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-t9x9z\" (UID: \"bea5f5b3-ab35-4a4e-b9a7-50a9f6b6f5d7\") " pod="openshift-image-registry/image-registry-66df7c8f76-t9x9z" Oct 08 20:48:11 crc kubenswrapper[4669]: I1008 20:48:11.118949 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bea5f5b3-ab35-4a4e-b9a7-50a9f6b6f5d7-installation-pull-secrets\") pod \"image-registry-66df7c8f76-t9x9z\" (UID: \"bea5f5b3-ab35-4a4e-b9a7-50a9f6b6f5d7\") " pod="openshift-image-registry/image-registry-66df7c8f76-t9x9z" Oct 08 20:48:11 crc kubenswrapper[4669]: I1008 20:48:11.119315 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bea5f5b3-ab35-4a4e-b9a7-50a9f6b6f5d7-bound-sa-token\") pod \"image-registry-66df7c8f76-t9x9z\" (UID: \"bea5f5b3-ab35-4a4e-b9a7-50a9f6b6f5d7\") " pod="openshift-image-registry/image-registry-66df7c8f76-t9x9z" Oct 08 20:48:11 crc kubenswrapper[4669]: I1008 20:48:11.119495 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bea5f5b3-ab35-4a4e-b9a7-50a9f6b6f5d7-ca-trust-extracted\") pod \"image-registry-66df7c8f76-t9x9z\" (UID: \"bea5f5b3-ab35-4a4e-b9a7-50a9f6b6f5d7\") " pod="openshift-image-registry/image-registry-66df7c8f76-t9x9z" Oct 08 20:48:11 crc kubenswrapper[4669]: I1008 20:48:11.119646 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpkfn\" (UniqueName: \"kubernetes.io/projected/bea5f5b3-ab35-4a4e-b9a7-50a9f6b6f5d7-kube-api-access-lpkfn\") pod \"image-registry-66df7c8f76-t9x9z\" (UID: \"bea5f5b3-ab35-4a4e-b9a7-50a9f6b6f5d7\") " pod="openshift-image-registry/image-registry-66df7c8f76-t9x9z" Oct 08 20:48:11 crc kubenswrapper[4669]: I1008 20:48:11.119785 4669 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bea5f5b3-ab35-4a4e-b9a7-50a9f6b6f5d7-registry-certificates\") pod \"image-registry-66df7c8f76-t9x9z\" (UID: \"bea5f5b3-ab35-4a4e-b9a7-50a9f6b6f5d7\") " pod="openshift-image-registry/image-registry-66df7c8f76-t9x9z" Oct 08 20:48:11 crc kubenswrapper[4669]: I1008 20:48:11.119913 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bea5f5b3-ab35-4a4e-b9a7-50a9f6b6f5d7-registry-tls\") pod \"image-registry-66df7c8f76-t9x9z\" (UID: \"bea5f5b3-ab35-4a4e-b9a7-50a9f6b6f5d7\") " pod="openshift-image-registry/image-registry-66df7c8f76-t9x9z" Oct 08 20:48:11 crc kubenswrapper[4669]: I1008 20:48:11.120032 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bea5f5b3-ab35-4a4e-b9a7-50a9f6b6f5d7-trusted-ca\") pod \"image-registry-66df7c8f76-t9x9z\" (UID: \"bea5f5b3-ab35-4a4e-b9a7-50a9f6b6f5d7\") " pod="openshift-image-registry/image-registry-66df7c8f76-t9x9z" Oct 08 20:48:11 crc kubenswrapper[4669]: I1008 20:48:11.119914 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bea5f5b3-ab35-4a4e-b9a7-50a9f6b6f5d7-ca-trust-extracted\") pod \"image-registry-66df7c8f76-t9x9z\" (UID: \"bea5f5b3-ab35-4a4e-b9a7-50a9f6b6f5d7\") " pod="openshift-image-registry/image-registry-66df7c8f76-t9x9z" Oct 08 20:48:11 crc kubenswrapper[4669]: I1008 20:48:11.121025 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bea5f5b3-ab35-4a4e-b9a7-50a9f6b6f5d7-trusted-ca\") pod \"image-registry-66df7c8f76-t9x9z\" (UID: \"bea5f5b3-ab35-4a4e-b9a7-50a9f6b6f5d7\") " pod="openshift-image-registry/image-registry-66df7c8f76-t9x9z" Oct 08 20:48:11 crc 
kubenswrapper[4669]: I1008 20:48:11.121037 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bea5f5b3-ab35-4a4e-b9a7-50a9f6b6f5d7-registry-certificates\") pod \"image-registry-66df7c8f76-t9x9z\" (UID: \"bea5f5b3-ab35-4a4e-b9a7-50a9f6b6f5d7\") " pod="openshift-image-registry/image-registry-66df7c8f76-t9x9z" Oct 08 20:48:11 crc kubenswrapper[4669]: I1008 20:48:11.124888 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bea5f5b3-ab35-4a4e-b9a7-50a9f6b6f5d7-installation-pull-secrets\") pod \"image-registry-66df7c8f76-t9x9z\" (UID: \"bea5f5b3-ab35-4a4e-b9a7-50a9f6b6f5d7\") " pod="openshift-image-registry/image-registry-66df7c8f76-t9x9z" Oct 08 20:48:11 crc kubenswrapper[4669]: I1008 20:48:11.124935 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bea5f5b3-ab35-4a4e-b9a7-50a9f6b6f5d7-registry-tls\") pod \"image-registry-66df7c8f76-t9x9z\" (UID: \"bea5f5b3-ab35-4a4e-b9a7-50a9f6b6f5d7\") " pod="openshift-image-registry/image-registry-66df7c8f76-t9x9z" Oct 08 20:48:11 crc kubenswrapper[4669]: I1008 20:48:11.141306 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpkfn\" (UniqueName: \"kubernetes.io/projected/bea5f5b3-ab35-4a4e-b9a7-50a9f6b6f5d7-kube-api-access-lpkfn\") pod \"image-registry-66df7c8f76-t9x9z\" (UID: \"bea5f5b3-ab35-4a4e-b9a7-50a9f6b6f5d7\") " pod="openshift-image-registry/image-registry-66df7c8f76-t9x9z" Oct 08 20:48:11 crc kubenswrapper[4669]: I1008 20:48:11.141307 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bea5f5b3-ab35-4a4e-b9a7-50a9f6b6f5d7-bound-sa-token\") pod \"image-registry-66df7c8f76-t9x9z\" (UID: \"bea5f5b3-ab35-4a4e-b9a7-50a9f6b6f5d7\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-t9x9z" Oct 08 20:48:11 crc kubenswrapper[4669]: I1008 20:48:11.176196 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-t9x9z" Oct 08 20:48:11 crc kubenswrapper[4669]: I1008 20:48:11.582978 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-t9x9z"] Oct 08 20:48:11 crc kubenswrapper[4669]: W1008 20:48:11.589691 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbea5f5b3_ab35_4a4e_b9a7_50a9f6b6f5d7.slice/crio-25e8815f356315dba3b6329eea4bbcb7141d7940ef45a21348fedd7ed9d50ab2 WatchSource:0}: Error finding container 25e8815f356315dba3b6329eea4bbcb7141d7940ef45a21348fedd7ed9d50ab2: Status 404 returned error can't find the container with id 25e8815f356315dba3b6329eea4bbcb7141d7940ef45a21348fedd7ed9d50ab2 Oct 08 20:48:11 crc kubenswrapper[4669]: I1008 20:48:11.662145 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-t9x9z" event={"ID":"bea5f5b3-ab35-4a4e-b9a7-50a9f6b6f5d7","Type":"ContainerStarted","Data":"25e8815f356315dba3b6329eea4bbcb7141d7940ef45a21348fedd7ed9d50ab2"} Oct 08 20:48:12 crc kubenswrapper[4669]: I1008 20:48:12.667803 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-t9x9z" event={"ID":"bea5f5b3-ab35-4a4e-b9a7-50a9f6b6f5d7","Type":"ContainerStarted","Data":"9602395ef68f4e823ed4368a1108dfb8ce7c872d66d0dd074346009621f61b37"} Oct 08 20:48:12 crc kubenswrapper[4669]: I1008 20:48:12.668374 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-t9x9z" Oct 08 20:48:13 crc kubenswrapper[4669]: I1008 20:48:13.185981 4669 patch_prober.go:28] interesting pod/machine-config-daemon-hw2kf 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 20:48:13 crc kubenswrapper[4669]: I1008 20:48:13.186042 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 20:48:13 crc kubenswrapper[4669]: I1008 20:48:13.186085 4669 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" Oct 08 20:48:13 crc kubenswrapper[4669]: I1008 20:48:13.186632 4669 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9e1bd09b1fcc78173d03292522a284e68e59f374def13fd6830f24a31e1138c5"} pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 20:48:13 crc kubenswrapper[4669]: I1008 20:48:13.186687 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" containerID="cri-o://9e1bd09b1fcc78173d03292522a284e68e59f374def13fd6830f24a31e1138c5" gracePeriod=600 Oct 08 20:48:13 crc kubenswrapper[4669]: I1008 20:48:13.678095 4669 generic.go:334] "Generic (PLEG): container finished" podID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerID="9e1bd09b1fcc78173d03292522a284e68e59f374def13fd6830f24a31e1138c5" exitCode=0 Oct 08 20:48:13 crc kubenswrapper[4669]: I1008 20:48:13.678170 4669 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" event={"ID":"39c9bcf2-9580-4534-8c7e-886bd4aff469","Type":"ContainerDied","Data":"9e1bd09b1fcc78173d03292522a284e68e59f374def13fd6830f24a31e1138c5"} Oct 08 20:48:13 crc kubenswrapper[4669]: I1008 20:48:13.679499 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" event={"ID":"39c9bcf2-9580-4534-8c7e-886bd4aff469","Type":"ContainerStarted","Data":"24bba86ffc0208cf280d07da074223933749a4ce672ce4bf5741aa000c003ce6"} Oct 08 20:48:13 crc kubenswrapper[4669]: I1008 20:48:13.699207 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-t9x9z" podStartSLOduration=3.6991903710000003 podStartE2EDuration="3.699190371s" podCreationTimestamp="2025-10-08 20:48:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:48:12.69243402 +0000 UTC m=+212.385244703" watchObservedRunningTime="2025-10-08 20:48:13.699190371 +0000 UTC m=+213.392001044" Oct 08 20:48:19 crc kubenswrapper[4669]: I1008 20:48:19.818668 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s7xfl"] Oct 08 20:48:19 crc kubenswrapper[4669]: I1008 20:48:19.819355 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-s7xfl" podUID="17f131f8-064e-407f-affb-af300e3a5867" containerName="registry-server" containerID="cri-o://8e46faafb5b538a65e2dedca8bb4062be2962d7309d8ab78fe68774a812cf587" gracePeriod=30 Oct 08 20:48:19 crc kubenswrapper[4669]: I1008 20:48:19.825203 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8cmp8"] Oct 08 20:48:19 crc kubenswrapper[4669]: I1008 20:48:19.825488 4669 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-marketplace/community-operators-8cmp8" podUID="2d3175d3-ec70-498c-a243-dc5ab9b1efac" containerName="registry-server" containerID="cri-o://7238142641a6ed40993721b66078a2ec814b4f7cb8fb30c182e18f8628f514cb" gracePeriod=30 Oct 08 20:48:19 crc kubenswrapper[4669]: I1008 20:48:19.836309 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4knb8"] Oct 08 20:48:19 crc kubenswrapper[4669]: I1008 20:48:19.836537 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-4knb8" podUID="ce159705-9661-4510-a5b8-9e7ac58e524c" containerName="marketplace-operator" containerID="cri-o://3accee55cab343bd8d2639c4c6a6b2e9090cb2929dd7919e734d1971d50e1b1c" gracePeriod=30 Oct 08 20:48:19 crc kubenswrapper[4669]: I1008 20:48:19.846505 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zfsm7"] Oct 08 20:48:19 crc kubenswrapper[4669]: I1008 20:48:19.846782 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zfsm7" podUID="52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d" containerName="registry-server" containerID="cri-o://2c30766b7646690aaee1c35a5fdecd3a86d3887bbdbade6e11b7359d297578dd" gracePeriod=30 Oct 08 20:48:19 crc kubenswrapper[4669]: I1008 20:48:19.865250 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kjxjv"] Oct 08 20:48:19 crc kubenswrapper[4669]: I1008 20:48:19.865981 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-kjxjv" Oct 08 20:48:19 crc kubenswrapper[4669]: I1008 20:48:19.870829 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tb2vx"] Oct 08 20:48:19 crc kubenswrapper[4669]: I1008 20:48:19.871129 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tb2vx" podUID="1037d4ad-c7a9-4e59-bc6e-9a26d6504430" containerName="registry-server" containerID="cri-o://159eb6815c91b4181411eefe739d71ce23912d96d70a697a00525305fef14042" gracePeriod=30 Oct 08 20:48:19 crc kubenswrapper[4669]: I1008 20:48:19.873493 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kjxjv"] Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.037879 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c10f6dc4-f608-4960-91ab-cfdebb79b8ff-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-kjxjv\" (UID: \"c10f6dc4-f608-4960-91ab-cfdebb79b8ff\") " pod="openshift-marketplace/marketplace-operator-79b997595-kjxjv" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.037953 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk8lv\" (UniqueName: \"kubernetes.io/projected/c10f6dc4-f608-4960-91ab-cfdebb79b8ff-kube-api-access-xk8lv\") pod \"marketplace-operator-79b997595-kjxjv\" (UID: \"c10f6dc4-f608-4960-91ab-cfdebb79b8ff\") " pod="openshift-marketplace/marketplace-operator-79b997595-kjxjv" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.038020 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/c10f6dc4-f608-4960-91ab-cfdebb79b8ff-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-kjxjv\" (UID: \"c10f6dc4-f608-4960-91ab-cfdebb79b8ff\") " pod="openshift-marketplace/marketplace-operator-79b997595-kjxjv" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.139026 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c10f6dc4-f608-4960-91ab-cfdebb79b8ff-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-kjxjv\" (UID: \"c10f6dc4-f608-4960-91ab-cfdebb79b8ff\") " pod="openshift-marketplace/marketplace-operator-79b997595-kjxjv" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.139069 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xk8lv\" (UniqueName: \"kubernetes.io/projected/c10f6dc4-f608-4960-91ab-cfdebb79b8ff-kube-api-access-xk8lv\") pod \"marketplace-operator-79b997595-kjxjv\" (UID: \"c10f6dc4-f608-4960-91ab-cfdebb79b8ff\") " pod="openshift-marketplace/marketplace-operator-79b997595-kjxjv" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.139088 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c10f6dc4-f608-4960-91ab-cfdebb79b8ff-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-kjxjv\" (UID: \"c10f6dc4-f608-4960-91ab-cfdebb79b8ff\") " pod="openshift-marketplace/marketplace-operator-79b997595-kjxjv" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.141744 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c10f6dc4-f608-4960-91ab-cfdebb79b8ff-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-kjxjv\" (UID: \"c10f6dc4-f608-4960-91ab-cfdebb79b8ff\") " pod="openshift-marketplace/marketplace-operator-79b997595-kjxjv" Oct 08 20:48:20 crc 
kubenswrapper[4669]: I1008 20:48:20.148106 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c10f6dc4-f608-4960-91ab-cfdebb79b8ff-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-kjxjv\" (UID: \"c10f6dc4-f608-4960-91ab-cfdebb79b8ff\") " pod="openshift-marketplace/marketplace-operator-79b997595-kjxjv" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.163164 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk8lv\" (UniqueName: \"kubernetes.io/projected/c10f6dc4-f608-4960-91ab-cfdebb79b8ff-kube-api-access-xk8lv\") pod \"marketplace-operator-79b997595-kjxjv\" (UID: \"c10f6dc4-f608-4960-91ab-cfdebb79b8ff\") " pod="openshift-marketplace/marketplace-operator-79b997595-kjxjv" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.255965 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-kjxjv" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.264056 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8cmp8" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.264879 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s7xfl" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.270362 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4knb8" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.301193 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tb2vx" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.307668 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zfsm7" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.341979 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d3175d3-ec70-498c-a243-dc5ab9b1efac-catalog-content\") pod \"2d3175d3-ec70-498c-a243-dc5ab9b1efac\" (UID: \"2d3175d3-ec70-498c-a243-dc5ab9b1efac\") " Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.342041 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17f131f8-064e-407f-affb-af300e3a5867-catalog-content\") pod \"17f131f8-064e-407f-affb-af300e3a5867\" (UID: \"17f131f8-064e-407f-affb-af300e3a5867\") " Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.342088 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d3175d3-ec70-498c-a243-dc5ab9b1efac-utilities\") pod \"2d3175d3-ec70-498c-a243-dc5ab9b1efac\" (UID: \"2d3175d3-ec70-498c-a243-dc5ab9b1efac\") " Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.342123 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7j9fx\" (UniqueName: \"kubernetes.io/projected/2d3175d3-ec70-498c-a243-dc5ab9b1efac-kube-api-access-7j9fx\") pod \"2d3175d3-ec70-498c-a243-dc5ab9b1efac\" (UID: \"2d3175d3-ec70-498c-a243-dc5ab9b1efac\") " Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.342144 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17f131f8-064e-407f-affb-af300e3a5867-utilities\") pod \"17f131f8-064e-407f-affb-af300e3a5867\" (UID: \"17f131f8-064e-407f-affb-af300e3a5867\") " Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.342188 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-p97g9\" (UniqueName: \"kubernetes.io/projected/17f131f8-064e-407f-affb-af300e3a5867-kube-api-access-p97g9\") pod \"17f131f8-064e-407f-affb-af300e3a5867\" (UID: \"17f131f8-064e-407f-affb-af300e3a5867\") " Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.347080 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17f131f8-064e-407f-affb-af300e3a5867-utilities" (OuterVolumeSpecName: "utilities") pod "17f131f8-064e-407f-affb-af300e3a5867" (UID: "17f131f8-064e-407f-affb-af300e3a5867"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.348597 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d3175d3-ec70-498c-a243-dc5ab9b1efac-utilities" (OuterVolumeSpecName: "utilities") pod "2d3175d3-ec70-498c-a243-dc5ab9b1efac" (UID: "2d3175d3-ec70-498c-a243-dc5ab9b1efac"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.357988 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17f131f8-064e-407f-affb-af300e3a5867-kube-api-access-p97g9" (OuterVolumeSpecName: "kube-api-access-p97g9") pod "17f131f8-064e-407f-affb-af300e3a5867" (UID: "17f131f8-064e-407f-affb-af300e3a5867"). InnerVolumeSpecName "kube-api-access-p97g9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.358016 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d3175d3-ec70-498c-a243-dc5ab9b1efac-kube-api-access-7j9fx" (OuterVolumeSpecName: "kube-api-access-7j9fx") pod "2d3175d3-ec70-498c-a243-dc5ab9b1efac" (UID: "2d3175d3-ec70-498c-a243-dc5ab9b1efac"). InnerVolumeSpecName "kube-api-access-7j9fx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.412653 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17f131f8-064e-407f-affb-af300e3a5867-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "17f131f8-064e-407f-affb-af300e3a5867" (UID: "17f131f8-064e-407f-affb-af300e3a5867"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.435934 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d3175d3-ec70-498c-a243-dc5ab9b1efac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2d3175d3-ec70-498c-a243-dc5ab9b1efac" (UID: "2d3175d3-ec70-498c-a243-dc5ab9b1efac"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.443759 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-868h6\" (UniqueName: \"kubernetes.io/projected/1037d4ad-c7a9-4e59-bc6e-9a26d6504430-kube-api-access-868h6\") pod \"1037d4ad-c7a9-4e59-bc6e-9a26d6504430\" (UID: \"1037d4ad-c7a9-4e59-bc6e-9a26d6504430\") " Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.443810 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1037d4ad-c7a9-4e59-bc6e-9a26d6504430-catalog-content\") pod \"1037d4ad-c7a9-4e59-bc6e-9a26d6504430\" (UID: \"1037d4ad-c7a9-4e59-bc6e-9a26d6504430\") " Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.443845 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmb9t\" (UniqueName: \"kubernetes.io/projected/52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d-kube-api-access-lmb9t\") pod \"52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d\" (UID: 
\"52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d\") " Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.443872 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ce159705-9661-4510-a5b8-9e7ac58e524c-marketplace-operator-metrics\") pod \"ce159705-9661-4510-a5b8-9e7ac58e524c\" (UID: \"ce159705-9661-4510-a5b8-9e7ac58e524c\") " Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.443912 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ce159705-9661-4510-a5b8-9e7ac58e524c-marketplace-trusted-ca\") pod \"ce159705-9661-4510-a5b8-9e7ac58e524c\" (UID: \"ce159705-9661-4510-a5b8-9e7ac58e524c\") " Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.443930 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d-utilities\") pod \"52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d\" (UID: \"52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d\") " Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.443950 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1037d4ad-c7a9-4e59-bc6e-9a26d6504430-utilities\") pod \"1037d4ad-c7a9-4e59-bc6e-9a26d6504430\" (UID: \"1037d4ad-c7a9-4e59-bc6e-9a26d6504430\") " Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.443964 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fh69r\" (UniqueName: \"kubernetes.io/projected/ce159705-9661-4510-a5b8-9e7ac58e524c-kube-api-access-fh69r\") pod \"ce159705-9661-4510-a5b8-9e7ac58e524c\" (UID: \"ce159705-9661-4510-a5b8-9e7ac58e524c\") " Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.443985 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d-catalog-content\") pod \"52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d\" (UID: \"52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d\") " Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.444168 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d3175d3-ec70-498c-a243-dc5ab9b1efac-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.444179 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17f131f8-064e-407f-affb-af300e3a5867-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.444188 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d3175d3-ec70-498c-a243-dc5ab9b1efac-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.444196 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7j9fx\" (UniqueName: \"kubernetes.io/projected/2d3175d3-ec70-498c-a243-dc5ab9b1efac-kube-api-access-7j9fx\") on node \"crc\" DevicePath \"\"" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.444206 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17f131f8-064e-407f-affb-af300e3a5867-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.444214 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p97g9\" (UniqueName: \"kubernetes.io/projected/17f131f8-064e-407f-affb-af300e3a5867-kube-api-access-p97g9\") on node \"crc\" DevicePath \"\"" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.453254 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d-kube-api-access-lmb9t" (OuterVolumeSpecName: "kube-api-access-lmb9t") pod "52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d" (UID: "52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d"). InnerVolumeSpecName "kube-api-access-lmb9t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.456153 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce159705-9661-4510-a5b8-9e7ac58e524c-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "ce159705-9661-4510-a5b8-9e7ac58e524c" (UID: "ce159705-9661-4510-a5b8-9e7ac58e524c"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.458616 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1037d4ad-c7a9-4e59-bc6e-9a26d6504430-utilities" (OuterVolumeSpecName: "utilities") pod "1037d4ad-c7a9-4e59-bc6e-9a26d6504430" (UID: "1037d4ad-c7a9-4e59-bc6e-9a26d6504430"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.458975 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce159705-9661-4510-a5b8-9e7ac58e524c-kube-api-access-fh69r" (OuterVolumeSpecName: "kube-api-access-fh69r") pod "ce159705-9661-4510-a5b8-9e7ac58e524c" (UID: "ce159705-9661-4510-a5b8-9e7ac58e524c"). InnerVolumeSpecName "kube-api-access-fh69r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.461705 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d" (UID: "52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.461767 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1037d4ad-c7a9-4e59-bc6e-9a26d6504430-kube-api-access-868h6" (OuterVolumeSpecName: "kube-api-access-868h6") pod "1037d4ad-c7a9-4e59-bc6e-9a26d6504430" (UID: "1037d4ad-c7a9-4e59-bc6e-9a26d6504430"). InnerVolumeSpecName "kube-api-access-868h6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.465758 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d-utilities" (OuterVolumeSpecName: "utilities") pod "52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d" (UID: "52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.480015 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce159705-9661-4510-a5b8-9e7ac58e524c-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "ce159705-9661-4510-a5b8-9e7ac58e524c" (UID: "ce159705-9661-4510-a5b8-9e7ac58e524c"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.508439 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6l6tw"] Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.543789 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1037d4ad-c7a9-4e59-bc6e-9a26d6504430-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1037d4ad-c7a9-4e59-bc6e-9a26d6504430" (UID: "1037d4ad-c7a9-4e59-bc6e-9a26d6504430"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.552161 4669 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ce159705-9661-4510-a5b8-9e7ac58e524c-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.552195 4669 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ce159705-9661-4510-a5b8-9e7ac58e524c-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.552206 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.552217 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1037d4ad-c7a9-4e59-bc6e-9a26d6504430-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.552225 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fh69r\" (UniqueName: 
\"kubernetes.io/projected/ce159705-9661-4510-a5b8-9e7ac58e524c-kube-api-access-fh69r\") on node \"crc\" DevicePath \"\"" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.552233 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.552241 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-868h6\" (UniqueName: \"kubernetes.io/projected/1037d4ad-c7a9-4e59-bc6e-9a26d6504430-kube-api-access-868h6\") on node \"crc\" DevicePath \"\"" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.552249 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1037d4ad-c7a9-4e59-bc6e-9a26d6504430-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.552258 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmb9t\" (UniqueName: \"kubernetes.io/projected/52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d-kube-api-access-lmb9t\") on node \"crc\" DevicePath \"\"" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.715268 4669 generic.go:334] "Generic (PLEG): container finished" podID="ce159705-9661-4510-a5b8-9e7ac58e524c" containerID="3accee55cab343bd8d2639c4c6a6b2e9090cb2929dd7919e734d1971d50e1b1c" exitCode=0 Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.715319 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4knb8" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.715345 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4knb8" event={"ID":"ce159705-9661-4510-a5b8-9e7ac58e524c","Type":"ContainerDied","Data":"3accee55cab343bd8d2639c4c6a6b2e9090cb2929dd7919e734d1971d50e1b1c"} Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.715374 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4knb8" event={"ID":"ce159705-9661-4510-a5b8-9e7ac58e524c","Type":"ContainerDied","Data":"5f57d67fc8e3324e6a2d250332cadc0e68f6c38def8a19601c6eba4cc087332f"} Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.715393 4669 scope.go:117] "RemoveContainer" containerID="3accee55cab343bd8d2639c4c6a6b2e9090cb2929dd7919e734d1971d50e1b1c" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.717997 4669 generic.go:334] "Generic (PLEG): container finished" podID="52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d" containerID="2c30766b7646690aaee1c35a5fdecd3a86d3887bbdbade6e11b7359d297578dd" exitCode=0 Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.718134 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zfsm7" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.718074 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zfsm7" event={"ID":"52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d","Type":"ContainerDied","Data":"2c30766b7646690aaee1c35a5fdecd3a86d3887bbdbade6e11b7359d297578dd"} Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.718438 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zfsm7" event={"ID":"52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d","Type":"ContainerDied","Data":"6a7162526074761d3fb59e8d1fb9f7f6f71eef715ee393fb3f1c446ae1a0f262"} Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.722910 4669 generic.go:334] "Generic (PLEG): container finished" podID="1037d4ad-c7a9-4e59-bc6e-9a26d6504430" containerID="159eb6815c91b4181411eefe739d71ce23912d96d70a697a00525305fef14042" exitCode=0 Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.722976 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tb2vx" event={"ID":"1037d4ad-c7a9-4e59-bc6e-9a26d6504430","Type":"ContainerDied","Data":"159eb6815c91b4181411eefe739d71ce23912d96d70a697a00525305fef14042"} Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.723004 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tb2vx" event={"ID":"1037d4ad-c7a9-4e59-bc6e-9a26d6504430","Type":"ContainerDied","Data":"4d764fbad688b51545884f795f61f0747a16e969d3a37565fbef530e3d0d2f40"} Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.723070 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tb2vx" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.730987 4669 scope.go:117] "RemoveContainer" containerID="3accee55cab343bd8d2639c4c6a6b2e9090cb2929dd7919e734d1971d50e1b1c" Oct 08 20:48:20 crc kubenswrapper[4669]: E1008 20:48:20.731476 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3accee55cab343bd8d2639c4c6a6b2e9090cb2929dd7919e734d1971d50e1b1c\": container with ID starting with 3accee55cab343bd8d2639c4c6a6b2e9090cb2929dd7919e734d1971d50e1b1c not found: ID does not exist" containerID="3accee55cab343bd8d2639c4c6a6b2e9090cb2929dd7919e734d1971d50e1b1c" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.731502 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3accee55cab343bd8d2639c4c6a6b2e9090cb2929dd7919e734d1971d50e1b1c"} err="failed to get container status \"3accee55cab343bd8d2639c4c6a6b2e9090cb2929dd7919e734d1971d50e1b1c\": rpc error: code = NotFound desc = could not find container \"3accee55cab343bd8d2639c4c6a6b2e9090cb2929dd7919e734d1971d50e1b1c\": container with ID starting with 3accee55cab343bd8d2639c4c6a6b2e9090cb2929dd7919e734d1971d50e1b1c not found: ID does not exist" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.731522 4669 scope.go:117] "RemoveContainer" containerID="2c30766b7646690aaee1c35a5fdecd3a86d3887bbdbade6e11b7359d297578dd" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.741408 4669 generic.go:334] "Generic (PLEG): container finished" podID="2d3175d3-ec70-498c-a243-dc5ab9b1efac" containerID="7238142641a6ed40993721b66078a2ec814b4f7cb8fb30c182e18f8628f514cb" exitCode=0 Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.741503 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8cmp8" 
event={"ID":"2d3175d3-ec70-498c-a243-dc5ab9b1efac","Type":"ContainerDied","Data":"7238142641a6ed40993721b66078a2ec814b4f7cb8fb30c182e18f8628f514cb"} Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.741552 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8cmp8" event={"ID":"2d3175d3-ec70-498c-a243-dc5ab9b1efac","Type":"ContainerDied","Data":"a24215c3fcdcc35ac363d3057f54533a9fd3216d37c65d71f8d4cb09bb3fe1e4"} Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.741651 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8cmp8" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.752154 4669 generic.go:334] "Generic (PLEG): container finished" podID="17f131f8-064e-407f-affb-af300e3a5867" containerID="8e46faafb5b538a65e2dedca8bb4062be2962d7309d8ab78fe68774a812cf587" exitCode=0 Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.752196 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s7xfl" event={"ID":"17f131f8-064e-407f-affb-af300e3a5867","Type":"ContainerDied","Data":"8e46faafb5b538a65e2dedca8bb4062be2962d7309d8ab78fe68774a812cf587"} Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.752225 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s7xfl" event={"ID":"17f131f8-064e-407f-affb-af300e3a5867","Type":"ContainerDied","Data":"42fbabe0be401b59607ab124f8c49f0ec5db852c10f8f7468bae57edb9e15a93"} Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.752230 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s7xfl" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.766370 4669 scope.go:117] "RemoveContainer" containerID="6b44741354f0c80b6a60f217cc9575726b13655bd75fecfc3bbe0fa8060e0345" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.799631 4669 scope.go:117] "RemoveContainer" containerID="b0400053334477d368e38aaa1eb8dd85cabc7579f786b8ddd5728c7df80a0cb1" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.809456 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zfsm7"] Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.823821 4669 scope.go:117] "RemoveContainer" containerID="2c30766b7646690aaee1c35a5fdecd3a86d3887bbdbade6e11b7359d297578dd" Oct 08 20:48:20 crc kubenswrapper[4669]: E1008 20:48:20.826384 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c30766b7646690aaee1c35a5fdecd3a86d3887bbdbade6e11b7359d297578dd\": container with ID starting with 2c30766b7646690aaee1c35a5fdecd3a86d3887bbdbade6e11b7359d297578dd not found: ID does not exist" containerID="2c30766b7646690aaee1c35a5fdecd3a86d3887bbdbade6e11b7359d297578dd" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.826426 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c30766b7646690aaee1c35a5fdecd3a86d3887bbdbade6e11b7359d297578dd"} err="failed to get container status \"2c30766b7646690aaee1c35a5fdecd3a86d3887bbdbade6e11b7359d297578dd\": rpc error: code = NotFound desc = could not find container \"2c30766b7646690aaee1c35a5fdecd3a86d3887bbdbade6e11b7359d297578dd\": container with ID starting with 2c30766b7646690aaee1c35a5fdecd3a86d3887bbdbade6e11b7359d297578dd not found: ID does not exist" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.826457 4669 scope.go:117] "RemoveContainer" 
containerID="6b44741354f0c80b6a60f217cc9575726b13655bd75fecfc3bbe0fa8060e0345" Oct 08 20:48:20 crc kubenswrapper[4669]: E1008 20:48:20.829484 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b44741354f0c80b6a60f217cc9575726b13655bd75fecfc3bbe0fa8060e0345\": container with ID starting with 6b44741354f0c80b6a60f217cc9575726b13655bd75fecfc3bbe0fa8060e0345 not found: ID does not exist" containerID="6b44741354f0c80b6a60f217cc9575726b13655bd75fecfc3bbe0fa8060e0345" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.829511 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b44741354f0c80b6a60f217cc9575726b13655bd75fecfc3bbe0fa8060e0345"} err="failed to get container status \"6b44741354f0c80b6a60f217cc9575726b13655bd75fecfc3bbe0fa8060e0345\": rpc error: code = NotFound desc = could not find container \"6b44741354f0c80b6a60f217cc9575726b13655bd75fecfc3bbe0fa8060e0345\": container with ID starting with 6b44741354f0c80b6a60f217cc9575726b13655bd75fecfc3bbe0fa8060e0345 not found: ID does not exist" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.829547 4669 scope.go:117] "RemoveContainer" containerID="b0400053334477d368e38aaa1eb8dd85cabc7579f786b8ddd5728c7df80a0cb1" Oct 08 20:48:20 crc kubenswrapper[4669]: E1008 20:48:20.831728 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0400053334477d368e38aaa1eb8dd85cabc7579f786b8ddd5728c7df80a0cb1\": container with ID starting with b0400053334477d368e38aaa1eb8dd85cabc7579f786b8ddd5728c7df80a0cb1 not found: ID does not exist" containerID="b0400053334477d368e38aaa1eb8dd85cabc7579f786b8ddd5728c7df80a0cb1" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.831772 4669 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b0400053334477d368e38aaa1eb8dd85cabc7579f786b8ddd5728c7df80a0cb1"} err="failed to get container status \"b0400053334477d368e38aaa1eb8dd85cabc7579f786b8ddd5728c7df80a0cb1\": rpc error: code = NotFound desc = could not find container \"b0400053334477d368e38aaa1eb8dd85cabc7579f786b8ddd5728c7df80a0cb1\": container with ID starting with b0400053334477d368e38aaa1eb8dd85cabc7579f786b8ddd5728c7df80a0cb1 not found: ID does not exist" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.831806 4669 scope.go:117] "RemoveContainer" containerID="159eb6815c91b4181411eefe739d71ce23912d96d70a697a00525305fef14042" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.841345 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zfsm7"] Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.850914 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4knb8"] Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.866634 4669 scope.go:117] "RemoveContainer" containerID="ea8fd041e60c8057a5d97fcfb13ad74ac4c47bd3f239ba7d61cea90adaa3e5d5" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.866828 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4knb8"] Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.877753 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tb2vx"] Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.885635 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tb2vx"] Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.895470 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kjxjv"] Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.895585 4669 scope.go:117] "RemoveContainer" 
containerID="38d53ad57489a731bdfaac889a252154f4a96ee9679c98c8fb7aebadaa151645" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.898651 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8cmp8"] Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.902305 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8cmp8"] Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.906176 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s7xfl"] Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.909586 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-s7xfl"] Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.923372 4669 scope.go:117] "RemoveContainer" containerID="159eb6815c91b4181411eefe739d71ce23912d96d70a697a00525305fef14042" Oct 08 20:48:20 crc kubenswrapper[4669]: E1008 20:48:20.923716 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"159eb6815c91b4181411eefe739d71ce23912d96d70a697a00525305fef14042\": container with ID starting with 159eb6815c91b4181411eefe739d71ce23912d96d70a697a00525305fef14042 not found: ID does not exist" containerID="159eb6815c91b4181411eefe739d71ce23912d96d70a697a00525305fef14042" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.923747 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"159eb6815c91b4181411eefe739d71ce23912d96d70a697a00525305fef14042"} err="failed to get container status \"159eb6815c91b4181411eefe739d71ce23912d96d70a697a00525305fef14042\": rpc error: code = NotFound desc = could not find container \"159eb6815c91b4181411eefe739d71ce23912d96d70a697a00525305fef14042\": container with ID starting with 159eb6815c91b4181411eefe739d71ce23912d96d70a697a00525305fef14042 not found: ID does 
not exist" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.923769 4669 scope.go:117] "RemoveContainer" containerID="ea8fd041e60c8057a5d97fcfb13ad74ac4c47bd3f239ba7d61cea90adaa3e5d5" Oct 08 20:48:20 crc kubenswrapper[4669]: E1008 20:48:20.924921 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea8fd041e60c8057a5d97fcfb13ad74ac4c47bd3f239ba7d61cea90adaa3e5d5\": container with ID starting with ea8fd041e60c8057a5d97fcfb13ad74ac4c47bd3f239ba7d61cea90adaa3e5d5 not found: ID does not exist" containerID="ea8fd041e60c8057a5d97fcfb13ad74ac4c47bd3f239ba7d61cea90adaa3e5d5" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.924983 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea8fd041e60c8057a5d97fcfb13ad74ac4c47bd3f239ba7d61cea90adaa3e5d5"} err="failed to get container status \"ea8fd041e60c8057a5d97fcfb13ad74ac4c47bd3f239ba7d61cea90adaa3e5d5\": rpc error: code = NotFound desc = could not find container \"ea8fd041e60c8057a5d97fcfb13ad74ac4c47bd3f239ba7d61cea90adaa3e5d5\": container with ID starting with ea8fd041e60c8057a5d97fcfb13ad74ac4c47bd3f239ba7d61cea90adaa3e5d5 not found: ID does not exist" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.925011 4669 scope.go:117] "RemoveContainer" containerID="38d53ad57489a731bdfaac889a252154f4a96ee9679c98c8fb7aebadaa151645" Oct 08 20:48:20 crc kubenswrapper[4669]: E1008 20:48:20.925307 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38d53ad57489a731bdfaac889a252154f4a96ee9679c98c8fb7aebadaa151645\": container with ID starting with 38d53ad57489a731bdfaac889a252154f4a96ee9679c98c8fb7aebadaa151645 not found: ID does not exist" containerID="38d53ad57489a731bdfaac889a252154f4a96ee9679c98c8fb7aebadaa151645" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.925420 4669 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38d53ad57489a731bdfaac889a252154f4a96ee9679c98c8fb7aebadaa151645"} err="failed to get container status \"38d53ad57489a731bdfaac889a252154f4a96ee9679c98c8fb7aebadaa151645\": rpc error: code = NotFound desc = could not find container \"38d53ad57489a731bdfaac889a252154f4a96ee9679c98c8fb7aebadaa151645\": container with ID starting with 38d53ad57489a731bdfaac889a252154f4a96ee9679c98c8fb7aebadaa151645 not found: ID does not exist" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.925490 4669 scope.go:117] "RemoveContainer" containerID="7238142641a6ed40993721b66078a2ec814b4f7cb8fb30c182e18f8628f514cb" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.937718 4669 scope.go:117] "RemoveContainer" containerID="9b25b2e0113229652e783512187d9dbc4de5b8a460cf602a8d127eabe6b3e5a7" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.953726 4669 scope.go:117] "RemoveContainer" containerID="7d2323fcc94e21d620ea1f1a9854eef4509f78a423e8caa5b88a12e1aa619c25" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.967681 4669 scope.go:117] "RemoveContainer" containerID="7238142641a6ed40993721b66078a2ec814b4f7cb8fb30c182e18f8628f514cb" Oct 08 20:48:20 crc kubenswrapper[4669]: E1008 20:48:20.968053 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7238142641a6ed40993721b66078a2ec814b4f7cb8fb30c182e18f8628f514cb\": container with ID starting with 7238142641a6ed40993721b66078a2ec814b4f7cb8fb30c182e18f8628f514cb not found: ID does not exist" containerID="7238142641a6ed40993721b66078a2ec814b4f7cb8fb30c182e18f8628f514cb" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.968086 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7238142641a6ed40993721b66078a2ec814b4f7cb8fb30c182e18f8628f514cb"} err="failed to get container status 
\"7238142641a6ed40993721b66078a2ec814b4f7cb8fb30c182e18f8628f514cb\": rpc error: code = NotFound desc = could not find container \"7238142641a6ed40993721b66078a2ec814b4f7cb8fb30c182e18f8628f514cb\": container with ID starting with 7238142641a6ed40993721b66078a2ec814b4f7cb8fb30c182e18f8628f514cb not found: ID does not exist" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.968109 4669 scope.go:117] "RemoveContainer" containerID="9b25b2e0113229652e783512187d9dbc4de5b8a460cf602a8d127eabe6b3e5a7" Oct 08 20:48:20 crc kubenswrapper[4669]: E1008 20:48:20.968363 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b25b2e0113229652e783512187d9dbc4de5b8a460cf602a8d127eabe6b3e5a7\": container with ID starting with 9b25b2e0113229652e783512187d9dbc4de5b8a460cf602a8d127eabe6b3e5a7 not found: ID does not exist" containerID="9b25b2e0113229652e783512187d9dbc4de5b8a460cf602a8d127eabe6b3e5a7" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.968396 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b25b2e0113229652e783512187d9dbc4de5b8a460cf602a8d127eabe6b3e5a7"} err="failed to get container status \"9b25b2e0113229652e783512187d9dbc4de5b8a460cf602a8d127eabe6b3e5a7\": rpc error: code = NotFound desc = could not find container \"9b25b2e0113229652e783512187d9dbc4de5b8a460cf602a8d127eabe6b3e5a7\": container with ID starting with 9b25b2e0113229652e783512187d9dbc4de5b8a460cf602a8d127eabe6b3e5a7 not found: ID does not exist" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.968418 4669 scope.go:117] "RemoveContainer" containerID="7d2323fcc94e21d620ea1f1a9854eef4509f78a423e8caa5b88a12e1aa619c25" Oct 08 20:48:20 crc kubenswrapper[4669]: E1008 20:48:20.968708 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"7d2323fcc94e21d620ea1f1a9854eef4509f78a423e8caa5b88a12e1aa619c25\": container with ID starting with 7d2323fcc94e21d620ea1f1a9854eef4509f78a423e8caa5b88a12e1aa619c25 not found: ID does not exist" containerID="7d2323fcc94e21d620ea1f1a9854eef4509f78a423e8caa5b88a12e1aa619c25" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.968737 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d2323fcc94e21d620ea1f1a9854eef4509f78a423e8caa5b88a12e1aa619c25"} err="failed to get container status \"7d2323fcc94e21d620ea1f1a9854eef4509f78a423e8caa5b88a12e1aa619c25\": rpc error: code = NotFound desc = could not find container \"7d2323fcc94e21d620ea1f1a9854eef4509f78a423e8caa5b88a12e1aa619c25\": container with ID starting with 7d2323fcc94e21d620ea1f1a9854eef4509f78a423e8caa5b88a12e1aa619c25 not found: ID does not exist" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.968759 4669 scope.go:117] "RemoveContainer" containerID="8e46faafb5b538a65e2dedca8bb4062be2962d7309d8ab78fe68774a812cf587" Oct 08 20:48:20 crc kubenswrapper[4669]: I1008 20:48:20.986138 4669 scope.go:117] "RemoveContainer" containerID="5dd3c683568b81cff55f6a1e9f05b632afbba33b1eae240969a5271768ee8f08" Oct 08 20:48:21 crc kubenswrapper[4669]: I1008 20:48:21.029479 4669 scope.go:117] "RemoveContainer" containerID="4a601df470453269dd05a466589a55149e00cec3de52927062290c449202f9e5" Oct 08 20:48:21 crc kubenswrapper[4669]: I1008 20:48:21.041992 4669 scope.go:117] "RemoveContainer" containerID="8e46faafb5b538a65e2dedca8bb4062be2962d7309d8ab78fe68774a812cf587" Oct 08 20:48:21 crc kubenswrapper[4669]: E1008 20:48:21.042361 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e46faafb5b538a65e2dedca8bb4062be2962d7309d8ab78fe68774a812cf587\": container with ID starting with 8e46faafb5b538a65e2dedca8bb4062be2962d7309d8ab78fe68774a812cf587 not found: ID does not exist" 
containerID="8e46faafb5b538a65e2dedca8bb4062be2962d7309d8ab78fe68774a812cf587" Oct 08 20:48:21 crc kubenswrapper[4669]: I1008 20:48:21.042411 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e46faafb5b538a65e2dedca8bb4062be2962d7309d8ab78fe68774a812cf587"} err="failed to get container status \"8e46faafb5b538a65e2dedca8bb4062be2962d7309d8ab78fe68774a812cf587\": rpc error: code = NotFound desc = could not find container \"8e46faafb5b538a65e2dedca8bb4062be2962d7309d8ab78fe68774a812cf587\": container with ID starting with 8e46faafb5b538a65e2dedca8bb4062be2962d7309d8ab78fe68774a812cf587 not found: ID does not exist" Oct 08 20:48:21 crc kubenswrapper[4669]: I1008 20:48:21.042434 4669 scope.go:117] "RemoveContainer" containerID="5dd3c683568b81cff55f6a1e9f05b632afbba33b1eae240969a5271768ee8f08" Oct 08 20:48:21 crc kubenswrapper[4669]: E1008 20:48:21.042846 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5dd3c683568b81cff55f6a1e9f05b632afbba33b1eae240969a5271768ee8f08\": container with ID starting with 5dd3c683568b81cff55f6a1e9f05b632afbba33b1eae240969a5271768ee8f08 not found: ID does not exist" containerID="5dd3c683568b81cff55f6a1e9f05b632afbba33b1eae240969a5271768ee8f08" Oct 08 20:48:21 crc kubenswrapper[4669]: I1008 20:48:21.042885 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dd3c683568b81cff55f6a1e9f05b632afbba33b1eae240969a5271768ee8f08"} err="failed to get container status \"5dd3c683568b81cff55f6a1e9f05b632afbba33b1eae240969a5271768ee8f08\": rpc error: code = NotFound desc = could not find container \"5dd3c683568b81cff55f6a1e9f05b632afbba33b1eae240969a5271768ee8f08\": container with ID starting with 5dd3c683568b81cff55f6a1e9f05b632afbba33b1eae240969a5271768ee8f08 not found: ID does not exist" Oct 08 20:48:21 crc kubenswrapper[4669]: I1008 20:48:21.042898 4669 scope.go:117] 
"RemoveContainer" containerID="4a601df470453269dd05a466589a55149e00cec3de52927062290c449202f9e5" Oct 08 20:48:21 crc kubenswrapper[4669]: E1008 20:48:21.043118 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a601df470453269dd05a466589a55149e00cec3de52927062290c449202f9e5\": container with ID starting with 4a601df470453269dd05a466589a55149e00cec3de52927062290c449202f9e5 not found: ID does not exist" containerID="4a601df470453269dd05a466589a55149e00cec3de52927062290c449202f9e5" Oct 08 20:48:21 crc kubenswrapper[4669]: I1008 20:48:21.043157 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a601df470453269dd05a466589a55149e00cec3de52927062290c449202f9e5"} err="failed to get container status \"4a601df470453269dd05a466589a55149e00cec3de52927062290c449202f9e5\": rpc error: code = NotFound desc = could not find container \"4a601df470453269dd05a466589a55149e00cec3de52927062290c449202f9e5\": container with ID starting with 4a601df470453269dd05a466589a55149e00cec3de52927062290c449202f9e5 not found: ID does not exist" Oct 08 20:48:21 crc kubenswrapper[4669]: I1008 20:48:21.337426 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1037d4ad-c7a9-4e59-bc6e-9a26d6504430" path="/var/lib/kubelet/pods/1037d4ad-c7a9-4e59-bc6e-9a26d6504430/volumes" Oct 08 20:48:21 crc kubenswrapper[4669]: I1008 20:48:21.338015 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17f131f8-064e-407f-affb-af300e3a5867" path="/var/lib/kubelet/pods/17f131f8-064e-407f-affb-af300e3a5867/volumes" Oct 08 20:48:21 crc kubenswrapper[4669]: I1008 20:48:21.338793 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d3175d3-ec70-498c-a243-dc5ab9b1efac" path="/var/lib/kubelet/pods/2d3175d3-ec70-498c-a243-dc5ab9b1efac/volumes" Oct 08 20:48:21 crc kubenswrapper[4669]: I1008 20:48:21.340102 4669 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d" path="/var/lib/kubelet/pods/52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d/volumes" Oct 08 20:48:21 crc kubenswrapper[4669]: I1008 20:48:21.340716 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce159705-9661-4510-a5b8-9e7ac58e524c" path="/var/lib/kubelet/pods/ce159705-9661-4510-a5b8-9e7ac58e524c/volumes" Oct 08 20:48:21 crc kubenswrapper[4669]: I1008 20:48:21.764046 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-kjxjv" event={"ID":"c10f6dc4-f608-4960-91ab-cfdebb79b8ff","Type":"ContainerStarted","Data":"f959adb8da674a1ced78e9a0c0b9424063a39adde2c5dfaff05b07f553bab222"} Oct 08 20:48:21 crc kubenswrapper[4669]: I1008 20:48:21.764413 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-kjxjv" event={"ID":"c10f6dc4-f608-4960-91ab-cfdebb79b8ff","Type":"ContainerStarted","Data":"e64126f5e71aa4cfb2895fb57b34030b2287a9fa4cb525aa53fde3e632919739"} Oct 08 20:48:21 crc kubenswrapper[4669]: I1008 20:48:21.764827 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-kjxjv" Oct 08 20:48:21 crc kubenswrapper[4669]: I1008 20:48:21.768496 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-kjxjv" Oct 08 20:48:21 crc kubenswrapper[4669]: I1008 20:48:21.781883 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-kjxjv" podStartSLOduration=2.781864262 podStartE2EDuration="2.781864262s" podCreationTimestamp="2025-10-08 20:48:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:48:21.780513294 +0000 UTC m=+221.473323967" 
watchObservedRunningTime="2025-10-08 20:48:21.781864262 +0000 UTC m=+221.474674935" Oct 08 20:48:22 crc kubenswrapper[4669]: I1008 20:48:22.605256 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jxfn2"] Oct 08 20:48:22 crc kubenswrapper[4669]: E1008 20:48:22.605583 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1037d4ad-c7a9-4e59-bc6e-9a26d6504430" containerName="registry-server" Oct 08 20:48:22 crc kubenswrapper[4669]: I1008 20:48:22.605604 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="1037d4ad-c7a9-4e59-bc6e-9a26d6504430" containerName="registry-server" Oct 08 20:48:22 crc kubenswrapper[4669]: E1008 20:48:22.605626 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d3175d3-ec70-498c-a243-dc5ab9b1efac" containerName="extract-content" Oct 08 20:48:22 crc kubenswrapper[4669]: I1008 20:48:22.605639 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d3175d3-ec70-498c-a243-dc5ab9b1efac" containerName="extract-content" Oct 08 20:48:22 crc kubenswrapper[4669]: E1008 20:48:22.605659 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17f131f8-064e-407f-affb-af300e3a5867" containerName="extract-utilities" Oct 08 20:48:22 crc kubenswrapper[4669]: I1008 20:48:22.605673 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="17f131f8-064e-407f-affb-af300e3a5867" containerName="extract-utilities" Oct 08 20:48:22 crc kubenswrapper[4669]: E1008 20:48:22.605690 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d" containerName="registry-server" Oct 08 20:48:22 crc kubenswrapper[4669]: I1008 20:48:22.605703 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d" containerName="registry-server" Oct 08 20:48:22 crc kubenswrapper[4669]: E1008 20:48:22.605725 4669 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ce159705-9661-4510-a5b8-9e7ac58e524c" containerName="marketplace-operator" Oct 08 20:48:22 crc kubenswrapper[4669]: I1008 20:48:22.605737 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce159705-9661-4510-a5b8-9e7ac58e524c" containerName="marketplace-operator" Oct 08 20:48:22 crc kubenswrapper[4669]: E1008 20:48:22.605752 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d3175d3-ec70-498c-a243-dc5ab9b1efac" containerName="registry-server" Oct 08 20:48:22 crc kubenswrapper[4669]: I1008 20:48:22.605764 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d3175d3-ec70-498c-a243-dc5ab9b1efac" containerName="registry-server" Oct 08 20:48:22 crc kubenswrapper[4669]: E1008 20:48:22.605781 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d" containerName="extract-utilities" Oct 08 20:48:22 crc kubenswrapper[4669]: I1008 20:48:22.605794 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d" containerName="extract-utilities" Oct 08 20:48:22 crc kubenswrapper[4669]: E1008 20:48:22.605811 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d" containerName="extract-content" Oct 08 20:48:22 crc kubenswrapper[4669]: I1008 20:48:22.605824 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d" containerName="extract-content" Oct 08 20:48:22 crc kubenswrapper[4669]: E1008 20:48:22.605846 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1037d4ad-c7a9-4e59-bc6e-9a26d6504430" containerName="extract-utilities" Oct 08 20:48:22 crc kubenswrapper[4669]: I1008 20:48:22.605858 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="1037d4ad-c7a9-4e59-bc6e-9a26d6504430" containerName="extract-utilities" Oct 08 20:48:22 crc kubenswrapper[4669]: E1008 20:48:22.605876 4669 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="1037d4ad-c7a9-4e59-bc6e-9a26d6504430" containerName="extract-content" Oct 08 20:48:22 crc kubenswrapper[4669]: I1008 20:48:22.605888 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="1037d4ad-c7a9-4e59-bc6e-9a26d6504430" containerName="extract-content" Oct 08 20:48:22 crc kubenswrapper[4669]: E1008 20:48:22.605905 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d3175d3-ec70-498c-a243-dc5ab9b1efac" containerName="extract-utilities" Oct 08 20:48:22 crc kubenswrapper[4669]: I1008 20:48:22.605916 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d3175d3-ec70-498c-a243-dc5ab9b1efac" containerName="extract-utilities" Oct 08 20:48:22 crc kubenswrapper[4669]: E1008 20:48:22.605932 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17f131f8-064e-407f-affb-af300e3a5867" containerName="extract-content" Oct 08 20:48:22 crc kubenswrapper[4669]: I1008 20:48:22.605944 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="17f131f8-064e-407f-affb-af300e3a5867" containerName="extract-content" Oct 08 20:48:22 crc kubenswrapper[4669]: E1008 20:48:22.605961 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17f131f8-064e-407f-affb-af300e3a5867" containerName="registry-server" Oct 08 20:48:22 crc kubenswrapper[4669]: I1008 20:48:22.605972 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="17f131f8-064e-407f-affb-af300e3a5867" containerName="registry-server" Oct 08 20:48:22 crc kubenswrapper[4669]: I1008 20:48:22.606149 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="17f131f8-064e-407f-affb-af300e3a5867" containerName="registry-server" Oct 08 20:48:22 crc kubenswrapper[4669]: I1008 20:48:22.606177 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d3175d3-ec70-498c-a243-dc5ab9b1efac" containerName="registry-server" Oct 08 20:48:22 crc kubenswrapper[4669]: I1008 20:48:22.606198 4669 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="52e7ead4-34f5-4b4a-8a18-bde6bd3cd62d" containerName="registry-server" Oct 08 20:48:22 crc kubenswrapper[4669]: I1008 20:48:22.606225 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce159705-9661-4510-a5b8-9e7ac58e524c" containerName="marketplace-operator" Oct 08 20:48:22 crc kubenswrapper[4669]: I1008 20:48:22.606242 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="1037d4ad-c7a9-4e59-bc6e-9a26d6504430" containerName="registry-server" Oct 08 20:48:22 crc kubenswrapper[4669]: I1008 20:48:22.607454 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jxfn2" Oct 08 20:48:22 crc kubenswrapper[4669]: I1008 20:48:22.611206 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 08 20:48:22 crc kubenswrapper[4669]: I1008 20:48:22.613121 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jxfn2"] Oct 08 20:48:22 crc kubenswrapper[4669]: I1008 20:48:22.679823 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg98v\" (UniqueName: \"kubernetes.io/projected/250ce6ef-e5b8-4912-8037-85a0c520ff7a-kube-api-access-hg98v\") pod \"community-operators-jxfn2\" (UID: \"250ce6ef-e5b8-4912-8037-85a0c520ff7a\") " pod="openshift-marketplace/community-operators-jxfn2" Oct 08 20:48:22 crc kubenswrapper[4669]: I1008 20:48:22.679882 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/250ce6ef-e5b8-4912-8037-85a0c520ff7a-catalog-content\") pod \"community-operators-jxfn2\" (UID: \"250ce6ef-e5b8-4912-8037-85a0c520ff7a\") " pod="openshift-marketplace/community-operators-jxfn2" Oct 08 20:48:22 crc kubenswrapper[4669]: I1008 20:48:22.680090 4669 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/250ce6ef-e5b8-4912-8037-85a0c520ff7a-utilities\") pod \"community-operators-jxfn2\" (UID: \"250ce6ef-e5b8-4912-8037-85a0c520ff7a\") " pod="openshift-marketplace/community-operators-jxfn2" Oct 08 20:48:22 crc kubenswrapper[4669]: I1008 20:48:22.781931 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg98v\" (UniqueName: \"kubernetes.io/projected/250ce6ef-e5b8-4912-8037-85a0c520ff7a-kube-api-access-hg98v\") pod \"community-operators-jxfn2\" (UID: \"250ce6ef-e5b8-4912-8037-85a0c520ff7a\") " pod="openshift-marketplace/community-operators-jxfn2" Oct 08 20:48:22 crc kubenswrapper[4669]: I1008 20:48:22.781993 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/250ce6ef-e5b8-4912-8037-85a0c520ff7a-catalog-content\") pod \"community-operators-jxfn2\" (UID: \"250ce6ef-e5b8-4912-8037-85a0c520ff7a\") " pod="openshift-marketplace/community-operators-jxfn2" Oct 08 20:48:22 crc kubenswrapper[4669]: I1008 20:48:22.782021 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/250ce6ef-e5b8-4912-8037-85a0c520ff7a-utilities\") pod \"community-operators-jxfn2\" (UID: \"250ce6ef-e5b8-4912-8037-85a0c520ff7a\") " pod="openshift-marketplace/community-operators-jxfn2" Oct 08 20:48:22 crc kubenswrapper[4669]: I1008 20:48:22.782433 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/250ce6ef-e5b8-4912-8037-85a0c520ff7a-catalog-content\") pod \"community-operators-jxfn2\" (UID: \"250ce6ef-e5b8-4912-8037-85a0c520ff7a\") " pod="openshift-marketplace/community-operators-jxfn2" Oct 08 20:48:22 crc kubenswrapper[4669]: I1008 20:48:22.782460 4669 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/250ce6ef-e5b8-4912-8037-85a0c520ff7a-utilities\") pod \"community-operators-jxfn2\" (UID: \"250ce6ef-e5b8-4912-8037-85a0c520ff7a\") " pod="openshift-marketplace/community-operators-jxfn2" Oct 08 20:48:22 crc kubenswrapper[4669]: I1008 20:48:22.802368 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hns7h"] Oct 08 20:48:22 crc kubenswrapper[4669]: I1008 20:48:22.820574 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hns7h" Oct 08 20:48:22 crc kubenswrapper[4669]: I1008 20:48:22.820681 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg98v\" (UniqueName: \"kubernetes.io/projected/250ce6ef-e5b8-4912-8037-85a0c520ff7a-kube-api-access-hg98v\") pod \"community-operators-jxfn2\" (UID: \"250ce6ef-e5b8-4912-8037-85a0c520ff7a\") " pod="openshift-marketplace/community-operators-jxfn2" Oct 08 20:48:22 crc kubenswrapper[4669]: I1008 20:48:22.823678 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hns7h"] Oct 08 20:48:22 crc kubenswrapper[4669]: I1008 20:48:22.823718 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 08 20:48:22 crc kubenswrapper[4669]: I1008 20:48:22.935808 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jxfn2" Oct 08 20:48:22 crc kubenswrapper[4669]: I1008 20:48:22.984977 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh9xf\" (UniqueName: \"kubernetes.io/projected/e6ded9fb-15c9-44ee-a538-0b31da1b016a-kube-api-access-zh9xf\") pod \"certified-operators-hns7h\" (UID: \"e6ded9fb-15c9-44ee-a538-0b31da1b016a\") " pod="openshift-marketplace/certified-operators-hns7h" Oct 08 20:48:22 crc kubenswrapper[4669]: I1008 20:48:22.985062 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6ded9fb-15c9-44ee-a538-0b31da1b016a-utilities\") pod \"certified-operators-hns7h\" (UID: \"e6ded9fb-15c9-44ee-a538-0b31da1b016a\") " pod="openshift-marketplace/certified-operators-hns7h" Oct 08 20:48:22 crc kubenswrapper[4669]: I1008 20:48:22.985245 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6ded9fb-15c9-44ee-a538-0b31da1b016a-catalog-content\") pod \"certified-operators-hns7h\" (UID: \"e6ded9fb-15c9-44ee-a538-0b31da1b016a\") " pod="openshift-marketplace/certified-operators-hns7h" Oct 08 20:48:23 crc kubenswrapper[4669]: I1008 20:48:23.086279 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh9xf\" (UniqueName: \"kubernetes.io/projected/e6ded9fb-15c9-44ee-a538-0b31da1b016a-kube-api-access-zh9xf\") pod \"certified-operators-hns7h\" (UID: \"e6ded9fb-15c9-44ee-a538-0b31da1b016a\") " pod="openshift-marketplace/certified-operators-hns7h" Oct 08 20:48:23 crc kubenswrapper[4669]: I1008 20:48:23.086373 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6ded9fb-15c9-44ee-a538-0b31da1b016a-utilities\") pod 
\"certified-operators-hns7h\" (UID: \"e6ded9fb-15c9-44ee-a538-0b31da1b016a\") " pod="openshift-marketplace/certified-operators-hns7h" Oct 08 20:48:23 crc kubenswrapper[4669]: I1008 20:48:23.086406 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6ded9fb-15c9-44ee-a538-0b31da1b016a-catalog-content\") pod \"certified-operators-hns7h\" (UID: \"e6ded9fb-15c9-44ee-a538-0b31da1b016a\") " pod="openshift-marketplace/certified-operators-hns7h" Oct 08 20:48:23 crc kubenswrapper[4669]: I1008 20:48:23.087279 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6ded9fb-15c9-44ee-a538-0b31da1b016a-catalog-content\") pod \"certified-operators-hns7h\" (UID: \"e6ded9fb-15c9-44ee-a538-0b31da1b016a\") " pod="openshift-marketplace/certified-operators-hns7h" Oct 08 20:48:23 crc kubenswrapper[4669]: I1008 20:48:23.088248 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6ded9fb-15c9-44ee-a538-0b31da1b016a-utilities\") pod \"certified-operators-hns7h\" (UID: \"e6ded9fb-15c9-44ee-a538-0b31da1b016a\") " pod="openshift-marketplace/certified-operators-hns7h" Oct 08 20:48:23 crc kubenswrapper[4669]: I1008 20:48:23.104554 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh9xf\" (UniqueName: \"kubernetes.io/projected/e6ded9fb-15c9-44ee-a538-0b31da1b016a-kube-api-access-zh9xf\") pod \"certified-operators-hns7h\" (UID: \"e6ded9fb-15c9-44ee-a538-0b31da1b016a\") " pod="openshift-marketplace/certified-operators-hns7h" Oct 08 20:48:23 crc kubenswrapper[4669]: I1008 20:48:23.155580 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hns7h" Oct 08 20:48:23 crc kubenswrapper[4669]: I1008 20:48:23.157396 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jxfn2"] Oct 08 20:48:23 crc kubenswrapper[4669]: I1008 20:48:23.356630 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hns7h"] Oct 08 20:48:23 crc kubenswrapper[4669]: W1008 20:48:23.375053 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6ded9fb_15c9_44ee_a538_0b31da1b016a.slice/crio-9044e4a4ff026452f6b299054e9331dc68b1b4ac810c8ac474aff9612acbaa9a WatchSource:0}: Error finding container 9044e4a4ff026452f6b299054e9331dc68b1b4ac810c8ac474aff9612acbaa9a: Status 404 returned error can't find the container with id 9044e4a4ff026452f6b299054e9331dc68b1b4ac810c8ac474aff9612acbaa9a Oct 08 20:48:23 crc kubenswrapper[4669]: I1008 20:48:23.777544 4669 generic.go:334] "Generic (PLEG): container finished" podID="250ce6ef-e5b8-4912-8037-85a0c520ff7a" containerID="bb16d474bc731c3e6e53219e280dec01e8ae2f762f8230a44e44fb22499057fe" exitCode=0 Oct 08 20:48:23 crc kubenswrapper[4669]: I1008 20:48:23.777632 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jxfn2" event={"ID":"250ce6ef-e5b8-4912-8037-85a0c520ff7a","Type":"ContainerDied","Data":"bb16d474bc731c3e6e53219e280dec01e8ae2f762f8230a44e44fb22499057fe"} Oct 08 20:48:23 crc kubenswrapper[4669]: I1008 20:48:23.777662 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jxfn2" event={"ID":"250ce6ef-e5b8-4912-8037-85a0c520ff7a","Type":"ContainerStarted","Data":"34b7c90c4000341b5fc430ffbf81df4abd54cc211b664285319d92d5544df79c"} Oct 08 20:48:23 crc kubenswrapper[4669]: I1008 20:48:23.781997 4669 generic.go:334] "Generic (PLEG): container finished" 
podID="e6ded9fb-15c9-44ee-a538-0b31da1b016a" containerID="cc6d1ab6d4c79513d64a14b3692bb2dd8f9c769a25e690da811a89a59d2c8aee" exitCode=0 Oct 08 20:48:23 crc kubenswrapper[4669]: I1008 20:48:23.782144 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hns7h" event={"ID":"e6ded9fb-15c9-44ee-a538-0b31da1b016a","Type":"ContainerDied","Data":"cc6d1ab6d4c79513d64a14b3692bb2dd8f9c769a25e690da811a89a59d2c8aee"} Oct 08 20:48:23 crc kubenswrapper[4669]: I1008 20:48:23.782252 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hns7h" event={"ID":"e6ded9fb-15c9-44ee-a538-0b31da1b016a","Type":"ContainerStarted","Data":"9044e4a4ff026452f6b299054e9331dc68b1b4ac810c8ac474aff9612acbaa9a"} Oct 08 20:48:24 crc kubenswrapper[4669]: I1008 20:48:24.793609 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jxfn2" event={"ID":"250ce6ef-e5b8-4912-8037-85a0c520ff7a","Type":"ContainerStarted","Data":"d8c3622ff81f10bd44d4c8c39d0e3fa8fcfdfe271c5f429301059a130f13509b"} Oct 08 20:48:25 crc kubenswrapper[4669]: I1008 20:48:25.000439 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4rszc"] Oct 08 20:48:25 crc kubenswrapper[4669]: I1008 20:48:25.001418 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4rszc" Oct 08 20:48:25 crc kubenswrapper[4669]: I1008 20:48:25.005001 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 08 20:48:25 crc kubenswrapper[4669]: I1008 20:48:25.010924 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4rszc"] Oct 08 20:48:25 crc kubenswrapper[4669]: I1008 20:48:25.115936 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8jk5\" (UniqueName: \"kubernetes.io/projected/bca52d11-f041-40bc-b352-1e820a410996-kube-api-access-p8jk5\") pod \"redhat-operators-4rszc\" (UID: \"bca52d11-f041-40bc-b352-1e820a410996\") " pod="openshift-marketplace/redhat-operators-4rszc" Oct 08 20:48:25 crc kubenswrapper[4669]: I1008 20:48:25.116276 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bca52d11-f041-40bc-b352-1e820a410996-catalog-content\") pod \"redhat-operators-4rszc\" (UID: \"bca52d11-f041-40bc-b352-1e820a410996\") " pod="openshift-marketplace/redhat-operators-4rszc" Oct 08 20:48:25 crc kubenswrapper[4669]: I1008 20:48:25.116312 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bca52d11-f041-40bc-b352-1e820a410996-utilities\") pod \"redhat-operators-4rszc\" (UID: \"bca52d11-f041-40bc-b352-1e820a410996\") " pod="openshift-marketplace/redhat-operators-4rszc" Oct 08 20:48:25 crc kubenswrapper[4669]: I1008 20:48:25.201797 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-558fm"] Oct 08 20:48:25 crc kubenswrapper[4669]: I1008 20:48:25.202769 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-558fm" Oct 08 20:48:25 crc kubenswrapper[4669]: I1008 20:48:25.205109 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 08 20:48:25 crc kubenswrapper[4669]: I1008 20:48:25.212224 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-558fm"] Oct 08 20:48:25 crc kubenswrapper[4669]: I1008 20:48:25.217096 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bca52d11-f041-40bc-b352-1e820a410996-catalog-content\") pod \"redhat-operators-4rszc\" (UID: \"bca52d11-f041-40bc-b352-1e820a410996\") " pod="openshift-marketplace/redhat-operators-4rszc" Oct 08 20:48:25 crc kubenswrapper[4669]: I1008 20:48:25.217155 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bca52d11-f041-40bc-b352-1e820a410996-utilities\") pod \"redhat-operators-4rszc\" (UID: \"bca52d11-f041-40bc-b352-1e820a410996\") " pod="openshift-marketplace/redhat-operators-4rszc" Oct 08 20:48:25 crc kubenswrapper[4669]: I1008 20:48:25.217194 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8jk5\" (UniqueName: \"kubernetes.io/projected/bca52d11-f041-40bc-b352-1e820a410996-kube-api-access-p8jk5\") pod \"redhat-operators-4rszc\" (UID: \"bca52d11-f041-40bc-b352-1e820a410996\") " pod="openshift-marketplace/redhat-operators-4rszc" Oct 08 20:48:25 crc kubenswrapper[4669]: I1008 20:48:25.217642 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bca52d11-f041-40bc-b352-1e820a410996-catalog-content\") pod \"redhat-operators-4rszc\" (UID: \"bca52d11-f041-40bc-b352-1e820a410996\") " pod="openshift-marketplace/redhat-operators-4rszc" 
Oct 08 20:48:25 crc kubenswrapper[4669]: I1008 20:48:25.217648 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bca52d11-f041-40bc-b352-1e820a410996-utilities\") pod \"redhat-operators-4rszc\" (UID: \"bca52d11-f041-40bc-b352-1e820a410996\") " pod="openshift-marketplace/redhat-operators-4rszc" Oct 08 20:48:25 crc kubenswrapper[4669]: I1008 20:48:25.239465 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8jk5\" (UniqueName: \"kubernetes.io/projected/bca52d11-f041-40bc-b352-1e820a410996-kube-api-access-p8jk5\") pod \"redhat-operators-4rszc\" (UID: \"bca52d11-f041-40bc-b352-1e820a410996\") " pod="openshift-marketplace/redhat-operators-4rszc" Oct 08 20:48:25 crc kubenswrapper[4669]: I1008 20:48:25.318642 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/065b510c-2a5d-4110-80b9-69865b532686-utilities\") pod \"redhat-marketplace-558fm\" (UID: \"065b510c-2a5d-4110-80b9-69865b532686\") " pod="openshift-marketplace/redhat-marketplace-558fm" Oct 08 20:48:25 crc kubenswrapper[4669]: I1008 20:48:25.318728 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bd7p\" (UniqueName: \"kubernetes.io/projected/065b510c-2a5d-4110-80b9-69865b532686-kube-api-access-4bd7p\") pod \"redhat-marketplace-558fm\" (UID: \"065b510c-2a5d-4110-80b9-69865b532686\") " pod="openshift-marketplace/redhat-marketplace-558fm" Oct 08 20:48:25 crc kubenswrapper[4669]: I1008 20:48:25.318882 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4rszc" Oct 08 20:48:25 crc kubenswrapper[4669]: I1008 20:48:25.318998 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/065b510c-2a5d-4110-80b9-69865b532686-catalog-content\") pod \"redhat-marketplace-558fm\" (UID: \"065b510c-2a5d-4110-80b9-69865b532686\") " pod="openshift-marketplace/redhat-marketplace-558fm" Oct 08 20:48:25 crc kubenswrapper[4669]: I1008 20:48:25.420387 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bd7p\" (UniqueName: \"kubernetes.io/projected/065b510c-2a5d-4110-80b9-69865b532686-kube-api-access-4bd7p\") pod \"redhat-marketplace-558fm\" (UID: \"065b510c-2a5d-4110-80b9-69865b532686\") " pod="openshift-marketplace/redhat-marketplace-558fm" Oct 08 20:48:25 crc kubenswrapper[4669]: I1008 20:48:25.422454 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/065b510c-2a5d-4110-80b9-69865b532686-catalog-content\") pod \"redhat-marketplace-558fm\" (UID: \"065b510c-2a5d-4110-80b9-69865b532686\") " pod="openshift-marketplace/redhat-marketplace-558fm" Oct 08 20:48:25 crc kubenswrapper[4669]: I1008 20:48:25.422909 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/065b510c-2a5d-4110-80b9-69865b532686-catalog-content\") pod \"redhat-marketplace-558fm\" (UID: \"065b510c-2a5d-4110-80b9-69865b532686\") " pod="openshift-marketplace/redhat-marketplace-558fm" Oct 08 20:48:25 crc kubenswrapper[4669]: I1008 20:48:25.424251 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/065b510c-2a5d-4110-80b9-69865b532686-utilities\") pod \"redhat-marketplace-558fm\" (UID: \"065b510c-2a5d-4110-80b9-69865b532686\") 
" pod="openshift-marketplace/redhat-marketplace-558fm" Oct 08 20:48:25 crc kubenswrapper[4669]: I1008 20:48:25.424327 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/065b510c-2a5d-4110-80b9-69865b532686-utilities\") pod \"redhat-marketplace-558fm\" (UID: \"065b510c-2a5d-4110-80b9-69865b532686\") " pod="openshift-marketplace/redhat-marketplace-558fm" Oct 08 20:48:25 crc kubenswrapper[4669]: I1008 20:48:25.436874 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bd7p\" (UniqueName: \"kubernetes.io/projected/065b510c-2a5d-4110-80b9-69865b532686-kube-api-access-4bd7p\") pod \"redhat-marketplace-558fm\" (UID: \"065b510c-2a5d-4110-80b9-69865b532686\") " pod="openshift-marketplace/redhat-marketplace-558fm" Oct 08 20:48:25 crc kubenswrapper[4669]: I1008 20:48:25.500718 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4rszc"] Oct 08 20:48:25 crc kubenswrapper[4669]: W1008 20:48:25.508738 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbca52d11_f041_40bc_b352_1e820a410996.slice/crio-ea6987c3f2049d62150eb345db35f774fdc468399b0da27ef2757be4ff5f1332 WatchSource:0}: Error finding container ea6987c3f2049d62150eb345db35f774fdc468399b0da27ef2757be4ff5f1332: Status 404 returned error can't find the container with id ea6987c3f2049d62150eb345db35f774fdc468399b0da27ef2757be4ff5f1332 Oct 08 20:48:25 crc kubenswrapper[4669]: I1008 20:48:25.524384 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-558fm" Oct 08 20:48:25 crc kubenswrapper[4669]: I1008 20:48:25.698962 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-558fm"] Oct 08 20:48:25 crc kubenswrapper[4669]: I1008 20:48:25.801331 4669 generic.go:334] "Generic (PLEG): container finished" podID="bca52d11-f041-40bc-b352-1e820a410996" containerID="16e4a653a7479925b55043073a6f3a7236e75bf4b87c454aa7dc186be2b355ac" exitCode=0 Oct 08 20:48:25 crc kubenswrapper[4669]: I1008 20:48:25.801398 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4rszc" event={"ID":"bca52d11-f041-40bc-b352-1e820a410996","Type":"ContainerDied","Data":"16e4a653a7479925b55043073a6f3a7236e75bf4b87c454aa7dc186be2b355ac"} Oct 08 20:48:25 crc kubenswrapper[4669]: I1008 20:48:25.801422 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4rszc" event={"ID":"bca52d11-f041-40bc-b352-1e820a410996","Type":"ContainerStarted","Data":"ea6987c3f2049d62150eb345db35f774fdc468399b0da27ef2757be4ff5f1332"} Oct 08 20:48:25 crc kubenswrapper[4669]: I1008 20:48:25.810497 4669 generic.go:334] "Generic (PLEG): container finished" podID="250ce6ef-e5b8-4912-8037-85a0c520ff7a" containerID="d8c3622ff81f10bd44d4c8c39d0e3fa8fcfdfe271c5f429301059a130f13509b" exitCode=0 Oct 08 20:48:25 crc kubenswrapper[4669]: I1008 20:48:25.810671 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jxfn2" event={"ID":"250ce6ef-e5b8-4912-8037-85a0c520ff7a","Type":"ContainerDied","Data":"d8c3622ff81f10bd44d4c8c39d0e3fa8fcfdfe271c5f429301059a130f13509b"} Oct 08 20:48:25 crc kubenswrapper[4669]: I1008 20:48:25.824430 4669 generic.go:334] "Generic (PLEG): container finished" podID="e6ded9fb-15c9-44ee-a538-0b31da1b016a" containerID="252e65fa660e9abdf04757d86dde22deaa64e1607212963737828d334fa42801" exitCode=0 Oct 08 20:48:25 crc 
kubenswrapper[4669]: I1008 20:48:25.824567 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hns7h" event={"ID":"e6ded9fb-15c9-44ee-a538-0b31da1b016a","Type":"ContainerDied","Data":"252e65fa660e9abdf04757d86dde22deaa64e1607212963737828d334fa42801"} Oct 08 20:48:25 crc kubenswrapper[4669]: I1008 20:48:25.828129 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-558fm" event={"ID":"065b510c-2a5d-4110-80b9-69865b532686","Type":"ContainerStarted","Data":"b49a47c03f6b303eb143383c8331a1d3621e6b40db4b14347ccf96ddacaf5479"} Oct 08 20:48:26 crc kubenswrapper[4669]: I1008 20:48:26.835430 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hns7h" event={"ID":"e6ded9fb-15c9-44ee-a538-0b31da1b016a","Type":"ContainerStarted","Data":"5f2fd02b38c193d1fb330ef1b77e337cbff471e022b666d4f350edc1070b8ccf"} Oct 08 20:48:26 crc kubenswrapper[4669]: I1008 20:48:26.837337 4669 generic.go:334] "Generic (PLEG): container finished" podID="065b510c-2a5d-4110-80b9-69865b532686" containerID="92d938bc65a1e20914b5f48d4ef57dd041a42192fb19907014b46857566d723a" exitCode=0 Oct 08 20:48:26 crc kubenswrapper[4669]: I1008 20:48:26.837422 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-558fm" event={"ID":"065b510c-2a5d-4110-80b9-69865b532686","Type":"ContainerDied","Data":"92d938bc65a1e20914b5f48d4ef57dd041a42192fb19907014b46857566d723a"} Oct 08 20:48:26 crc kubenswrapper[4669]: I1008 20:48:26.839673 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4rszc" event={"ID":"bca52d11-f041-40bc-b352-1e820a410996","Type":"ContainerStarted","Data":"50c1f8db759deed45ae8568d12b1f019caa7584cb5edfbe6e777d108c6e87e67"} Oct 08 20:48:26 crc kubenswrapper[4669]: I1008 20:48:26.843137 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-jxfn2" event={"ID":"250ce6ef-e5b8-4912-8037-85a0c520ff7a","Type":"ContainerStarted","Data":"b0802499defccb4a052bd5358f5a297d3f31e826e999e5ea4a892b971e35e996"} Oct 08 20:48:26 crc kubenswrapper[4669]: I1008 20:48:26.857421 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hns7h" podStartSLOduration=2.373640371 podStartE2EDuration="4.85739756s" podCreationTimestamp="2025-10-08 20:48:22 +0000 UTC" firstStartedPulling="2025-10-08 20:48:23.783429225 +0000 UTC m=+223.476239938" lastFinishedPulling="2025-10-08 20:48:26.267186454 +0000 UTC m=+225.959997127" observedRunningTime="2025-10-08 20:48:26.856181206 +0000 UTC m=+226.548991889" watchObservedRunningTime="2025-10-08 20:48:26.85739756 +0000 UTC m=+226.550208233" Oct 08 20:48:26 crc kubenswrapper[4669]: I1008 20:48:26.870571 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jxfn2" podStartSLOduration=2.3264853580000002 podStartE2EDuration="4.870553969s" podCreationTimestamp="2025-10-08 20:48:22 +0000 UTC" firstStartedPulling="2025-10-08 20:48:23.779616281 +0000 UTC m=+223.472426994" lastFinishedPulling="2025-10-08 20:48:26.323684932 +0000 UTC m=+226.016495605" observedRunningTime="2025-10-08 20:48:26.870046976 +0000 UTC m=+226.562857669" watchObservedRunningTime="2025-10-08 20:48:26.870553969 +0000 UTC m=+226.563364642" Oct 08 20:48:27 crc kubenswrapper[4669]: I1008 20:48:27.850461 4669 generic.go:334] "Generic (PLEG): container finished" podID="bca52d11-f041-40bc-b352-1e820a410996" containerID="50c1f8db759deed45ae8568d12b1f019caa7584cb5edfbe6e777d108c6e87e67" exitCode=0 Oct 08 20:48:27 crc kubenswrapper[4669]: I1008 20:48:27.850542 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4rszc" 
event={"ID":"bca52d11-f041-40bc-b352-1e820a410996","Type":"ContainerDied","Data":"50c1f8db759deed45ae8568d12b1f019caa7584cb5edfbe6e777d108c6e87e67"} Oct 08 20:48:28 crc kubenswrapper[4669]: I1008 20:48:28.863328 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-558fm" event={"ID":"065b510c-2a5d-4110-80b9-69865b532686","Type":"ContainerStarted","Data":"f3f651a05b066307fbd5ac2a6f0d1720fc92c82f8bc527c150bd34c8598c5a0a"} Oct 08 20:48:28 crc kubenswrapper[4669]: I1008 20:48:28.866180 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4rszc" event={"ID":"bca52d11-f041-40bc-b352-1e820a410996","Type":"ContainerStarted","Data":"85355b4113fd6599e5daf391cfea20a1415b1a782db91b491767dfed9bf41971"} Oct 08 20:48:28 crc kubenswrapper[4669]: I1008 20:48:28.915331 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4rszc" podStartSLOduration=2.319801485 podStartE2EDuration="4.915313767s" podCreationTimestamp="2025-10-08 20:48:24 +0000 UTC" firstStartedPulling="2025-10-08 20:48:25.803238899 +0000 UTC m=+225.496049572" lastFinishedPulling="2025-10-08 20:48:28.398751141 +0000 UTC m=+228.091561854" observedRunningTime="2025-10-08 20:48:28.912187692 +0000 UTC m=+228.604998365" watchObservedRunningTime="2025-10-08 20:48:28.915313767 +0000 UTC m=+228.608124440" Oct 08 20:48:29 crc kubenswrapper[4669]: I1008 20:48:29.875144 4669 generic.go:334] "Generic (PLEG): container finished" podID="065b510c-2a5d-4110-80b9-69865b532686" containerID="f3f651a05b066307fbd5ac2a6f0d1720fc92c82f8bc527c150bd34c8598c5a0a" exitCode=0 Oct 08 20:48:29 crc kubenswrapper[4669]: I1008 20:48:29.875226 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-558fm" event={"ID":"065b510c-2a5d-4110-80b9-69865b532686","Type":"ContainerDied","Data":"f3f651a05b066307fbd5ac2a6f0d1720fc92c82f8bc527c150bd34c8598c5a0a"} 
Oct 08 20:48:30 crc kubenswrapper[4669]: I1008 20:48:30.883194 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-558fm" event={"ID":"065b510c-2a5d-4110-80b9-69865b532686","Type":"ContainerStarted","Data":"9a5f18688b1265b7466122fd5b36d7cd61bd9763e6e03506b39abe11b189641e"}
Oct 08 20:48:30 crc kubenswrapper[4669]: I1008 20:48:30.908334 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-558fm" podStartSLOduration=2.43097373 podStartE2EDuration="5.908305187s" podCreationTimestamp="2025-10-08 20:48:25 +0000 UTC" firstStartedPulling="2025-10-08 20:48:26.838467111 +0000 UTC m=+226.531277784" lastFinishedPulling="2025-10-08 20:48:30.315798568 +0000 UTC m=+230.008609241" observedRunningTime="2025-10-08 20:48:30.906047945 +0000 UTC m=+230.598858608" watchObservedRunningTime="2025-10-08 20:48:30.908305187 +0000 UTC m=+230.601115850"
Oct 08 20:48:31 crc kubenswrapper[4669]: I1008 20:48:31.181917 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-t9x9z"
Oct 08 20:48:31 crc kubenswrapper[4669]: I1008 20:48:31.224111 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mm28n"]
Oct 08 20:48:32 crc kubenswrapper[4669]: I1008 20:48:32.935979 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jxfn2"
Oct 08 20:48:32 crc kubenswrapper[4669]: I1008 20:48:32.936273 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jxfn2"
Oct 08 20:48:32 crc kubenswrapper[4669]: I1008 20:48:32.995879 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jxfn2"
Oct 08 20:48:33 crc kubenswrapper[4669]: I1008 20:48:33.156687 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hns7h"
Oct 08 20:48:33 crc kubenswrapper[4669]: I1008 20:48:33.157133 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hns7h"
Oct 08 20:48:33 crc kubenswrapper[4669]: I1008 20:48:33.199184 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hns7h"
Oct 08 20:48:33 crc kubenswrapper[4669]: I1008 20:48:33.934986 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hns7h"
Oct 08 20:48:33 crc kubenswrapper[4669]: I1008 20:48:33.938787 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jxfn2"
Oct 08 20:48:35 crc kubenswrapper[4669]: I1008 20:48:35.319179 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4rszc"
Oct 08 20:48:35 crc kubenswrapper[4669]: I1008 20:48:35.319250 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4rszc"
Oct 08 20:48:35 crc kubenswrapper[4669]: I1008 20:48:35.359783 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4rszc"
Oct 08 20:48:35 crc kubenswrapper[4669]: I1008 20:48:35.525090 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-558fm"
Oct 08 20:48:35 crc kubenswrapper[4669]: I1008 20:48:35.525492 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-558fm"
Oct 08 20:48:35 crc kubenswrapper[4669]: I1008 20:48:35.561941 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-558fm"
Oct 08 20:48:35 crc kubenswrapper[4669]: I1008 20:48:35.955232 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4rszc"
Oct 08 20:48:35 crc kubenswrapper[4669]: I1008 20:48:35.964881 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-558fm"
Oct 08 20:48:45 crc kubenswrapper[4669]: I1008 20:48:45.553345 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw" podUID="5289f930-ba47-4745-8ab7-784863dc110e" containerName="oauth-openshift" containerID="cri-o://b51f0a2c76400e6054e82a7c4a383c2e68f798c0a23adf1557b9d6205b559484" gracePeriod=15
Oct 08 20:48:45 crc kubenswrapper[4669]: I1008 20:48:45.939017 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw"
Oct 08 20:48:45 crc kubenswrapper[4669]: I1008 20:48:45.969231 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6c8d5d4f46-kbj8m"]
Oct 08 20:48:45 crc kubenswrapper[4669]: E1008 20:48:45.969466 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5289f930-ba47-4745-8ab7-784863dc110e" containerName="oauth-openshift"
Oct 08 20:48:45 crc kubenswrapper[4669]: I1008 20:48:45.969478 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="5289f930-ba47-4745-8ab7-784863dc110e" containerName="oauth-openshift"
Oct 08 20:48:45 crc kubenswrapper[4669]: I1008 20:48:45.969636 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="5289f930-ba47-4745-8ab7-784863dc110e" containerName="oauth-openshift"
Oct 08 20:48:45 crc kubenswrapper[4669]: I1008 20:48:45.970076 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6c8d5d4f46-kbj8m"
Oct 08 20:48:45 crc kubenswrapper[4669]: I1008 20:48:45.981108 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6c8d5d4f46-kbj8m"]
Oct 08 20:48:45 crc kubenswrapper[4669]: I1008 20:48:45.985211 4669 generic.go:334] "Generic (PLEG): container finished" podID="5289f930-ba47-4745-8ab7-784863dc110e" containerID="b51f0a2c76400e6054e82a7c4a383c2e68f798c0a23adf1557b9d6205b559484" exitCode=0
Oct 08 20:48:45 crc kubenswrapper[4669]: I1008 20:48:45.985271 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw" event={"ID":"5289f930-ba47-4745-8ab7-784863dc110e","Type":"ContainerDied","Data":"b51f0a2c76400e6054e82a7c4a383c2e68f798c0a23adf1557b9d6205b559484"}
Oct 08 20:48:45 crc kubenswrapper[4669]: I1008 20:48:45.985305 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw" event={"ID":"5289f930-ba47-4745-8ab7-784863dc110e","Type":"ContainerDied","Data":"1b13e590978d685b4a57d0fa51003545e7f72a614240ecc6ba07cee91c4d21c7"}
Oct 08 20:48:45 crc kubenswrapper[4669]: I1008 20:48:45.985325 4669 scope.go:117] "RemoveContainer" containerID="b51f0a2c76400e6054e82a7c4a383c2e68f798c0a23adf1557b9d6205b559484"
Oct 08 20:48:45 crc kubenswrapper[4669]: I1008 20:48:45.985480 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-6l6tw"
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.010758 4669 scope.go:117] "RemoveContainer" containerID="b51f0a2c76400e6054e82a7c4a383c2e68f798c0a23adf1557b9d6205b559484"
Oct 08 20:48:46 crc kubenswrapper[4669]: E1008 20:48:46.012055 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b51f0a2c76400e6054e82a7c4a383c2e68f798c0a23adf1557b9d6205b559484\": container with ID starting with b51f0a2c76400e6054e82a7c4a383c2e68f798c0a23adf1557b9d6205b559484 not found: ID does not exist" containerID="b51f0a2c76400e6054e82a7c4a383c2e68f798c0a23adf1557b9d6205b559484"
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.012133 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b51f0a2c76400e6054e82a7c4a383c2e68f798c0a23adf1557b9d6205b559484"} err="failed to get container status \"b51f0a2c76400e6054e82a7c4a383c2e68f798c0a23adf1557b9d6205b559484\": rpc error: code = NotFound desc = could not find container \"b51f0a2c76400e6054e82a7c4a383c2e68f798c0a23adf1557b9d6205b559484\": container with ID starting with b51f0a2c76400e6054e82a7c4a383c2e68f798c0a23adf1557b9d6205b559484 not found: ID does not exist"
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.140958 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-user-template-login\") pod \"5289f930-ba47-4745-8ab7-784863dc110e\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") "
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.141017 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-system-ocp-branding-template\") pod \"5289f930-ba47-4745-8ab7-784863dc110e\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") "
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.141045 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-system-service-ca\") pod \"5289f930-ba47-4745-8ab7-784863dc110e\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") "
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.141071 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-system-session\") pod \"5289f930-ba47-4745-8ab7-784863dc110e\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") "
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.141102 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhkjw\" (UniqueName: \"kubernetes.io/projected/5289f930-ba47-4745-8ab7-784863dc110e-kube-api-access-bhkjw\") pod \"5289f930-ba47-4745-8ab7-784863dc110e\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") "
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.141135 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-user-idp-0-file-data\") pod \"5289f930-ba47-4745-8ab7-784863dc110e\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") "
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.141169 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-system-router-certs\") pod \"5289f930-ba47-4745-8ab7-784863dc110e\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") "
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.141198 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-system-cliconfig\") pod \"5289f930-ba47-4745-8ab7-784863dc110e\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") "
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.141217 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-user-template-error\") pod \"5289f930-ba47-4745-8ab7-784863dc110e\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") "
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.141263 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-user-template-provider-selection\") pod \"5289f930-ba47-4745-8ab7-784863dc110e\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") "
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.141294 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5289f930-ba47-4745-8ab7-784863dc110e-audit-policies\") pod \"5289f930-ba47-4745-8ab7-784863dc110e\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") "
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.141348 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-system-trusted-ca-bundle\") pod \"5289f930-ba47-4745-8ab7-784863dc110e\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") "
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.141377 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-system-serving-cert\") pod \"5289f930-ba47-4745-8ab7-784863dc110e\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") "
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.141407 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5289f930-ba47-4745-8ab7-784863dc110e-audit-dir\") pod \"5289f930-ba47-4745-8ab7-784863dc110e\" (UID: \"5289f930-ba47-4745-8ab7-784863dc110e\") "
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.141596 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f5c2b838-e82f-4c08-bd56-2d2b7a17143f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6c8d5d4f46-kbj8m\" (UID: \"f5c2b838-e82f-4c08-bd56-2d2b7a17143f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-kbj8m"
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.141633 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfqdf\" (UniqueName: \"kubernetes.io/projected/f5c2b838-e82f-4c08-bd56-2d2b7a17143f-kube-api-access-bfqdf\") pod \"oauth-openshift-6c8d5d4f46-kbj8m\" (UID: \"f5c2b838-e82f-4c08-bd56-2d2b7a17143f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-kbj8m"
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.141658 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f5c2b838-e82f-4c08-bd56-2d2b7a17143f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6c8d5d4f46-kbj8m\" (UID: \"f5c2b838-e82f-4c08-bd56-2d2b7a17143f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-kbj8m"
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.141687 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f5c2b838-e82f-4c08-bd56-2d2b7a17143f-audit-policies\") pod \"oauth-openshift-6c8d5d4f46-kbj8m\" (UID: \"f5c2b838-e82f-4c08-bd56-2d2b7a17143f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-kbj8m"
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.141712 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f5c2b838-e82f-4c08-bd56-2d2b7a17143f-v4-0-config-user-template-login\") pod \"oauth-openshift-6c8d5d4f46-kbj8m\" (UID: \"f5c2b838-e82f-4c08-bd56-2d2b7a17143f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-kbj8m"
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.141741 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f5c2b838-e82f-4c08-bd56-2d2b7a17143f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6c8d5d4f46-kbj8m\" (UID: \"f5c2b838-e82f-4c08-bd56-2d2b7a17143f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-kbj8m"
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.141770 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f5c2b838-e82f-4c08-bd56-2d2b7a17143f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6c8d5d4f46-kbj8m\" (UID: \"f5c2b838-e82f-4c08-bd56-2d2b7a17143f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-kbj8m"
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.141799 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f5c2b838-e82f-4c08-bd56-2d2b7a17143f-audit-dir\") pod \"oauth-openshift-6c8d5d4f46-kbj8m\" (UID: \"f5c2b838-e82f-4c08-bd56-2d2b7a17143f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-kbj8m"
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.141826 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5c2b838-e82f-4c08-bd56-2d2b7a17143f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6c8d5d4f46-kbj8m\" (UID: \"f5c2b838-e82f-4c08-bd56-2d2b7a17143f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-kbj8m"
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.141870 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f5c2b838-e82f-4c08-bd56-2d2b7a17143f-v4-0-config-system-session\") pod \"oauth-openshift-6c8d5d4f46-kbj8m\" (UID: \"f5c2b838-e82f-4c08-bd56-2d2b7a17143f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-kbj8m"
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.141908 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f5c2b838-e82f-4c08-bd56-2d2b7a17143f-v4-0-config-user-template-error\") pod \"oauth-openshift-6c8d5d4f46-kbj8m\" (UID: \"f5c2b838-e82f-4c08-bd56-2d2b7a17143f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-kbj8m"
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.141936 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f5c2b838-e82f-4c08-bd56-2d2b7a17143f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6c8d5d4f46-kbj8m\" (UID: \"f5c2b838-e82f-4c08-bd56-2d2b7a17143f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-kbj8m"
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.141960 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f5c2b838-e82f-4c08-bd56-2d2b7a17143f-v4-0-config-system-router-certs\") pod \"oauth-openshift-6c8d5d4f46-kbj8m\" (UID: \"f5c2b838-e82f-4c08-bd56-2d2b7a17143f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-kbj8m"
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.141992 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f5c2b838-e82f-4c08-bd56-2d2b7a17143f-v4-0-config-system-service-ca\") pod \"oauth-openshift-6c8d5d4f46-kbj8m\" (UID: \"f5c2b838-e82f-4c08-bd56-2d2b7a17143f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-kbj8m"
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.143970 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5289f930-ba47-4745-8ab7-784863dc110e-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "5289f930-ba47-4745-8ab7-784863dc110e" (UID: "5289f930-ba47-4745-8ab7-784863dc110e"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.144615 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "5289f930-ba47-4745-8ab7-784863dc110e" (UID: "5289f930-ba47-4745-8ab7-784863dc110e"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.144596 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "5289f930-ba47-4745-8ab7-784863dc110e" (UID: "5289f930-ba47-4745-8ab7-784863dc110e"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.145272 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "5289f930-ba47-4745-8ab7-784863dc110e" (UID: "5289f930-ba47-4745-8ab7-784863dc110e"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.145334 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5289f930-ba47-4745-8ab7-784863dc110e-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "5289f930-ba47-4745-8ab7-784863dc110e" (UID: "5289f930-ba47-4745-8ab7-784863dc110e"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.152146 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5289f930-ba47-4745-8ab7-784863dc110e-kube-api-access-bhkjw" (OuterVolumeSpecName: "kube-api-access-bhkjw") pod "5289f930-ba47-4745-8ab7-784863dc110e" (UID: "5289f930-ba47-4745-8ab7-784863dc110e"). InnerVolumeSpecName "kube-api-access-bhkjw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.152467 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "5289f930-ba47-4745-8ab7-784863dc110e" (UID: "5289f930-ba47-4745-8ab7-784863dc110e"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.152620 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "5289f930-ba47-4745-8ab7-784863dc110e" (UID: "5289f930-ba47-4745-8ab7-784863dc110e"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.152954 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "5289f930-ba47-4745-8ab7-784863dc110e" (UID: "5289f930-ba47-4745-8ab7-784863dc110e"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.154447 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "5289f930-ba47-4745-8ab7-784863dc110e" (UID: "5289f930-ba47-4745-8ab7-784863dc110e"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.154866 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "5289f930-ba47-4745-8ab7-784863dc110e" (UID: "5289f930-ba47-4745-8ab7-784863dc110e"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.155101 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "5289f930-ba47-4745-8ab7-784863dc110e" (UID: "5289f930-ba47-4745-8ab7-784863dc110e"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.159107 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "5289f930-ba47-4745-8ab7-784863dc110e" (UID: "5289f930-ba47-4745-8ab7-784863dc110e"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.163028 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "5289f930-ba47-4745-8ab7-784863dc110e" (UID: "5289f930-ba47-4745-8ab7-784863dc110e"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.242826 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfqdf\" (UniqueName: \"kubernetes.io/projected/f5c2b838-e82f-4c08-bd56-2d2b7a17143f-kube-api-access-bfqdf\") pod \"oauth-openshift-6c8d5d4f46-kbj8m\" (UID: \"f5c2b838-e82f-4c08-bd56-2d2b7a17143f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-kbj8m"
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.242877 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f5c2b838-e82f-4c08-bd56-2d2b7a17143f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6c8d5d4f46-kbj8m\" (UID: \"f5c2b838-e82f-4c08-bd56-2d2b7a17143f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-kbj8m"
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.242907 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f5c2b838-e82f-4c08-bd56-2d2b7a17143f-audit-policies\") pod \"oauth-openshift-6c8d5d4f46-kbj8m\" (UID: \"f5c2b838-e82f-4c08-bd56-2d2b7a17143f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-kbj8m"
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.242923 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f5c2b838-e82f-4c08-bd56-2d2b7a17143f-v4-0-config-user-template-login\") pod \"oauth-openshift-6c8d5d4f46-kbj8m\" (UID: \"f5c2b838-e82f-4c08-bd56-2d2b7a17143f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-kbj8m"
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.242945 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f5c2b838-e82f-4c08-bd56-2d2b7a17143f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6c8d5d4f46-kbj8m\" (UID: \"f5c2b838-e82f-4c08-bd56-2d2b7a17143f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-kbj8m"
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.242963 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f5c2b838-e82f-4c08-bd56-2d2b7a17143f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6c8d5d4f46-kbj8m\" (UID: \"f5c2b838-e82f-4c08-bd56-2d2b7a17143f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-kbj8m"
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.242984 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f5c2b838-e82f-4c08-bd56-2d2b7a17143f-audit-dir\") pod \"oauth-openshift-6c8d5d4f46-kbj8m\" (UID: \"f5c2b838-e82f-4c08-bd56-2d2b7a17143f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-kbj8m"
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.243003 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5c2b838-e82f-4c08-bd56-2d2b7a17143f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6c8d5d4f46-kbj8m\" (UID: \"f5c2b838-e82f-4c08-bd56-2d2b7a17143f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-kbj8m"
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.243031 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f5c2b838-e82f-4c08-bd56-2d2b7a17143f-v4-0-config-system-session\") pod \"oauth-openshift-6c8d5d4f46-kbj8m\" (UID: \"f5c2b838-e82f-4c08-bd56-2d2b7a17143f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-kbj8m"
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.243058 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f5c2b838-e82f-4c08-bd56-2d2b7a17143f-v4-0-config-user-template-error\") pod \"oauth-openshift-6c8d5d4f46-kbj8m\" (UID: \"f5c2b838-e82f-4c08-bd56-2d2b7a17143f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-kbj8m"
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.243077 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f5c2b838-e82f-4c08-bd56-2d2b7a17143f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6c8d5d4f46-kbj8m\" (UID: \"f5c2b838-e82f-4c08-bd56-2d2b7a17143f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-kbj8m"
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.243095 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f5c2b838-e82f-4c08-bd56-2d2b7a17143f-v4-0-config-system-router-certs\") pod \"oauth-openshift-6c8d5d4f46-kbj8m\" (UID: \"f5c2b838-e82f-4c08-bd56-2d2b7a17143f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-kbj8m"
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.243117 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f5c2b838-e82f-4c08-bd56-2d2b7a17143f-v4-0-config-system-service-ca\") pod \"oauth-openshift-6c8d5d4f46-kbj8m\" (UID: \"f5c2b838-e82f-4c08-bd56-2d2b7a17143f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-kbj8m"
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.243138 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f5c2b838-e82f-4c08-bd56-2d2b7a17143f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6c8d5d4f46-kbj8m\" (UID: \"f5c2b838-e82f-4c08-bd56-2d2b7a17143f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-kbj8m"
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.243178 4669 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.243188 4669 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5289f930-ba47-4745-8ab7-784863dc110e-audit-policies\") on node \"crc\" DevicePath \"\""
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.243200 4669 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.243209 4669 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.243220 4669 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5289f930-ba47-4745-8ab7-784863dc110e-audit-dir\") on node \"crc\" DevicePath \"\""
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.243229 4669 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.243238 4669 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.243249 4669 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.243260 4669 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.243271 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhkjw\" (UniqueName: \"kubernetes.io/projected/5289f930-ba47-4745-8ab7-784863dc110e-kube-api-access-bhkjw\") on node \"crc\" DevicePath \"\""
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.243282 4669 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.243295 4669 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.243307 4669 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.243315 4669 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5289f930-ba47-4745-8ab7-784863dc110e-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.244272 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f5c2b838-e82f-4c08-bd56-2d2b7a17143f-audit-dir\") pod \"oauth-openshift-6c8d5d4f46-kbj8m\" (UID: \"f5c2b838-e82f-4c08-bd56-2d2b7a17143f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-kbj8m"
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.245035 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f5c2b838-e82f-4c08-bd56-2d2b7a17143f-audit-policies\") pod \"oauth-openshift-6c8d5d4f46-kbj8m\" (UID: \"f5c2b838-e82f-4c08-bd56-2d2b7a17143f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-kbj8m"
Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.246122 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f5c2b838-e82f-4c08-bd56-2d2b7a17143f-v4-0-config-system-service-ca\") pod
\"oauth-openshift-6c8d5d4f46-kbj8m\" (UID: \"f5c2b838-e82f-4c08-bd56-2d2b7a17143f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-kbj8m" Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.246140 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f5c2b838-e82f-4c08-bd56-2d2b7a17143f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6c8d5d4f46-kbj8m\" (UID: \"f5c2b838-e82f-4c08-bd56-2d2b7a17143f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-kbj8m" Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.246691 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5c2b838-e82f-4c08-bd56-2d2b7a17143f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6c8d5d4f46-kbj8m\" (UID: \"f5c2b838-e82f-4c08-bd56-2d2b7a17143f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-kbj8m" Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.248189 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f5c2b838-e82f-4c08-bd56-2d2b7a17143f-v4-0-config-user-template-login\") pod \"oauth-openshift-6c8d5d4f46-kbj8m\" (UID: \"f5c2b838-e82f-4c08-bd56-2d2b7a17143f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-kbj8m" Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.248388 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f5c2b838-e82f-4c08-bd56-2d2b7a17143f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6c8d5d4f46-kbj8m\" (UID: \"f5c2b838-e82f-4c08-bd56-2d2b7a17143f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-kbj8m" Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.248781 4669 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f5c2b838-e82f-4c08-bd56-2d2b7a17143f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6c8d5d4f46-kbj8m\" (UID: \"f5c2b838-e82f-4c08-bd56-2d2b7a17143f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-kbj8m" Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.251376 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f5c2b838-e82f-4c08-bd56-2d2b7a17143f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6c8d5d4f46-kbj8m\" (UID: \"f5c2b838-e82f-4c08-bd56-2d2b7a17143f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-kbj8m" Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.252276 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f5c2b838-e82f-4c08-bd56-2d2b7a17143f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6c8d5d4f46-kbj8m\" (UID: \"f5c2b838-e82f-4c08-bd56-2d2b7a17143f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-kbj8m" Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.254248 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f5c2b838-e82f-4c08-bd56-2d2b7a17143f-v4-0-config-user-template-error\") pod \"oauth-openshift-6c8d5d4f46-kbj8m\" (UID: \"f5c2b838-e82f-4c08-bd56-2d2b7a17143f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-kbj8m" Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.254658 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f5c2b838-e82f-4c08-bd56-2d2b7a17143f-v4-0-config-system-router-certs\") pod 
\"oauth-openshift-6c8d5d4f46-kbj8m\" (UID: \"f5c2b838-e82f-4c08-bd56-2d2b7a17143f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-kbj8m" Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.254897 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f5c2b838-e82f-4c08-bd56-2d2b7a17143f-v4-0-config-system-session\") pod \"oauth-openshift-6c8d5d4f46-kbj8m\" (UID: \"f5c2b838-e82f-4c08-bd56-2d2b7a17143f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-kbj8m" Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.264434 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfqdf\" (UniqueName: \"kubernetes.io/projected/f5c2b838-e82f-4c08-bd56-2d2b7a17143f-kube-api-access-bfqdf\") pod \"oauth-openshift-6c8d5d4f46-kbj8m\" (UID: \"f5c2b838-e82f-4c08-bd56-2d2b7a17143f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-kbj8m" Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.304759 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6c8d5d4f46-kbj8m" Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.311954 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6l6tw"] Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.320128 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6l6tw"] Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.515635 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6c8d5d4f46-kbj8m"] Oct 08 20:48:46 crc kubenswrapper[4669]: W1008 20:48:46.520438 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5c2b838_e82f_4c08_bd56_2d2b7a17143f.slice/crio-b1a0639d5ee5fcfce0c7f1f5d6df1badb876a8ace3e35c5d9745b9163c29602b WatchSource:0}: Error finding container b1a0639d5ee5fcfce0c7f1f5d6df1badb876a8ace3e35c5d9745b9163c29602b: Status 404 returned error can't find the container with id b1a0639d5ee5fcfce0c7f1f5d6df1badb876a8ace3e35c5d9745b9163c29602b Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.993059 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6c8d5d4f46-kbj8m" event={"ID":"f5c2b838-e82f-4c08-bd56-2d2b7a17143f","Type":"ContainerStarted","Data":"d80562247abc65eb29c19ca56d36bc4856afa4bf66ca286b4be5e7aa5981f6ce"} Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.993435 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6c8d5d4f46-kbj8m" Oct 08 20:48:46 crc kubenswrapper[4669]: I1008 20:48:46.993448 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6c8d5d4f46-kbj8m" 
event={"ID":"f5c2b838-e82f-4c08-bd56-2d2b7a17143f","Type":"ContainerStarted","Data":"b1a0639d5ee5fcfce0c7f1f5d6df1badb876a8ace3e35c5d9745b9163c29602b"} Oct 08 20:48:47 crc kubenswrapper[4669]: I1008 20:48:47.011737 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6c8d5d4f46-kbj8m" podStartSLOduration=27.01172143 podStartE2EDuration="27.01172143s" podCreationTimestamp="2025-10-08 20:48:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:48:47.00919946 +0000 UTC m=+246.702010123" watchObservedRunningTime="2025-10-08 20:48:47.01172143 +0000 UTC m=+246.704532103" Oct 08 20:48:47 crc kubenswrapper[4669]: I1008 20:48:47.222634 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6c8d5d4f46-kbj8m" Oct 08 20:48:47 crc kubenswrapper[4669]: I1008 20:48:47.337626 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5289f930-ba47-4745-8ab7-784863dc110e" path="/var/lib/kubelet/pods/5289f930-ba47-4745-8ab7-784863dc110e/volumes" Oct 08 20:48:56 crc kubenswrapper[4669]: I1008 20:48:56.266149 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" podUID="64c20dfc-09aa-4096-b7c1-7233d0a18a17" containerName="registry" containerID="cri-o://edcbee96cdb7309ae439366725b32d7ccf74d2ac6eef89b7bf260c961d23cac7" gracePeriod=30 Oct 08 20:48:57 crc kubenswrapper[4669]: I1008 20:48:57.073583 4669 generic.go:334] "Generic (PLEG): container finished" podID="64c20dfc-09aa-4096-b7c1-7233d0a18a17" containerID="edcbee96cdb7309ae439366725b32d7ccf74d2ac6eef89b7bf260c961d23cac7" exitCode=0 Oct 08 20:48:57 crc kubenswrapper[4669]: I1008 20:48:57.073689 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" 
event={"ID":"64c20dfc-09aa-4096-b7c1-7233d0a18a17","Type":"ContainerDied","Data":"edcbee96cdb7309ae439366725b32d7ccf74d2ac6eef89b7bf260c961d23cac7"} Oct 08 20:48:57 crc kubenswrapper[4669]: I1008 20:48:57.131153 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:48:57 crc kubenswrapper[4669]: I1008 20:48:57.302847 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " Oct 08 20:48:57 crc kubenswrapper[4669]: I1008 20:48:57.305675 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/64c20dfc-09aa-4096-b7c1-7233d0a18a17-registry-certificates\") pod \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " Oct 08 20:48:57 crc kubenswrapper[4669]: I1008 20:48:57.305807 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/64c20dfc-09aa-4096-b7c1-7233d0a18a17-trusted-ca\") pod \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " Oct 08 20:48:57 crc kubenswrapper[4669]: I1008 20:48:57.305964 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/64c20dfc-09aa-4096-b7c1-7233d0a18a17-registry-tls\") pod \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " Oct 08 20:48:57 crc kubenswrapper[4669]: I1008 20:48:57.306053 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/64c20dfc-09aa-4096-b7c1-7233d0a18a17-bound-sa-token\") pod \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " Oct 08 20:48:57 crc kubenswrapper[4669]: I1008 20:48:57.306149 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkn2p\" (UniqueName: \"kubernetes.io/projected/64c20dfc-09aa-4096-b7c1-7233d0a18a17-kube-api-access-rkn2p\") pod \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " Oct 08 20:48:57 crc kubenswrapper[4669]: I1008 20:48:57.306305 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/64c20dfc-09aa-4096-b7c1-7233d0a18a17-ca-trust-extracted\") pod \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " Oct 08 20:48:57 crc kubenswrapper[4669]: I1008 20:48:57.306478 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/64c20dfc-09aa-4096-b7c1-7233d0a18a17-installation-pull-secrets\") pod \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\" (UID: \"64c20dfc-09aa-4096-b7c1-7233d0a18a17\") " Oct 08 20:48:57 crc kubenswrapper[4669]: I1008 20:48:57.308206 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64c20dfc-09aa-4096-b7c1-7233d0a18a17-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "64c20dfc-09aa-4096-b7c1-7233d0a18a17" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:48:57 crc kubenswrapper[4669]: I1008 20:48:57.308930 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64c20dfc-09aa-4096-b7c1-7233d0a18a17-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "64c20dfc-09aa-4096-b7c1-7233d0a18a17" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:48:57 crc kubenswrapper[4669]: I1008 20:48:57.313329 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64c20dfc-09aa-4096-b7c1-7233d0a18a17-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "64c20dfc-09aa-4096-b7c1-7233d0a18a17" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:48:57 crc kubenswrapper[4669]: I1008 20:48:57.313692 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64c20dfc-09aa-4096-b7c1-7233d0a18a17-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "64c20dfc-09aa-4096-b7c1-7233d0a18a17" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:48:57 crc kubenswrapper[4669]: I1008 20:48:57.316686 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64c20dfc-09aa-4096-b7c1-7233d0a18a17-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "64c20dfc-09aa-4096-b7c1-7233d0a18a17" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:48:57 crc kubenswrapper[4669]: I1008 20:48:57.317986 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "64c20dfc-09aa-4096-b7c1-7233d0a18a17" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 08 20:48:57 crc kubenswrapper[4669]: I1008 20:48:57.320583 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64c20dfc-09aa-4096-b7c1-7233d0a18a17-kube-api-access-rkn2p" (OuterVolumeSpecName: "kube-api-access-rkn2p") pod "64c20dfc-09aa-4096-b7c1-7233d0a18a17" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17"). InnerVolumeSpecName "kube-api-access-rkn2p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:48:57 crc kubenswrapper[4669]: I1008 20:48:57.335417 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64c20dfc-09aa-4096-b7c1-7233d0a18a17-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "64c20dfc-09aa-4096-b7c1-7233d0a18a17" (UID: "64c20dfc-09aa-4096-b7c1-7233d0a18a17"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 20:48:57 crc kubenswrapper[4669]: I1008 20:48:57.408349 4669 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/64c20dfc-09aa-4096-b7c1-7233d0a18a17-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 08 20:48:57 crc kubenswrapper[4669]: I1008 20:48:57.408391 4669 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/64c20dfc-09aa-4096-b7c1-7233d0a18a17-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 08 20:48:57 crc kubenswrapper[4669]: I1008 20:48:57.408406 4669 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/64c20dfc-09aa-4096-b7c1-7233d0a18a17-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 08 20:48:57 crc kubenswrapper[4669]: I1008 20:48:57.408418 4669 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/64c20dfc-09aa-4096-b7c1-7233d0a18a17-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 08 20:48:57 crc kubenswrapper[4669]: I1008 20:48:57.408429 4669 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/64c20dfc-09aa-4096-b7c1-7233d0a18a17-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 08 20:48:57 crc kubenswrapper[4669]: I1008 20:48:57.408440 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkn2p\" (UniqueName: \"kubernetes.io/projected/64c20dfc-09aa-4096-b7c1-7233d0a18a17-kube-api-access-rkn2p\") on node \"crc\" DevicePath \"\"" Oct 08 20:48:57 crc kubenswrapper[4669]: I1008 20:48:57.408452 4669 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/64c20dfc-09aa-4096-b7c1-7233d0a18a17-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 08 20:48:58 crc 
kubenswrapper[4669]: I1008 20:48:58.079791 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" event={"ID":"64c20dfc-09aa-4096-b7c1-7233d0a18a17","Type":"ContainerDied","Data":"2b3ccf8c73936ec642976dc5d9176921a56604297133d350444e27ea971c0c53"} Oct 08 20:48:58 crc kubenswrapper[4669]: I1008 20:48:58.079849 4669 scope.go:117] "RemoveContainer" containerID="edcbee96cdb7309ae439366725b32d7ccf74d2ac6eef89b7bf260c961d23cac7" Oct 08 20:48:58 crc kubenswrapper[4669]: I1008 20:48:58.079958 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mm28n" Oct 08 20:48:58 crc kubenswrapper[4669]: I1008 20:48:58.105008 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mm28n"] Oct 08 20:48:58 crc kubenswrapper[4669]: I1008 20:48:58.108022 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mm28n"] Oct 08 20:48:59 crc kubenswrapper[4669]: I1008 20:48:59.337669 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64c20dfc-09aa-4096-b7c1-7233d0a18a17" path="/var/lib/kubelet/pods/64c20dfc-09aa-4096-b7c1-7233d0a18a17/volumes" Oct 08 20:50:13 crc kubenswrapper[4669]: I1008 20:50:13.185459 4669 patch_prober.go:28] interesting pod/machine-config-daemon-hw2kf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 20:50:13 crc kubenswrapper[4669]: I1008 20:50:13.186069 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 20:50:43 crc kubenswrapper[4669]: I1008 20:50:43.185290 4669 patch_prober.go:28] interesting pod/machine-config-daemon-hw2kf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 20:50:43 crc kubenswrapper[4669]: I1008 20:50:43.185921 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 20:51:13 crc kubenswrapper[4669]: I1008 20:51:13.185575 4669 patch_prober.go:28] interesting pod/machine-config-daemon-hw2kf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 20:51:13 crc kubenswrapper[4669]: I1008 20:51:13.186270 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 20:51:13 crc kubenswrapper[4669]: I1008 20:51:13.186461 4669 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" Oct 08 20:51:13 crc kubenswrapper[4669]: I1008 20:51:13.187340 4669 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"24bba86ffc0208cf280d07da074223933749a4ce672ce4bf5741aa000c003ce6"} pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 20:51:13 crc kubenswrapper[4669]: I1008 20:51:13.187426 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" containerID="cri-o://24bba86ffc0208cf280d07da074223933749a4ce672ce4bf5741aa000c003ce6" gracePeriod=600 Oct 08 20:51:13 crc kubenswrapper[4669]: I1008 20:51:13.868174 4669 generic.go:334] "Generic (PLEG): container finished" podID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerID="24bba86ffc0208cf280d07da074223933749a4ce672ce4bf5741aa000c003ce6" exitCode=0 Oct 08 20:51:13 crc kubenswrapper[4669]: I1008 20:51:13.868330 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" event={"ID":"39c9bcf2-9580-4534-8c7e-886bd4aff469","Type":"ContainerDied","Data":"24bba86ffc0208cf280d07da074223933749a4ce672ce4bf5741aa000c003ce6"} Oct 08 20:51:13 crc kubenswrapper[4669]: I1008 20:51:13.869252 4669 scope.go:117] "RemoveContainer" containerID="9e1bd09b1fcc78173d03292522a284e68e59f374def13fd6830f24a31e1138c5" Oct 08 20:51:13 crc kubenswrapper[4669]: I1008 20:51:13.868859 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" event={"ID":"39c9bcf2-9580-4534-8c7e-886bd4aff469","Type":"ContainerStarted","Data":"c6d89efc3b8d912824669f2434ad38318f78ba91caa1db76769d7947e2583b0f"} Oct 08 20:53:13 crc kubenswrapper[4669]: I1008 20:53:13.189883 4669 patch_prober.go:28] interesting pod/machine-config-daemon-hw2kf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 20:53:13 crc kubenswrapper[4669]: I1008 20:53:13.190692 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 20:53:43 crc kubenswrapper[4669]: I1008 20:53:43.185557 4669 patch_prober.go:28] interesting pod/machine-config-daemon-hw2kf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 20:53:43 crc kubenswrapper[4669]: I1008 20:53:43.186358 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 20:54:13 crc kubenswrapper[4669]: I1008 20:54:13.185370 4669 patch_prober.go:28] interesting pod/machine-config-daemon-hw2kf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 20:54:13 crc kubenswrapper[4669]: I1008 20:54:13.187054 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" Oct 08 20:54:13 crc kubenswrapper[4669]: I1008 20:54:13.187187 4669 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" Oct 08 20:54:13 crc kubenswrapper[4669]: I1008 20:54:13.187900 4669 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c6d89efc3b8d912824669f2434ad38318f78ba91caa1db76769d7947e2583b0f"} pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 20:54:13 crc kubenswrapper[4669]: I1008 20:54:13.188041 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" containerID="cri-o://c6d89efc3b8d912824669f2434ad38318f78ba91caa1db76769d7947e2583b0f" gracePeriod=600 Oct 08 20:54:13 crc kubenswrapper[4669]: I1008 20:54:13.680672 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-7g9zf"] Oct 08 20:54:13 crc kubenswrapper[4669]: E1008 20:54:13.681374 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64c20dfc-09aa-4096-b7c1-7233d0a18a17" containerName="registry" Oct 08 20:54:13 crc kubenswrapper[4669]: I1008 20:54:13.681389 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="64c20dfc-09aa-4096-b7c1-7233d0a18a17" containerName="registry" Oct 08 20:54:13 crc kubenswrapper[4669]: I1008 20:54:13.681498 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="64c20dfc-09aa-4096-b7c1-7233d0a18a17" containerName="registry" Oct 08 20:54:13 crc kubenswrapper[4669]: I1008 20:54:13.681956 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-7g9zf" Oct 08 20:54:13 crc kubenswrapper[4669]: I1008 20:54:13.685892 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 08 20:54:13 crc kubenswrapper[4669]: I1008 20:54:13.686078 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 08 20:54:13 crc kubenswrapper[4669]: I1008 20:54:13.686251 4669 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-z6gf8" Oct 08 20:54:13 crc kubenswrapper[4669]: I1008 20:54:13.690051 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-7g9zf"] Oct 08 20:54:13 crc kubenswrapper[4669]: I1008 20:54:13.695803 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-btc2c"] Oct 08 20:54:13 crc kubenswrapper[4669]: I1008 20:54:13.696702 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-btc2c" Oct 08 20:54:13 crc kubenswrapper[4669]: I1008 20:54:13.703388 4669 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-7zzdn" Oct 08 20:54:13 crc kubenswrapper[4669]: I1008 20:54:13.707964 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-btc2c"] Oct 08 20:54:13 crc kubenswrapper[4669]: I1008 20:54:13.715510 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-9flwc"] Oct 08 20:54:13 crc kubenswrapper[4669]: I1008 20:54:13.716114 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-9flwc" Oct 08 20:54:13 crc kubenswrapper[4669]: I1008 20:54:13.718048 4669 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-k6qpg" Oct 08 20:54:13 crc kubenswrapper[4669]: I1008 20:54:13.734849 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-9flwc"] Oct 08 20:54:13 crc kubenswrapper[4669]: I1008 20:54:13.808023 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smks2\" (UniqueName: \"kubernetes.io/projected/7a8b2c5d-38a2-4866-a1e0-8df9b4659c15-kube-api-access-smks2\") pod \"cert-manager-5b446d88c5-btc2c\" (UID: \"7a8b2c5d-38a2-4866-a1e0-8df9b4659c15\") " pod="cert-manager/cert-manager-5b446d88c5-btc2c" Oct 08 20:54:13 crc kubenswrapper[4669]: I1008 20:54:13.808085 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzm5g\" (UniqueName: \"kubernetes.io/projected/c726284c-5ca2-4d63-b96e-56c1aa537986-kube-api-access-kzm5g\") pod \"cert-manager-cainjector-7f985d654d-7g9zf\" (UID: \"c726284c-5ca2-4d63-b96e-56c1aa537986\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-7g9zf" Oct 08 20:54:13 crc kubenswrapper[4669]: I1008 20:54:13.808122 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9fll\" (UniqueName: \"kubernetes.io/projected/314fd865-0086-4bd8-8a90-0c992557a6af-kube-api-access-s9fll\") pod \"cert-manager-webhook-5655c58dd6-9flwc\" (UID: \"314fd865-0086-4bd8-8a90-0c992557a6af\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-9flwc" Oct 08 20:54:13 crc kubenswrapper[4669]: I1008 20:54:13.909203 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzm5g\" (UniqueName: 
\"kubernetes.io/projected/c726284c-5ca2-4d63-b96e-56c1aa537986-kube-api-access-kzm5g\") pod \"cert-manager-cainjector-7f985d654d-7g9zf\" (UID: \"c726284c-5ca2-4d63-b96e-56c1aa537986\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-7g9zf" Oct 08 20:54:13 crc kubenswrapper[4669]: I1008 20:54:13.909266 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9fll\" (UniqueName: \"kubernetes.io/projected/314fd865-0086-4bd8-8a90-0c992557a6af-kube-api-access-s9fll\") pod \"cert-manager-webhook-5655c58dd6-9flwc\" (UID: \"314fd865-0086-4bd8-8a90-0c992557a6af\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-9flwc" Oct 08 20:54:13 crc kubenswrapper[4669]: I1008 20:54:13.909334 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smks2\" (UniqueName: \"kubernetes.io/projected/7a8b2c5d-38a2-4866-a1e0-8df9b4659c15-kube-api-access-smks2\") pod \"cert-manager-5b446d88c5-btc2c\" (UID: \"7a8b2c5d-38a2-4866-a1e0-8df9b4659c15\") " pod="cert-manager/cert-manager-5b446d88c5-btc2c" Oct 08 20:54:13 crc kubenswrapper[4669]: I1008 20:54:13.927491 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzm5g\" (UniqueName: \"kubernetes.io/projected/c726284c-5ca2-4d63-b96e-56c1aa537986-kube-api-access-kzm5g\") pod \"cert-manager-cainjector-7f985d654d-7g9zf\" (UID: \"c726284c-5ca2-4d63-b96e-56c1aa537986\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-7g9zf" Oct 08 20:54:13 crc kubenswrapper[4669]: I1008 20:54:13.928220 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smks2\" (UniqueName: \"kubernetes.io/projected/7a8b2c5d-38a2-4866-a1e0-8df9b4659c15-kube-api-access-smks2\") pod \"cert-manager-5b446d88c5-btc2c\" (UID: \"7a8b2c5d-38a2-4866-a1e0-8df9b4659c15\") " pod="cert-manager/cert-manager-5b446d88c5-btc2c" Oct 08 20:54:13 crc kubenswrapper[4669]: I1008 20:54:13.931474 4669 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-s9fll\" (UniqueName: \"kubernetes.io/projected/314fd865-0086-4bd8-8a90-0c992557a6af-kube-api-access-s9fll\") pod \"cert-manager-webhook-5655c58dd6-9flwc\" (UID: \"314fd865-0086-4bd8-8a90-0c992557a6af\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-9flwc" Oct 08 20:54:13 crc kubenswrapper[4669]: I1008 20:54:13.995943 4669 generic.go:334] "Generic (PLEG): container finished" podID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerID="c6d89efc3b8d912824669f2434ad38318f78ba91caa1db76769d7947e2583b0f" exitCode=0 Oct 08 20:54:13 crc kubenswrapper[4669]: I1008 20:54:13.995982 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" event={"ID":"39c9bcf2-9580-4534-8c7e-886bd4aff469","Type":"ContainerDied","Data":"c6d89efc3b8d912824669f2434ad38318f78ba91caa1db76769d7947e2583b0f"} Oct 08 20:54:13 crc kubenswrapper[4669]: I1008 20:54:13.996006 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" event={"ID":"39c9bcf2-9580-4534-8c7e-886bd4aff469","Type":"ContainerStarted","Data":"4988e2f99ae9422660aeb112dbeb7f72ef85e0a64c0c7a60db05121da7a422d0"} Oct 08 20:54:13 crc kubenswrapper[4669]: I1008 20:54:13.996021 4669 scope.go:117] "RemoveContainer" containerID="24bba86ffc0208cf280d07da074223933749a4ce672ce4bf5741aa000c003ce6" Oct 08 20:54:13 crc kubenswrapper[4669]: I1008 20:54:13.999727 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-7g9zf" Oct 08 20:54:14 crc kubenswrapper[4669]: I1008 20:54:14.013572 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-btc2c" Oct 08 20:54:14 crc kubenswrapper[4669]: I1008 20:54:14.030028 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-9flwc" Oct 08 20:54:14 crc kubenswrapper[4669]: I1008 20:54:14.260345 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-btc2c"] Oct 08 20:54:14 crc kubenswrapper[4669]: I1008 20:54:14.275570 4669 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 20:54:14 crc kubenswrapper[4669]: I1008 20:54:14.293751 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-9flwc"] Oct 08 20:54:14 crc kubenswrapper[4669]: W1008 20:54:14.300741 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod314fd865_0086_4bd8_8a90_0c992557a6af.slice/crio-83b90f8cf73a814913a6637d44a332a6b606a8e8f2f91e401aaf084633838f02 WatchSource:0}: Error finding container 83b90f8cf73a814913a6637d44a332a6b606a8e8f2f91e401aaf084633838f02: Status 404 returned error can't find the container with id 83b90f8cf73a814913a6637d44a332a6b606a8e8f2f91e401aaf084633838f02 Oct 08 20:54:14 crc kubenswrapper[4669]: I1008 20:54:14.412592 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-7g9zf"] Oct 08 20:54:14 crc kubenswrapper[4669]: W1008 20:54:14.416588 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc726284c_5ca2_4d63_b96e_56c1aa537986.slice/crio-861c1ae56a566cc02b3ca2b3b6f45008bf5ec89a0a0517c5b5e313b709def209 WatchSource:0}: Error finding container 861c1ae56a566cc02b3ca2b3b6f45008bf5ec89a0a0517c5b5e313b709def209: Status 404 returned error can't find the container with id 861c1ae56a566cc02b3ca2b3b6f45008bf5ec89a0a0517c5b5e313b709def209 Oct 08 20:54:15 crc kubenswrapper[4669]: I1008 20:54:15.009727 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="cert-manager/cert-manager-webhook-5655c58dd6-9flwc" event={"ID":"314fd865-0086-4bd8-8a90-0c992557a6af","Type":"ContainerStarted","Data":"83b90f8cf73a814913a6637d44a332a6b606a8e8f2f91e401aaf084633838f02"} Oct 08 20:54:15 crc kubenswrapper[4669]: I1008 20:54:15.010937 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-btc2c" event={"ID":"7a8b2c5d-38a2-4866-a1e0-8df9b4659c15","Type":"ContainerStarted","Data":"56eef601bffe0f48ab77ef4a3864bc2da2ac9c5268b002fbfa623e6400a585c6"} Oct 08 20:54:15 crc kubenswrapper[4669]: I1008 20:54:15.012306 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-7g9zf" event={"ID":"c726284c-5ca2-4d63-b96e-56c1aa537986","Type":"ContainerStarted","Data":"861c1ae56a566cc02b3ca2b3b6f45008bf5ec89a0a0517c5b5e313b709def209"} Oct 08 20:54:18 crc kubenswrapper[4669]: I1008 20:54:18.030088 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-btc2c" event={"ID":"7a8b2c5d-38a2-4866-a1e0-8df9b4659c15","Type":"ContainerStarted","Data":"73d3b98ccc6f98fba74d4541ab5a5ba9f55dfc30fb18c0a6306b22e6d04b4948"} Oct 08 20:54:18 crc kubenswrapper[4669]: I1008 20:54:18.031111 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-7g9zf" event={"ID":"c726284c-5ca2-4d63-b96e-56c1aa537986","Type":"ContainerStarted","Data":"f5eb14bc3ee574f0e77c21b4bfd67193dff0dda13159111d80c630d83c0b5174"} Oct 08 20:54:18 crc kubenswrapper[4669]: I1008 20:54:18.032260 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-9flwc" event={"ID":"314fd865-0086-4bd8-8a90-0c992557a6af","Type":"ContainerStarted","Data":"ec4979dc50ee4b453de078558426b53a3e9cac97a427addb8a094502e67eae41"} Oct 08 20:54:18 crc kubenswrapper[4669]: I1008 20:54:18.032433 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="cert-manager/cert-manager-webhook-5655c58dd6-9flwc" Oct 08 20:54:18 crc kubenswrapper[4669]: I1008 20:54:18.046182 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-btc2c" podStartSLOduration=1.805034467 podStartE2EDuration="5.046162219s" podCreationTimestamp="2025-10-08 20:54:13 +0000 UTC" firstStartedPulling="2025-10-08 20:54:14.275300612 +0000 UTC m=+573.968111285" lastFinishedPulling="2025-10-08 20:54:17.516428364 +0000 UTC m=+577.209239037" observedRunningTime="2025-10-08 20:54:18.042387705 +0000 UTC m=+577.735198378" watchObservedRunningTime="2025-10-08 20:54:18.046162219 +0000 UTC m=+577.738972892" Oct 08 20:54:18 crc kubenswrapper[4669]: I1008 20:54:18.078389 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-7g9zf" podStartSLOduration=1.948887595 podStartE2EDuration="5.078375398s" podCreationTimestamp="2025-10-08 20:54:13 +0000 UTC" firstStartedPulling="2025-10-08 20:54:14.41879725 +0000 UTC m=+574.111607923" lastFinishedPulling="2025-10-08 20:54:17.548285053 +0000 UTC m=+577.241095726" observedRunningTime="2025-10-08 20:54:18.077654987 +0000 UTC m=+577.770465680" watchObservedRunningTime="2025-10-08 20:54:18.078375398 +0000 UTC m=+577.771186071" Oct 08 20:54:18 crc kubenswrapper[4669]: I1008 20:54:18.080007 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-9flwc" podStartSLOduration=1.853677046 podStartE2EDuration="5.079998822s" podCreationTimestamp="2025-10-08 20:54:13 +0000 UTC" firstStartedPulling="2025-10-08 20:54:14.302786068 +0000 UTC m=+573.995596741" lastFinishedPulling="2025-10-08 20:54:17.529107844 +0000 UTC m=+577.221918517" observedRunningTime="2025-10-08 20:54:18.061801841 +0000 UTC m=+577.754612514" watchObservedRunningTime="2025-10-08 20:54:18.079998822 +0000 UTC m=+577.772809495" Oct 08 20:54:23 crc kubenswrapper[4669]: I1008 
20:54:23.846962 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-gpzdw"] Oct 08 20:54:23 crc kubenswrapper[4669]: I1008 20:54:23.848072 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerName="ovn-controller" containerID="cri-o://ed9a574189bcc7f84b93c5e821e944b0f94679084a30270d6634c7d19e67c470" gracePeriod=30 Oct 08 20:54:23 crc kubenswrapper[4669]: I1008 20:54:23.848743 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerName="sbdb" containerID="cri-o://f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111" gracePeriod=30 Oct 08 20:54:23 crc kubenswrapper[4669]: I1008 20:54:23.848822 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerName="nbdb" containerID="cri-o://b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9" gracePeriod=30 Oct 08 20:54:23 crc kubenswrapper[4669]: I1008 20:54:23.848886 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerName="northd" containerID="cri-o://c03e0c827468d80fa326ee46ee88ad6adfe4236f4df9843324d2b247d0716087" gracePeriod=30 Oct 08 20:54:23 crc kubenswrapper[4669]: I1008 20:54:23.849017 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://334a09deac921308c4d6053bdcc2bbc096acc8ec24875190efb1c07b22d01c69" gracePeriod=30 Oct 08 20:54:23 crc kubenswrapper[4669]: I1008 
20:54:23.849283 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerName="kube-rbac-proxy-node" containerID="cri-o://408dd840918000b1689c3d828a51173deebf8d00fc97450975b35e5149d3cfc7" gracePeriod=30 Oct 08 20:54:23 crc kubenswrapper[4669]: I1008 20:54:23.849341 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerName="ovn-acl-logging" containerID="cri-o://92bc23ad705dcc8b8524159bc37254ce2306e7b502b914eaac7a6525fdd44f52" gracePeriod=30 Oct 08 20:54:23 crc kubenswrapper[4669]: I1008 20:54:23.895588 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerName="ovnkube-controller" containerID="cri-o://698cf19fa72b0c3d6db6643a79a42a1529cff7d8d9e6d25b29feff483425122f" gracePeriod=30 Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.032516 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-9flwc" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.069631 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-klx9r_2433400c-98f8-490f-a566-00a330a738fe/kube-multus/2.log" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.070048 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-klx9r_2433400c-98f8-490f-a566-00a330a738fe/kube-multus/1.log" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.070088 4669 generic.go:334] "Generic (PLEG): container finished" podID="2433400c-98f8-490f-a566-00a330a738fe" containerID="75f5b6d8d782c36aa2c69c94e49c4f5f2bcd8290971bfccd34c4de96d2fa34a3" exitCode=2 Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 
20:54:24.070144 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-klx9r" event={"ID":"2433400c-98f8-490f-a566-00a330a738fe","Type":"ContainerDied","Data":"75f5b6d8d782c36aa2c69c94e49c4f5f2bcd8290971bfccd34c4de96d2fa34a3"} Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.070181 4669 scope.go:117] "RemoveContainer" containerID="2b65ccfb3651377dd7136e083c72c94dbdef0e945e796bf851e7ba8e53aafd12" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.070725 4669 scope.go:117] "RemoveContainer" containerID="75f5b6d8d782c36aa2c69c94e49c4f5f2bcd8290971bfccd34c4de96d2fa34a3" Oct 08 20:54:24 crc kubenswrapper[4669]: E1008 20:54:24.070939 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-klx9r_openshift-multus(2433400c-98f8-490f-a566-00a330a738fe)\"" pod="openshift-multus/multus-klx9r" podUID="2433400c-98f8-490f-a566-00a330a738fe" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.073783 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gpzdw_cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7/ovnkube-controller/3.log" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.082625 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gpzdw_cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7/ovn-acl-logging/0.log" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.083115 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gpzdw_cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7/ovn-controller/0.log" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.083442 4669 generic.go:334] "Generic (PLEG): container finished" podID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerID="334a09deac921308c4d6053bdcc2bbc096acc8ec24875190efb1c07b22d01c69" exitCode=0 Oct 08 20:54:24 crc 
kubenswrapper[4669]: I1008 20:54:24.083465 4669 generic.go:334] "Generic (PLEG): container finished" podID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerID="408dd840918000b1689c3d828a51173deebf8d00fc97450975b35e5149d3cfc7" exitCode=0 Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.083472 4669 generic.go:334] "Generic (PLEG): container finished" podID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerID="92bc23ad705dcc8b8524159bc37254ce2306e7b502b914eaac7a6525fdd44f52" exitCode=143 Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.083479 4669 generic.go:334] "Generic (PLEG): container finished" podID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerID="ed9a574189bcc7f84b93c5e821e944b0f94679084a30270d6634c7d19e67c470" exitCode=143 Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.083499 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" event={"ID":"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7","Type":"ContainerDied","Data":"334a09deac921308c4d6053bdcc2bbc096acc8ec24875190efb1c07b22d01c69"} Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.083522 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" event={"ID":"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7","Type":"ContainerDied","Data":"408dd840918000b1689c3d828a51173deebf8d00fc97450975b35e5149d3cfc7"} Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.083545 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" event={"ID":"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7","Type":"ContainerDied","Data":"92bc23ad705dcc8b8524159bc37254ce2306e7b502b914eaac7a6525fdd44f52"} Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.083554 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" 
event={"ID":"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7","Type":"ContainerDied","Data":"ed9a574189bcc7f84b93c5e821e944b0f94679084a30270d6634c7d19e67c470"} Oct 08 20:54:24 crc kubenswrapper[4669]: E1008 20:54:24.362723 4669 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9 is running failed: container process not found" containerID="b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Oct 08 20:54:24 crc kubenswrapper[4669]: E1008 20:54:24.362743 4669 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111 is running failed: container process not found" containerID="f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Oct 08 20:54:24 crc kubenswrapper[4669]: E1008 20:54:24.363200 4669 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9 is running failed: container process not found" containerID="b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Oct 08 20:54:24 crc kubenswrapper[4669]: E1008 20:54:24.363325 4669 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111 is running failed: container process not found" containerID="f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Oct 08 20:54:24 crc kubenswrapper[4669]: E1008 20:54:24.363447 4669 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9 is running failed: container process not found" containerID="b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Oct 08 20:54:24 crc kubenswrapper[4669]: E1008 20:54:24.363499 4669 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9 is running failed: container process not found" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerName="nbdb" Oct 08 20:54:24 crc kubenswrapper[4669]: E1008 20:54:24.363719 4669 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111 is running failed: container process not found" containerID="f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Oct 08 20:54:24 crc kubenswrapper[4669]: E1008 20:54:24.363758 4669 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111 is running failed: container process not found" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerName="sbdb" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.653814 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gpzdw_cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7/ovnkube-controller/3.log" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.657055 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gpzdw_cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7/ovn-acl-logging/0.log" Oct 08 20:54:24 crc kubenswrapper[4669]: 
I1008 20:54:24.657798 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gpzdw_cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7/ovn-controller/0.log" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.658447 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.711727 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-9w899"] Oct 08 20:54:24 crc kubenswrapper[4669]: E1008 20:54:24.711920 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerName="ovn-controller" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.711935 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerName="ovn-controller" Oct 08 20:54:24 crc kubenswrapper[4669]: E1008 20:54:24.711944 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerName="ovnkube-controller" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.711951 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerName="ovnkube-controller" Oct 08 20:54:24 crc kubenswrapper[4669]: E1008 20:54:24.711961 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerName="nbdb" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.711967 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerName="nbdb" Oct 08 20:54:24 crc kubenswrapper[4669]: E1008 20:54:24.711974 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerName="kube-rbac-proxy-node" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.711979 
4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerName="kube-rbac-proxy-node" Oct 08 20:54:24 crc kubenswrapper[4669]: E1008 20:54:24.711986 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerName="ovnkube-controller" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.711992 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerName="ovnkube-controller" Oct 08 20:54:24 crc kubenswrapper[4669]: E1008 20:54:24.712001 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerName="kubecfg-setup" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.712006 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerName="kubecfg-setup" Oct 08 20:54:24 crc kubenswrapper[4669]: E1008 20:54:24.712013 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerName="kube-rbac-proxy-ovn-metrics" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.712019 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerName="kube-rbac-proxy-ovn-metrics" Oct 08 20:54:24 crc kubenswrapper[4669]: E1008 20:54:24.712028 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerName="sbdb" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.712034 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerName="sbdb" Oct 08 20:54:24 crc kubenswrapper[4669]: E1008 20:54:24.712051 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerName="northd" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.712063 4669 
state_mem.go:107] "Deleted CPUSet assignment" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerName="northd" Oct 08 20:54:24 crc kubenswrapper[4669]: E1008 20:54:24.712074 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerName="ovnkube-controller" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.712083 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerName="ovnkube-controller" Oct 08 20:54:24 crc kubenswrapper[4669]: E1008 20:54:24.712091 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerName="ovn-acl-logging" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.712096 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerName="ovn-acl-logging" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.712180 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerName="ovn-controller" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.712190 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerName="kube-rbac-proxy-node" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.712197 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerName="ovnkube-controller" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.712203 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerName="ovnkube-controller" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.712210 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerName="northd" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.712219 4669 
memory_manager.go:354] "RemoveStaleState removing state" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerName="ovnkube-controller" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.712227 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerName="nbdb" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.712236 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerName="kube-rbac-proxy-ovn-metrics" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.712242 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerName="sbdb" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.712251 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerName="ovn-acl-logging" Oct 08 20:54:24 crc kubenswrapper[4669]: E1008 20:54:24.712349 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerName="ovnkube-controller" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.712356 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerName="ovnkube-controller" Oct 08 20:54:24 crc kubenswrapper[4669]: E1008 20:54:24.712366 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerName="ovnkube-controller" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.712372 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerName="ovnkube-controller" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.712453 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerName="ovnkube-controller" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 
20:54:24.712462 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerName="ovnkube-controller" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.713903 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.858154 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-run-ovn\") pod \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.858230 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.858303 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zqxk\" (UniqueName: \"kubernetes.io/projected/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-kube-api-access-4zqxk\") pod \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.858333 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-host-kubelet\") pod \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.858362 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-run-systemd\") pod \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.858388 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-var-lib-openvswitch\") pod \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.858413 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-host-run-ovn-kubernetes\") pod \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.858446 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-host-slash\") pod \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.858473 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-host-run-netns\") pod \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.858502 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-host-cni-bin\") pod \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " Oct 08 20:54:24 crc kubenswrapper[4669]: 
I1008 20:54:24.858553 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-etc-openvswitch\") pod \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.858596 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-host-cni-netd\") pod \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.858637 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-run-openvswitch\") pod \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.858678 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-env-overrides\") pod \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.858719 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-ovn-node-metrics-cert\") pod \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.858743 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-systemd-units\") 
pod \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.858769 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-ovnkube-config\") pod \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.858761 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" (UID: "cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.858792 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-ovnkube-script-lib\") pod \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.858814 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" (UID: "cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.858889 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" (UID: "cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.858889 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-log-socket\") pod \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.858926 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-log-socket" (OuterVolumeSpecName: "log-socket") pod "cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" (UID: "cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.858954 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-node-log\") pod \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\" (UID: \"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7\") " Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.859135 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/851fe964-4456-42d7-8b3b-edc05a2e1a04-var-lib-openvswitch\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.859182 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66f6g\" (UniqueName: \"kubernetes.io/projected/851fe964-4456-42d7-8b3b-edc05a2e1a04-kube-api-access-66f6g\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.859225 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/851fe964-4456-42d7-8b3b-edc05a2e1a04-node-log\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.859273 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/851fe964-4456-42d7-8b3b-edc05a2e1a04-run-systemd\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.859296 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/851fe964-4456-42d7-8b3b-edc05a2e1a04-etc-openvswitch\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.859308 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" (UID: "cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.859330 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/851fe964-4456-42d7-8b3b-edc05a2e1a04-host-cni-netd\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.859353 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/851fe964-4456-42d7-8b3b-edc05a2e1a04-host-run-ovn-kubernetes\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.859366 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-systemd-units" (OuterVolumeSpecName: "systemd-units") pod 
"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" (UID: "cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.859377 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/851fe964-4456-42d7-8b3b-edc05a2e1a04-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.859414 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/851fe964-4456-42d7-8b3b-edc05a2e1a04-run-ovn\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.859445 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/851fe964-4456-42d7-8b3b-edc05a2e1a04-run-openvswitch\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.859473 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/851fe964-4456-42d7-8b3b-edc05a2e1a04-host-slash\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.859505 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/851fe964-4456-42d7-8b3b-edc05a2e1a04-ovnkube-script-lib\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.859565 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/851fe964-4456-42d7-8b3b-edc05a2e1a04-systemd-units\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.859595 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/851fe964-4456-42d7-8b3b-edc05a2e1a04-ovnkube-config\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.859609 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" (UID: "cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.859622 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/851fe964-4456-42d7-8b3b-edc05a2e1a04-ovn-node-metrics-cert\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.859661 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-node-log" (OuterVolumeSpecName: "node-log") pod "cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" (UID: "cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.859727 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/851fe964-4456-42d7-8b3b-edc05a2e1a04-host-kubelet\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.859741 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" (UID: "cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.859767 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/851fe964-4456-42d7-8b3b-edc05a2e1a04-host-run-netns\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.859783 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" (UID: "cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.859824 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" (UID: "cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.859842 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" (UID: "cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.859856 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/851fe964-4456-42d7-8b3b-edc05a2e1a04-env-overrides\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.859865 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-host-slash" (OuterVolumeSpecName: "host-slash") pod "cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" (UID: "cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.859899 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" (UID: "cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.859936 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" (UID: "cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.859935 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" (UID: "cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.859946 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/851fe964-4456-42d7-8b3b-edc05a2e1a04-host-cni-bin\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.859785 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" (UID: "cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.860085 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/851fe964-4456-42d7-8b3b-edc05a2e1a04-log-socket\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.860146 4669 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.860161 4669 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.860178 4669 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.860229 4669 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.860268 4669 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.860298 4669 reconciler_common.go:293] "Volume detached for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-host-slash\") on node \"crc\" DevicePath \"\"" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.860329 4669 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.860356 4669 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.860383 4669 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.860410 4669 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.860436 4669 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.860460 4669 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.860486 4669 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-ovnkube-config\") on node \"crc\" 
DevicePath \"\"" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.860505 4669 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.860523 4669 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-log-socket\") on node \"crc\" DevicePath \"\"" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.860576 4669 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-node-log\") on node \"crc\" DevicePath \"\"" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.860600 4669 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.864962 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" (UID: "cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.867415 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-kube-api-access-4zqxk" (OuterVolumeSpecName: "kube-api-access-4zqxk") pod "cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" (UID: "cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7"). InnerVolumeSpecName "kube-api-access-4zqxk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.876249 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" (UID: "cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.960958 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/851fe964-4456-42d7-8b3b-edc05a2e1a04-env-overrides\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.960991 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/851fe964-4456-42d7-8b3b-edc05a2e1a04-host-cni-bin\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.961023 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/851fe964-4456-42d7-8b3b-edc05a2e1a04-log-socket\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.961048 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/851fe964-4456-42d7-8b3b-edc05a2e1a04-var-lib-openvswitch\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.961066 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66f6g\" (UniqueName: \"kubernetes.io/projected/851fe964-4456-42d7-8b3b-edc05a2e1a04-kube-api-access-66f6g\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.961088 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/851fe964-4456-42d7-8b3b-edc05a2e1a04-node-log\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.961124 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/851fe964-4456-42d7-8b3b-edc05a2e1a04-run-systemd\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.961142 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/851fe964-4456-42d7-8b3b-edc05a2e1a04-etc-openvswitch\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.961157 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/851fe964-4456-42d7-8b3b-edc05a2e1a04-host-cni-netd\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc 
kubenswrapper[4669]: I1008 20:54:24.961171 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/851fe964-4456-42d7-8b3b-edc05a2e1a04-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.961188 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/851fe964-4456-42d7-8b3b-edc05a2e1a04-host-run-ovn-kubernetes\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.961208 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/851fe964-4456-42d7-8b3b-edc05a2e1a04-run-ovn\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.961228 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/851fe964-4456-42d7-8b3b-edc05a2e1a04-run-openvswitch\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.961244 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/851fe964-4456-42d7-8b3b-edc05a2e1a04-host-slash\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 
20:54:24.961267 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/851fe964-4456-42d7-8b3b-edc05a2e1a04-ovnkube-script-lib\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.961289 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/851fe964-4456-42d7-8b3b-edc05a2e1a04-systemd-units\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.961306 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/851fe964-4456-42d7-8b3b-edc05a2e1a04-ovnkube-config\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.961325 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/851fe964-4456-42d7-8b3b-edc05a2e1a04-ovn-node-metrics-cert\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.961342 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/851fe964-4456-42d7-8b3b-edc05a2e1a04-host-kubelet\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.961378 4669 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/851fe964-4456-42d7-8b3b-edc05a2e1a04-host-run-netns\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.961413 4669 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.961424 4669 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.961434 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zqxk\" (UniqueName: \"kubernetes.io/projected/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7-kube-api-access-4zqxk\") on node \"crc\" DevicePath \"\"" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.961469 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/851fe964-4456-42d7-8b3b-edc05a2e1a04-host-run-netns\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.961505 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/851fe964-4456-42d7-8b3b-edc05a2e1a04-host-cni-bin\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.961552 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"log-socket\" (UniqueName: \"kubernetes.io/host-path/851fe964-4456-42d7-8b3b-edc05a2e1a04-log-socket\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.961575 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/851fe964-4456-42d7-8b3b-edc05a2e1a04-var-lib-openvswitch\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.961608 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/851fe964-4456-42d7-8b3b-edc05a2e1a04-env-overrides\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.961673 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/851fe964-4456-42d7-8b3b-edc05a2e1a04-run-ovn\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.961700 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/851fe964-4456-42d7-8b3b-edc05a2e1a04-node-log\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.961723 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/851fe964-4456-42d7-8b3b-edc05a2e1a04-run-systemd\") pod \"ovnkube-node-9w899\" (UID: 
\"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.961747 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/851fe964-4456-42d7-8b3b-edc05a2e1a04-etc-openvswitch\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.961780 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/851fe964-4456-42d7-8b3b-edc05a2e1a04-host-cni-netd\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.961807 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/851fe964-4456-42d7-8b3b-edc05a2e1a04-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.961832 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/851fe964-4456-42d7-8b3b-edc05a2e1a04-run-openvswitch\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.961858 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/851fe964-4456-42d7-8b3b-edc05a2e1a04-host-slash\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.961857 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/851fe964-4456-42d7-8b3b-edc05a2e1a04-systemd-units\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.962363 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/851fe964-4456-42d7-8b3b-edc05a2e1a04-ovnkube-script-lib\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.962405 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/851fe964-4456-42d7-8b3b-edc05a2e1a04-host-kubelet\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.962450 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/851fe964-4456-42d7-8b3b-edc05a2e1a04-ovnkube-config\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.961834 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/851fe964-4456-42d7-8b3b-edc05a2e1a04-host-run-ovn-kubernetes\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 
20:54:24.965855 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/851fe964-4456-42d7-8b3b-edc05a2e1a04-ovn-node-metrics-cert\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:24 crc kubenswrapper[4669]: I1008 20:54:24.976306 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66f6g\" (UniqueName: \"kubernetes.io/projected/851fe964-4456-42d7-8b3b-edc05a2e1a04-kube-api-access-66f6g\") pod \"ovnkube-node-9w899\" (UID: \"851fe964-4456-42d7-8b3b-edc05a2e1a04\") " pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.031797 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.089344 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9w899" event={"ID":"851fe964-4456-42d7-8b3b-edc05a2e1a04","Type":"ContainerStarted","Data":"ae8091344543f828558603a6328484b100d30ebb3cb3b5a3db437decca61017e"} Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.091080 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-klx9r_2433400c-98f8-490f-a566-00a330a738fe/kube-multus/2.log" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.093410 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gpzdw_cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7/ovnkube-controller/3.log" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.095689 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gpzdw_cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7/ovn-acl-logging/0.log" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.096179 4669 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gpzdw_cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7/ovn-controller/0.log" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.096663 4669 generic.go:334] "Generic (PLEG): container finished" podID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerID="698cf19fa72b0c3d6db6643a79a42a1529cff7d8d9e6d25b29feff483425122f" exitCode=0 Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.096750 4669 generic.go:334] "Generic (PLEG): container finished" podID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerID="f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111" exitCode=0 Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.096764 4669 generic.go:334] "Generic (PLEG): container finished" podID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerID="b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9" exitCode=0 Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.096775 4669 generic.go:334] "Generic (PLEG): container finished" podID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" containerID="c03e0c827468d80fa326ee46ee88ad6adfe4236f4df9843324d2b247d0716087" exitCode=0 Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.096716 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" event={"ID":"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7","Type":"ContainerDied","Data":"698cf19fa72b0c3d6db6643a79a42a1529cff7d8d9e6d25b29feff483425122f"} Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.096809 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" event={"ID":"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7","Type":"ContainerDied","Data":"f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111"} Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.096827 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" 
event={"ID":"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7","Type":"ContainerDied","Data":"b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9"} Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.096839 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" event={"ID":"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7","Type":"ContainerDied","Data":"c03e0c827468d80fa326ee46ee88ad6adfe4236f4df9843324d2b247d0716087"} Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.096848 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" event={"ID":"cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7","Type":"ContainerDied","Data":"e9250f8f75d073de984775a81aaafd543292b62ba50216285cd6f0ae77ca9b8b"} Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.096859 4669 scope.go:117] "RemoveContainer" containerID="698cf19fa72b0c3d6db6643a79a42a1529cff7d8d9e6d25b29feff483425122f" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.097010 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gpzdw" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.125379 4669 scope.go:117] "RemoveContainer" containerID="a36045ff8e9ec3ba7cb5e5d4cc2c7ac3e970c45098c3bc2fd385134fecaa8db4" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.143596 4669 scope.go:117] "RemoveContainer" containerID="f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.148363 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-gpzdw"] Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.151776 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-gpzdw"] Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.179000 4669 scope.go:117] "RemoveContainer" containerID="b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.194695 4669 scope.go:117] "RemoveContainer" containerID="c03e0c827468d80fa326ee46ee88ad6adfe4236f4df9843324d2b247d0716087" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.212744 4669 scope.go:117] "RemoveContainer" containerID="334a09deac921308c4d6053bdcc2bbc096acc8ec24875190efb1c07b22d01c69" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.233878 4669 scope.go:117] "RemoveContainer" containerID="408dd840918000b1689c3d828a51173deebf8d00fc97450975b35e5149d3cfc7" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.255396 4669 scope.go:117] "RemoveContainer" containerID="92bc23ad705dcc8b8524159bc37254ce2306e7b502b914eaac7a6525fdd44f52" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.281896 4669 scope.go:117] "RemoveContainer" containerID="ed9a574189bcc7f84b93c5e821e944b0f94679084a30270d6634c7d19e67c470" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.344159 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7" path="/var/lib/kubelet/pods/cddcfdc4-9f9b-4e6a-a9e2-87dbba119da7/volumes" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.363400 4669 scope.go:117] "RemoveContainer" containerID="714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.381282 4669 scope.go:117] "RemoveContainer" containerID="698cf19fa72b0c3d6db6643a79a42a1529cff7d8d9e6d25b29feff483425122f" Oct 08 20:54:25 crc kubenswrapper[4669]: E1008 20:54:25.381798 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"698cf19fa72b0c3d6db6643a79a42a1529cff7d8d9e6d25b29feff483425122f\": container with ID starting with 698cf19fa72b0c3d6db6643a79a42a1529cff7d8d9e6d25b29feff483425122f not found: ID does not exist" containerID="698cf19fa72b0c3d6db6643a79a42a1529cff7d8d9e6d25b29feff483425122f" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.381848 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"698cf19fa72b0c3d6db6643a79a42a1529cff7d8d9e6d25b29feff483425122f"} err="failed to get container status \"698cf19fa72b0c3d6db6643a79a42a1529cff7d8d9e6d25b29feff483425122f\": rpc error: code = NotFound desc = could not find container \"698cf19fa72b0c3d6db6643a79a42a1529cff7d8d9e6d25b29feff483425122f\": container with ID starting with 698cf19fa72b0c3d6db6643a79a42a1529cff7d8d9e6d25b29feff483425122f not found: ID does not exist" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.381878 4669 scope.go:117] "RemoveContainer" containerID="a36045ff8e9ec3ba7cb5e5d4cc2c7ac3e970c45098c3bc2fd385134fecaa8db4" Oct 08 20:54:25 crc kubenswrapper[4669]: E1008 20:54:25.382359 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a36045ff8e9ec3ba7cb5e5d4cc2c7ac3e970c45098c3bc2fd385134fecaa8db4\": container with ID 
starting with a36045ff8e9ec3ba7cb5e5d4cc2c7ac3e970c45098c3bc2fd385134fecaa8db4 not found: ID does not exist" containerID="a36045ff8e9ec3ba7cb5e5d4cc2c7ac3e970c45098c3bc2fd385134fecaa8db4" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.382409 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a36045ff8e9ec3ba7cb5e5d4cc2c7ac3e970c45098c3bc2fd385134fecaa8db4"} err="failed to get container status \"a36045ff8e9ec3ba7cb5e5d4cc2c7ac3e970c45098c3bc2fd385134fecaa8db4\": rpc error: code = NotFound desc = could not find container \"a36045ff8e9ec3ba7cb5e5d4cc2c7ac3e970c45098c3bc2fd385134fecaa8db4\": container with ID starting with a36045ff8e9ec3ba7cb5e5d4cc2c7ac3e970c45098c3bc2fd385134fecaa8db4 not found: ID does not exist" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.382442 4669 scope.go:117] "RemoveContainer" containerID="f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111" Oct 08 20:54:25 crc kubenswrapper[4669]: E1008 20:54:25.383187 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111\": container with ID starting with f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111 not found: ID does not exist" containerID="f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.383217 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111"} err="failed to get container status \"f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111\": rpc error: code = NotFound desc = could not find container \"f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111\": container with ID starting with f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111 not found: 
ID does not exist" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.383235 4669 scope.go:117] "RemoveContainer" containerID="b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9" Oct 08 20:54:25 crc kubenswrapper[4669]: E1008 20:54:25.383518 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9\": container with ID starting with b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9 not found: ID does not exist" containerID="b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.383573 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9"} err="failed to get container status \"b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9\": rpc error: code = NotFound desc = could not find container \"b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9\": container with ID starting with b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9 not found: ID does not exist" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.383608 4669 scope.go:117] "RemoveContainer" containerID="c03e0c827468d80fa326ee46ee88ad6adfe4236f4df9843324d2b247d0716087" Oct 08 20:54:25 crc kubenswrapper[4669]: E1008 20:54:25.384117 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c03e0c827468d80fa326ee46ee88ad6adfe4236f4df9843324d2b247d0716087\": container with ID starting with c03e0c827468d80fa326ee46ee88ad6adfe4236f4df9843324d2b247d0716087 not found: ID does not exist" containerID="c03e0c827468d80fa326ee46ee88ad6adfe4236f4df9843324d2b247d0716087" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.384138 4669 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c03e0c827468d80fa326ee46ee88ad6adfe4236f4df9843324d2b247d0716087"} err="failed to get container status \"c03e0c827468d80fa326ee46ee88ad6adfe4236f4df9843324d2b247d0716087\": rpc error: code = NotFound desc = could not find container \"c03e0c827468d80fa326ee46ee88ad6adfe4236f4df9843324d2b247d0716087\": container with ID starting with c03e0c827468d80fa326ee46ee88ad6adfe4236f4df9843324d2b247d0716087 not found: ID does not exist" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.384154 4669 scope.go:117] "RemoveContainer" containerID="334a09deac921308c4d6053bdcc2bbc096acc8ec24875190efb1c07b22d01c69" Oct 08 20:54:25 crc kubenswrapper[4669]: E1008 20:54:25.384680 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"334a09deac921308c4d6053bdcc2bbc096acc8ec24875190efb1c07b22d01c69\": container with ID starting with 334a09deac921308c4d6053bdcc2bbc096acc8ec24875190efb1c07b22d01c69 not found: ID does not exist" containerID="334a09deac921308c4d6053bdcc2bbc096acc8ec24875190efb1c07b22d01c69" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.384711 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"334a09deac921308c4d6053bdcc2bbc096acc8ec24875190efb1c07b22d01c69"} err="failed to get container status \"334a09deac921308c4d6053bdcc2bbc096acc8ec24875190efb1c07b22d01c69\": rpc error: code = NotFound desc = could not find container \"334a09deac921308c4d6053bdcc2bbc096acc8ec24875190efb1c07b22d01c69\": container with ID starting with 334a09deac921308c4d6053bdcc2bbc096acc8ec24875190efb1c07b22d01c69 not found: ID does not exist" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.384730 4669 scope.go:117] "RemoveContainer" containerID="408dd840918000b1689c3d828a51173deebf8d00fc97450975b35e5149d3cfc7" Oct 08 20:54:25 crc kubenswrapper[4669]: E1008 20:54:25.385071 4669 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"408dd840918000b1689c3d828a51173deebf8d00fc97450975b35e5149d3cfc7\": container with ID starting with 408dd840918000b1689c3d828a51173deebf8d00fc97450975b35e5149d3cfc7 not found: ID does not exist" containerID="408dd840918000b1689c3d828a51173deebf8d00fc97450975b35e5149d3cfc7" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.385103 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"408dd840918000b1689c3d828a51173deebf8d00fc97450975b35e5149d3cfc7"} err="failed to get container status \"408dd840918000b1689c3d828a51173deebf8d00fc97450975b35e5149d3cfc7\": rpc error: code = NotFound desc = could not find container \"408dd840918000b1689c3d828a51173deebf8d00fc97450975b35e5149d3cfc7\": container with ID starting with 408dd840918000b1689c3d828a51173deebf8d00fc97450975b35e5149d3cfc7 not found: ID does not exist" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.385122 4669 scope.go:117] "RemoveContainer" containerID="92bc23ad705dcc8b8524159bc37254ce2306e7b502b914eaac7a6525fdd44f52" Oct 08 20:54:25 crc kubenswrapper[4669]: E1008 20:54:25.385594 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92bc23ad705dcc8b8524159bc37254ce2306e7b502b914eaac7a6525fdd44f52\": container with ID starting with 92bc23ad705dcc8b8524159bc37254ce2306e7b502b914eaac7a6525fdd44f52 not found: ID does not exist" containerID="92bc23ad705dcc8b8524159bc37254ce2306e7b502b914eaac7a6525fdd44f52" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.385625 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92bc23ad705dcc8b8524159bc37254ce2306e7b502b914eaac7a6525fdd44f52"} err="failed to get container status \"92bc23ad705dcc8b8524159bc37254ce2306e7b502b914eaac7a6525fdd44f52\": rpc error: code = NotFound desc = could 
not find container \"92bc23ad705dcc8b8524159bc37254ce2306e7b502b914eaac7a6525fdd44f52\": container with ID starting with 92bc23ad705dcc8b8524159bc37254ce2306e7b502b914eaac7a6525fdd44f52 not found: ID does not exist" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.385644 4669 scope.go:117] "RemoveContainer" containerID="ed9a574189bcc7f84b93c5e821e944b0f94679084a30270d6634c7d19e67c470" Oct 08 20:54:25 crc kubenswrapper[4669]: E1008 20:54:25.386040 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed9a574189bcc7f84b93c5e821e944b0f94679084a30270d6634c7d19e67c470\": container with ID starting with ed9a574189bcc7f84b93c5e821e944b0f94679084a30270d6634c7d19e67c470 not found: ID does not exist" containerID="ed9a574189bcc7f84b93c5e821e944b0f94679084a30270d6634c7d19e67c470" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.386077 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed9a574189bcc7f84b93c5e821e944b0f94679084a30270d6634c7d19e67c470"} err="failed to get container status \"ed9a574189bcc7f84b93c5e821e944b0f94679084a30270d6634c7d19e67c470\": rpc error: code = NotFound desc = could not find container \"ed9a574189bcc7f84b93c5e821e944b0f94679084a30270d6634c7d19e67c470\": container with ID starting with ed9a574189bcc7f84b93c5e821e944b0f94679084a30270d6634c7d19e67c470 not found: ID does not exist" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.386098 4669 scope.go:117] "RemoveContainer" containerID="714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793" Oct 08 20:54:25 crc kubenswrapper[4669]: E1008 20:54:25.386334 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\": container with ID starting with 714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793 not found: 
ID does not exist" containerID="714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.386360 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793"} err="failed to get container status \"714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\": rpc error: code = NotFound desc = could not find container \"714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\": container with ID starting with 714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793 not found: ID does not exist" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.386377 4669 scope.go:117] "RemoveContainer" containerID="698cf19fa72b0c3d6db6643a79a42a1529cff7d8d9e6d25b29feff483425122f" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.386875 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"698cf19fa72b0c3d6db6643a79a42a1529cff7d8d9e6d25b29feff483425122f"} err="failed to get container status \"698cf19fa72b0c3d6db6643a79a42a1529cff7d8d9e6d25b29feff483425122f\": rpc error: code = NotFound desc = could not find container \"698cf19fa72b0c3d6db6643a79a42a1529cff7d8d9e6d25b29feff483425122f\": container with ID starting with 698cf19fa72b0c3d6db6643a79a42a1529cff7d8d9e6d25b29feff483425122f not found: ID does not exist" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.386904 4669 scope.go:117] "RemoveContainer" containerID="a36045ff8e9ec3ba7cb5e5d4cc2c7ac3e970c45098c3bc2fd385134fecaa8db4" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.387219 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a36045ff8e9ec3ba7cb5e5d4cc2c7ac3e970c45098c3bc2fd385134fecaa8db4"} err="failed to get container status \"a36045ff8e9ec3ba7cb5e5d4cc2c7ac3e970c45098c3bc2fd385134fecaa8db4\": rpc error: code = 
NotFound desc = could not find container \"a36045ff8e9ec3ba7cb5e5d4cc2c7ac3e970c45098c3bc2fd385134fecaa8db4\": container with ID starting with a36045ff8e9ec3ba7cb5e5d4cc2c7ac3e970c45098c3bc2fd385134fecaa8db4 not found: ID does not exist" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.387249 4669 scope.go:117] "RemoveContainer" containerID="f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.387518 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111"} err="failed to get container status \"f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111\": rpc error: code = NotFound desc = could not find container \"f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111\": container with ID starting with f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111 not found: ID does not exist" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.387558 4669 scope.go:117] "RemoveContainer" containerID="b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.387892 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9"} err="failed to get container status \"b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9\": rpc error: code = NotFound desc = could not find container \"b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9\": container with ID starting with b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9 not found: ID does not exist" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.387915 4669 scope.go:117] "RemoveContainer" containerID="c03e0c827468d80fa326ee46ee88ad6adfe4236f4df9843324d2b247d0716087" Oct 08 20:54:25 crc 
kubenswrapper[4669]: I1008 20:54:25.388182 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c03e0c827468d80fa326ee46ee88ad6adfe4236f4df9843324d2b247d0716087"} err="failed to get container status \"c03e0c827468d80fa326ee46ee88ad6adfe4236f4df9843324d2b247d0716087\": rpc error: code = NotFound desc = could not find container \"c03e0c827468d80fa326ee46ee88ad6adfe4236f4df9843324d2b247d0716087\": container with ID starting with c03e0c827468d80fa326ee46ee88ad6adfe4236f4df9843324d2b247d0716087 not found: ID does not exist" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.388206 4669 scope.go:117] "RemoveContainer" containerID="334a09deac921308c4d6053bdcc2bbc096acc8ec24875190efb1c07b22d01c69" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.388416 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"334a09deac921308c4d6053bdcc2bbc096acc8ec24875190efb1c07b22d01c69"} err="failed to get container status \"334a09deac921308c4d6053bdcc2bbc096acc8ec24875190efb1c07b22d01c69\": rpc error: code = NotFound desc = could not find container \"334a09deac921308c4d6053bdcc2bbc096acc8ec24875190efb1c07b22d01c69\": container with ID starting with 334a09deac921308c4d6053bdcc2bbc096acc8ec24875190efb1c07b22d01c69 not found: ID does not exist" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.388443 4669 scope.go:117] "RemoveContainer" containerID="408dd840918000b1689c3d828a51173deebf8d00fc97450975b35e5149d3cfc7" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.388757 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"408dd840918000b1689c3d828a51173deebf8d00fc97450975b35e5149d3cfc7"} err="failed to get container status \"408dd840918000b1689c3d828a51173deebf8d00fc97450975b35e5149d3cfc7\": rpc error: code = NotFound desc = could not find container \"408dd840918000b1689c3d828a51173deebf8d00fc97450975b35e5149d3cfc7\": container 
with ID starting with 408dd840918000b1689c3d828a51173deebf8d00fc97450975b35e5149d3cfc7 not found: ID does not exist" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.388780 4669 scope.go:117] "RemoveContainer" containerID="92bc23ad705dcc8b8524159bc37254ce2306e7b502b914eaac7a6525fdd44f52" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.389079 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92bc23ad705dcc8b8524159bc37254ce2306e7b502b914eaac7a6525fdd44f52"} err="failed to get container status \"92bc23ad705dcc8b8524159bc37254ce2306e7b502b914eaac7a6525fdd44f52\": rpc error: code = NotFound desc = could not find container \"92bc23ad705dcc8b8524159bc37254ce2306e7b502b914eaac7a6525fdd44f52\": container with ID starting with 92bc23ad705dcc8b8524159bc37254ce2306e7b502b914eaac7a6525fdd44f52 not found: ID does not exist" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.389101 4669 scope.go:117] "RemoveContainer" containerID="ed9a574189bcc7f84b93c5e821e944b0f94679084a30270d6634c7d19e67c470" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.389340 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed9a574189bcc7f84b93c5e821e944b0f94679084a30270d6634c7d19e67c470"} err="failed to get container status \"ed9a574189bcc7f84b93c5e821e944b0f94679084a30270d6634c7d19e67c470\": rpc error: code = NotFound desc = could not find container \"ed9a574189bcc7f84b93c5e821e944b0f94679084a30270d6634c7d19e67c470\": container with ID starting with ed9a574189bcc7f84b93c5e821e944b0f94679084a30270d6634c7d19e67c470 not found: ID does not exist" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.389359 4669 scope.go:117] "RemoveContainer" containerID="714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.389572 4669 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793"} err="failed to get container status \"714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\": rpc error: code = NotFound desc = could not find container \"714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\": container with ID starting with 714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793 not found: ID does not exist" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.389588 4669 scope.go:117] "RemoveContainer" containerID="698cf19fa72b0c3d6db6643a79a42a1529cff7d8d9e6d25b29feff483425122f" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.389780 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"698cf19fa72b0c3d6db6643a79a42a1529cff7d8d9e6d25b29feff483425122f"} err="failed to get container status \"698cf19fa72b0c3d6db6643a79a42a1529cff7d8d9e6d25b29feff483425122f\": rpc error: code = NotFound desc = could not find container \"698cf19fa72b0c3d6db6643a79a42a1529cff7d8d9e6d25b29feff483425122f\": container with ID starting with 698cf19fa72b0c3d6db6643a79a42a1529cff7d8d9e6d25b29feff483425122f not found: ID does not exist" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.389807 4669 scope.go:117] "RemoveContainer" containerID="a36045ff8e9ec3ba7cb5e5d4cc2c7ac3e970c45098c3bc2fd385134fecaa8db4" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.390145 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a36045ff8e9ec3ba7cb5e5d4cc2c7ac3e970c45098c3bc2fd385134fecaa8db4"} err="failed to get container status \"a36045ff8e9ec3ba7cb5e5d4cc2c7ac3e970c45098c3bc2fd385134fecaa8db4\": rpc error: code = NotFound desc = could not find container \"a36045ff8e9ec3ba7cb5e5d4cc2c7ac3e970c45098c3bc2fd385134fecaa8db4\": container with ID starting with a36045ff8e9ec3ba7cb5e5d4cc2c7ac3e970c45098c3bc2fd385134fecaa8db4 not found: ID does not 
exist" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.390165 4669 scope.go:117] "RemoveContainer" containerID="f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.390374 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111"} err="failed to get container status \"f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111\": rpc error: code = NotFound desc = could not find container \"f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111\": container with ID starting with f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111 not found: ID does not exist" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.390403 4669 scope.go:117] "RemoveContainer" containerID="b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.390679 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9"} err="failed to get container status \"b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9\": rpc error: code = NotFound desc = could not find container \"b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9\": container with ID starting with b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9 not found: ID does not exist" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.390697 4669 scope.go:117] "RemoveContainer" containerID="c03e0c827468d80fa326ee46ee88ad6adfe4236f4df9843324d2b247d0716087" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.390958 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c03e0c827468d80fa326ee46ee88ad6adfe4236f4df9843324d2b247d0716087"} err="failed to get container status 
\"c03e0c827468d80fa326ee46ee88ad6adfe4236f4df9843324d2b247d0716087\": rpc error: code = NotFound desc = could not find container \"c03e0c827468d80fa326ee46ee88ad6adfe4236f4df9843324d2b247d0716087\": container with ID starting with c03e0c827468d80fa326ee46ee88ad6adfe4236f4df9843324d2b247d0716087 not found: ID does not exist" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.390974 4669 scope.go:117] "RemoveContainer" containerID="334a09deac921308c4d6053bdcc2bbc096acc8ec24875190efb1c07b22d01c69" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.391242 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"334a09deac921308c4d6053bdcc2bbc096acc8ec24875190efb1c07b22d01c69"} err="failed to get container status \"334a09deac921308c4d6053bdcc2bbc096acc8ec24875190efb1c07b22d01c69\": rpc error: code = NotFound desc = could not find container \"334a09deac921308c4d6053bdcc2bbc096acc8ec24875190efb1c07b22d01c69\": container with ID starting with 334a09deac921308c4d6053bdcc2bbc096acc8ec24875190efb1c07b22d01c69 not found: ID does not exist" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.391261 4669 scope.go:117] "RemoveContainer" containerID="408dd840918000b1689c3d828a51173deebf8d00fc97450975b35e5149d3cfc7" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.391678 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"408dd840918000b1689c3d828a51173deebf8d00fc97450975b35e5149d3cfc7"} err="failed to get container status \"408dd840918000b1689c3d828a51173deebf8d00fc97450975b35e5149d3cfc7\": rpc error: code = NotFound desc = could not find container \"408dd840918000b1689c3d828a51173deebf8d00fc97450975b35e5149d3cfc7\": container with ID starting with 408dd840918000b1689c3d828a51173deebf8d00fc97450975b35e5149d3cfc7 not found: ID does not exist" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.391698 4669 scope.go:117] "RemoveContainer" 
containerID="92bc23ad705dcc8b8524159bc37254ce2306e7b502b914eaac7a6525fdd44f52" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.392010 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92bc23ad705dcc8b8524159bc37254ce2306e7b502b914eaac7a6525fdd44f52"} err="failed to get container status \"92bc23ad705dcc8b8524159bc37254ce2306e7b502b914eaac7a6525fdd44f52\": rpc error: code = NotFound desc = could not find container \"92bc23ad705dcc8b8524159bc37254ce2306e7b502b914eaac7a6525fdd44f52\": container with ID starting with 92bc23ad705dcc8b8524159bc37254ce2306e7b502b914eaac7a6525fdd44f52 not found: ID does not exist" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.392027 4669 scope.go:117] "RemoveContainer" containerID="ed9a574189bcc7f84b93c5e821e944b0f94679084a30270d6634c7d19e67c470" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.392314 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed9a574189bcc7f84b93c5e821e944b0f94679084a30270d6634c7d19e67c470"} err="failed to get container status \"ed9a574189bcc7f84b93c5e821e944b0f94679084a30270d6634c7d19e67c470\": rpc error: code = NotFound desc = could not find container \"ed9a574189bcc7f84b93c5e821e944b0f94679084a30270d6634c7d19e67c470\": container with ID starting with ed9a574189bcc7f84b93c5e821e944b0f94679084a30270d6634c7d19e67c470 not found: ID does not exist" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.392344 4669 scope.go:117] "RemoveContainer" containerID="714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.392737 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793"} err="failed to get container status \"714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\": rpc error: code = NotFound desc = could 
not find container \"714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\": container with ID starting with 714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793 not found: ID does not exist" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.392757 4669 scope.go:117] "RemoveContainer" containerID="698cf19fa72b0c3d6db6643a79a42a1529cff7d8d9e6d25b29feff483425122f" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.393085 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"698cf19fa72b0c3d6db6643a79a42a1529cff7d8d9e6d25b29feff483425122f"} err="failed to get container status \"698cf19fa72b0c3d6db6643a79a42a1529cff7d8d9e6d25b29feff483425122f\": rpc error: code = NotFound desc = could not find container \"698cf19fa72b0c3d6db6643a79a42a1529cff7d8d9e6d25b29feff483425122f\": container with ID starting with 698cf19fa72b0c3d6db6643a79a42a1529cff7d8d9e6d25b29feff483425122f not found: ID does not exist" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.393139 4669 scope.go:117] "RemoveContainer" containerID="a36045ff8e9ec3ba7cb5e5d4cc2c7ac3e970c45098c3bc2fd385134fecaa8db4" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.393416 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a36045ff8e9ec3ba7cb5e5d4cc2c7ac3e970c45098c3bc2fd385134fecaa8db4"} err="failed to get container status \"a36045ff8e9ec3ba7cb5e5d4cc2c7ac3e970c45098c3bc2fd385134fecaa8db4\": rpc error: code = NotFound desc = could not find container \"a36045ff8e9ec3ba7cb5e5d4cc2c7ac3e970c45098c3bc2fd385134fecaa8db4\": container with ID starting with a36045ff8e9ec3ba7cb5e5d4cc2c7ac3e970c45098c3bc2fd385134fecaa8db4 not found: ID does not exist" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.393444 4669 scope.go:117] "RemoveContainer" containerID="f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 
20:54:25.393806 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111"} err="failed to get container status \"f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111\": rpc error: code = NotFound desc = could not find container \"f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111\": container with ID starting with f8390e480d9483a30cfda325bad8a46b00be0c159c5bcea12e3eb3294671b111 not found: ID does not exist" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.393828 4669 scope.go:117] "RemoveContainer" containerID="b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.394240 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9"} err="failed to get container status \"b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9\": rpc error: code = NotFound desc = could not find container \"b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9\": container with ID starting with b13639c06ad7dd7707accd87d2255ba508c3872e64f4e82c09222ffa35bd8be9 not found: ID does not exist" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.394264 4669 scope.go:117] "RemoveContainer" containerID="c03e0c827468d80fa326ee46ee88ad6adfe4236f4df9843324d2b247d0716087" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.394653 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c03e0c827468d80fa326ee46ee88ad6adfe4236f4df9843324d2b247d0716087"} err="failed to get container status \"c03e0c827468d80fa326ee46ee88ad6adfe4236f4df9843324d2b247d0716087\": rpc error: code = NotFound desc = could not find container \"c03e0c827468d80fa326ee46ee88ad6adfe4236f4df9843324d2b247d0716087\": container with ID starting with 
c03e0c827468d80fa326ee46ee88ad6adfe4236f4df9843324d2b247d0716087 not found: ID does not exist" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.394682 4669 scope.go:117] "RemoveContainer" containerID="334a09deac921308c4d6053bdcc2bbc096acc8ec24875190efb1c07b22d01c69" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.394988 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"334a09deac921308c4d6053bdcc2bbc096acc8ec24875190efb1c07b22d01c69"} err="failed to get container status \"334a09deac921308c4d6053bdcc2bbc096acc8ec24875190efb1c07b22d01c69\": rpc error: code = NotFound desc = could not find container \"334a09deac921308c4d6053bdcc2bbc096acc8ec24875190efb1c07b22d01c69\": container with ID starting with 334a09deac921308c4d6053bdcc2bbc096acc8ec24875190efb1c07b22d01c69 not found: ID does not exist" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.395039 4669 scope.go:117] "RemoveContainer" containerID="408dd840918000b1689c3d828a51173deebf8d00fc97450975b35e5149d3cfc7" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.395568 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"408dd840918000b1689c3d828a51173deebf8d00fc97450975b35e5149d3cfc7"} err="failed to get container status \"408dd840918000b1689c3d828a51173deebf8d00fc97450975b35e5149d3cfc7\": rpc error: code = NotFound desc = could not find container \"408dd840918000b1689c3d828a51173deebf8d00fc97450975b35e5149d3cfc7\": container with ID starting with 408dd840918000b1689c3d828a51173deebf8d00fc97450975b35e5149d3cfc7 not found: ID does not exist" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.395597 4669 scope.go:117] "RemoveContainer" containerID="92bc23ad705dcc8b8524159bc37254ce2306e7b502b914eaac7a6525fdd44f52" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.396050 4669 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"92bc23ad705dcc8b8524159bc37254ce2306e7b502b914eaac7a6525fdd44f52"} err="failed to get container status \"92bc23ad705dcc8b8524159bc37254ce2306e7b502b914eaac7a6525fdd44f52\": rpc error: code = NotFound desc = could not find container \"92bc23ad705dcc8b8524159bc37254ce2306e7b502b914eaac7a6525fdd44f52\": container with ID starting with 92bc23ad705dcc8b8524159bc37254ce2306e7b502b914eaac7a6525fdd44f52 not found: ID does not exist" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.396069 4669 scope.go:117] "RemoveContainer" containerID="ed9a574189bcc7f84b93c5e821e944b0f94679084a30270d6634c7d19e67c470" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.396599 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed9a574189bcc7f84b93c5e821e944b0f94679084a30270d6634c7d19e67c470"} err="failed to get container status \"ed9a574189bcc7f84b93c5e821e944b0f94679084a30270d6634c7d19e67c470\": rpc error: code = NotFound desc = could not find container \"ed9a574189bcc7f84b93c5e821e944b0f94679084a30270d6634c7d19e67c470\": container with ID starting with ed9a574189bcc7f84b93c5e821e944b0f94679084a30270d6634c7d19e67c470 not found: ID does not exist" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.396627 4669 scope.go:117] "RemoveContainer" containerID="714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793" Oct 08 20:54:25 crc kubenswrapper[4669]: I1008 20:54:25.397331 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793"} err="failed to get container status \"714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\": rpc error: code = NotFound desc = could not find container \"714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793\": container with ID starting with 714cce1b094db0b40ac3b788a76645047f4a51e231670b78128f0281b04d2793 not found: ID does not 
exist" Oct 08 20:54:26 crc kubenswrapper[4669]: I1008 20:54:26.107016 4669 generic.go:334] "Generic (PLEG): container finished" podID="851fe964-4456-42d7-8b3b-edc05a2e1a04" containerID="40870f2731fe6ce401dc36dbb8380050d82410bf076e0f30f4e9e56a2a9f3f5e" exitCode=0 Oct 08 20:54:26 crc kubenswrapper[4669]: I1008 20:54:26.107075 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9w899" event={"ID":"851fe964-4456-42d7-8b3b-edc05a2e1a04","Type":"ContainerDied","Data":"40870f2731fe6ce401dc36dbb8380050d82410bf076e0f30f4e9e56a2a9f3f5e"} Oct 08 20:54:27 crc kubenswrapper[4669]: I1008 20:54:27.116894 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9w899" event={"ID":"851fe964-4456-42d7-8b3b-edc05a2e1a04","Type":"ContainerStarted","Data":"a8628523c2d6ba907560ef079e6a69964e19fb77bd9bf60962015044e14b8339"} Oct 08 20:54:27 crc kubenswrapper[4669]: I1008 20:54:27.117466 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9w899" event={"ID":"851fe964-4456-42d7-8b3b-edc05a2e1a04","Type":"ContainerStarted","Data":"65260ff748a3490464073f2b57bd0941b4eefb511538f630aeccf301afc6cdf0"} Oct 08 20:54:27 crc kubenswrapper[4669]: I1008 20:54:27.117482 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9w899" event={"ID":"851fe964-4456-42d7-8b3b-edc05a2e1a04","Type":"ContainerStarted","Data":"6e4c989675ddb4b17918113609298466358378f154a20eb6c95d6d83402bb1c6"} Oct 08 20:54:27 crc kubenswrapper[4669]: I1008 20:54:27.117494 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9w899" event={"ID":"851fe964-4456-42d7-8b3b-edc05a2e1a04","Type":"ContainerStarted","Data":"c84e8d0d69fe32c76e60476fe48306984fcb968b75465964df5968aab23fb668"} Oct 08 20:54:27 crc kubenswrapper[4669]: I1008 20:54:27.117503 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-9w899" event={"ID":"851fe964-4456-42d7-8b3b-edc05a2e1a04","Type":"ContainerStarted","Data":"dd83a30b54e69945222cabce0e8590582b4c171a2948e9e43c5aad656cb9b999"} Oct 08 20:54:27 crc kubenswrapper[4669]: I1008 20:54:27.117512 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9w899" event={"ID":"851fe964-4456-42d7-8b3b-edc05a2e1a04","Type":"ContainerStarted","Data":"c5a6138b412b689c13920be03685846dd38752a1a24f182d0ed6923d0fdc4bf7"} Oct 08 20:54:30 crc kubenswrapper[4669]: I1008 20:54:30.139353 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9w899" event={"ID":"851fe964-4456-42d7-8b3b-edc05a2e1a04","Type":"ContainerStarted","Data":"1df844d8c97cd961a5ac47d040c2e0fa439b64815f6517b641ee7db9295720fb"} Oct 08 20:54:32 crc kubenswrapper[4669]: I1008 20:54:32.151886 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9w899" event={"ID":"851fe964-4456-42d7-8b3b-edc05a2e1a04","Type":"ContainerStarted","Data":"1a1978b37423dba14ff21f9f78067c47805a9a21ba92408e44361ceab6c32b69"} Oct 08 20:54:32 crc kubenswrapper[4669]: I1008 20:54:32.153610 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:32 crc kubenswrapper[4669]: I1008 20:54:32.153656 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:32 crc kubenswrapper[4669]: I1008 20:54:32.153721 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:32 crc kubenswrapper[4669]: I1008 20:54:32.214357 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:32 crc kubenswrapper[4669]: I1008 20:54:32.219644 4669 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-9w899" podStartSLOduration=8.219626373 podStartE2EDuration="8.219626373s" podCreationTimestamp="2025-10-08 20:54:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:54:32.214859542 +0000 UTC m=+591.907670275" watchObservedRunningTime="2025-10-08 20:54:32.219626373 +0000 UTC m=+591.912437076" Oct 08 20:54:32 crc kubenswrapper[4669]: I1008 20:54:32.223289 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:54:36 crc kubenswrapper[4669]: I1008 20:54:36.331389 4669 scope.go:117] "RemoveContainer" containerID="75f5b6d8d782c36aa2c69c94e49c4f5f2bcd8290971bfccd34c4de96d2fa34a3" Oct 08 20:54:36 crc kubenswrapper[4669]: E1008 20:54:36.332502 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-klx9r_openshift-multus(2433400c-98f8-490f-a566-00a330a738fe)\"" pod="openshift-multus/multus-klx9r" podUID="2433400c-98f8-490f-a566-00a330a738fe" Oct 08 20:54:49 crc kubenswrapper[4669]: I1008 20:54:49.330998 4669 scope.go:117] "RemoveContainer" containerID="75f5b6d8d782c36aa2c69c94e49c4f5f2bcd8290971bfccd34c4de96d2fa34a3" Oct 08 20:54:50 crc kubenswrapper[4669]: I1008 20:54:50.264449 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-klx9r_2433400c-98f8-490f-a566-00a330a738fe/kube-multus/2.log" Oct 08 20:54:50 crc kubenswrapper[4669]: I1008 20:54:50.264911 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-klx9r" event={"ID":"2433400c-98f8-490f-a566-00a330a738fe","Type":"ContainerStarted","Data":"433578ef6e2c8fa59f23df3cee8b4cca0b19de1baa673232c40d522e9ef81f36"} Oct 08 20:54:55 crc kubenswrapper[4669]: I1008 
20:54:55.065633 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9w899" Oct 08 20:55:02 crc kubenswrapper[4669]: I1008 20:55:02.537840 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2hbpl"] Oct 08 20:55:02 crc kubenswrapper[4669]: I1008 20:55:02.539273 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2hbpl" Oct 08 20:55:02 crc kubenswrapper[4669]: I1008 20:55:02.541471 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 08 20:55:02 crc kubenswrapper[4669]: I1008 20:55:02.551441 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2hbpl"] Oct 08 20:55:02 crc kubenswrapper[4669]: I1008 20:55:02.592143 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/29a7cdfe-bc26-4687-9359-e829d21b4137-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2hbpl\" (UID: \"29a7cdfe-bc26-4687-9359-e829d21b4137\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2hbpl" Oct 08 20:55:02 crc kubenswrapper[4669]: I1008 20:55:02.592211 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/29a7cdfe-bc26-4687-9359-e829d21b4137-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2hbpl\" (UID: \"29a7cdfe-bc26-4687-9359-e829d21b4137\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2hbpl" Oct 08 20:55:02 crc kubenswrapper[4669]: I1008 20:55:02.592240 4669 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqnd9\" (UniqueName: \"kubernetes.io/projected/29a7cdfe-bc26-4687-9359-e829d21b4137-kube-api-access-vqnd9\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2hbpl\" (UID: \"29a7cdfe-bc26-4687-9359-e829d21b4137\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2hbpl" Oct 08 20:55:02 crc kubenswrapper[4669]: I1008 20:55:02.694187 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/29a7cdfe-bc26-4687-9359-e829d21b4137-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2hbpl\" (UID: \"29a7cdfe-bc26-4687-9359-e829d21b4137\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2hbpl" Oct 08 20:55:02 crc kubenswrapper[4669]: I1008 20:55:02.694260 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/29a7cdfe-bc26-4687-9359-e829d21b4137-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2hbpl\" (UID: \"29a7cdfe-bc26-4687-9359-e829d21b4137\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2hbpl" Oct 08 20:55:02 crc kubenswrapper[4669]: I1008 20:55:02.694302 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqnd9\" (UniqueName: \"kubernetes.io/projected/29a7cdfe-bc26-4687-9359-e829d21b4137-kube-api-access-vqnd9\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2hbpl\" (UID: \"29a7cdfe-bc26-4687-9359-e829d21b4137\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2hbpl" Oct 08 20:55:02 crc kubenswrapper[4669]: I1008 20:55:02.694761 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/29a7cdfe-bc26-4687-9359-e829d21b4137-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2hbpl\" (UID: \"29a7cdfe-bc26-4687-9359-e829d21b4137\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2hbpl" Oct 08 20:55:02 crc kubenswrapper[4669]: I1008 20:55:02.695063 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/29a7cdfe-bc26-4687-9359-e829d21b4137-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2hbpl\" (UID: \"29a7cdfe-bc26-4687-9359-e829d21b4137\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2hbpl" Oct 08 20:55:02 crc kubenswrapper[4669]: I1008 20:55:02.716500 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqnd9\" (UniqueName: \"kubernetes.io/projected/29a7cdfe-bc26-4687-9359-e829d21b4137-kube-api-access-vqnd9\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2hbpl\" (UID: \"29a7cdfe-bc26-4687-9359-e829d21b4137\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2hbpl" Oct 08 20:55:02 crc kubenswrapper[4669]: I1008 20:55:02.857103 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2hbpl" Oct 08 20:55:03 crc kubenswrapper[4669]: I1008 20:55:03.238682 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2hbpl"] Oct 08 20:55:03 crc kubenswrapper[4669]: W1008 20:55:03.244782 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29a7cdfe_bc26_4687_9359_e829d21b4137.slice/crio-a98cc173c5da2877b875c4d0dabb4ef43e18b78711514ac89603e56f7da00789 WatchSource:0}: Error finding container a98cc173c5da2877b875c4d0dabb4ef43e18b78711514ac89603e56f7da00789: Status 404 returned error can't find the container with id a98cc173c5da2877b875c4d0dabb4ef43e18b78711514ac89603e56f7da00789 Oct 08 20:55:03 crc kubenswrapper[4669]: I1008 20:55:03.340109 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2hbpl" event={"ID":"29a7cdfe-bc26-4687-9359-e829d21b4137","Type":"ContainerStarted","Data":"a98cc173c5da2877b875c4d0dabb4ef43e18b78711514ac89603e56f7da00789"} Oct 08 20:55:04 crc kubenswrapper[4669]: I1008 20:55:04.347470 4669 generic.go:334] "Generic (PLEG): container finished" podID="29a7cdfe-bc26-4687-9359-e829d21b4137" containerID="731824386cc2560b5e1259b70d32116467b3f4d9ae1a92bf17226f0bb73c267b" exitCode=0 Oct 08 20:55:04 crc kubenswrapper[4669]: I1008 20:55:04.347550 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2hbpl" event={"ID":"29a7cdfe-bc26-4687-9359-e829d21b4137","Type":"ContainerDied","Data":"731824386cc2560b5e1259b70d32116467b3f4d9ae1a92bf17226f0bb73c267b"} Oct 08 20:55:06 crc kubenswrapper[4669]: I1008 20:55:06.369240 4669 generic.go:334] "Generic (PLEG): container finished" 
podID="29a7cdfe-bc26-4687-9359-e829d21b4137" containerID="c31c4fed6627f6665f024be5ab60a1943d9d61f328999e69b6516af8f3f21b76" exitCode=0 Oct 08 20:55:06 crc kubenswrapper[4669]: I1008 20:55:06.369744 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2hbpl" event={"ID":"29a7cdfe-bc26-4687-9359-e829d21b4137","Type":"ContainerDied","Data":"c31c4fed6627f6665f024be5ab60a1943d9d61f328999e69b6516af8f3f21b76"} Oct 08 20:55:07 crc kubenswrapper[4669]: I1008 20:55:07.376866 4669 generic.go:334] "Generic (PLEG): container finished" podID="29a7cdfe-bc26-4687-9359-e829d21b4137" containerID="d30dabeef06760d6a5b9fa3e7d2bf6620e5585d94faa6b7a2ef79dafc29ee97d" exitCode=0 Oct 08 20:55:07 crc kubenswrapper[4669]: I1008 20:55:07.376980 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2hbpl" event={"ID":"29a7cdfe-bc26-4687-9359-e829d21b4137","Type":"ContainerDied","Data":"d30dabeef06760d6a5b9fa3e7d2bf6620e5585d94faa6b7a2ef79dafc29ee97d"} Oct 08 20:55:08 crc kubenswrapper[4669]: I1008 20:55:08.654245 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2hbpl" Oct 08 20:55:08 crc kubenswrapper[4669]: I1008 20:55:08.677881 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/29a7cdfe-bc26-4687-9359-e829d21b4137-util\") pod \"29a7cdfe-bc26-4687-9359-e829d21b4137\" (UID: \"29a7cdfe-bc26-4687-9359-e829d21b4137\") " Oct 08 20:55:08 crc kubenswrapper[4669]: I1008 20:55:08.677931 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/29a7cdfe-bc26-4687-9359-e829d21b4137-bundle\") pod \"29a7cdfe-bc26-4687-9359-e829d21b4137\" (UID: \"29a7cdfe-bc26-4687-9359-e829d21b4137\") " Oct 08 20:55:08 crc kubenswrapper[4669]: I1008 20:55:08.677983 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqnd9\" (UniqueName: \"kubernetes.io/projected/29a7cdfe-bc26-4687-9359-e829d21b4137-kube-api-access-vqnd9\") pod \"29a7cdfe-bc26-4687-9359-e829d21b4137\" (UID: \"29a7cdfe-bc26-4687-9359-e829d21b4137\") " Oct 08 20:55:08 crc kubenswrapper[4669]: I1008 20:55:08.678985 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29a7cdfe-bc26-4687-9359-e829d21b4137-bundle" (OuterVolumeSpecName: "bundle") pod "29a7cdfe-bc26-4687-9359-e829d21b4137" (UID: "29a7cdfe-bc26-4687-9359-e829d21b4137"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 20:55:08 crc kubenswrapper[4669]: I1008 20:55:08.682753 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29a7cdfe-bc26-4687-9359-e829d21b4137-kube-api-access-vqnd9" (OuterVolumeSpecName: "kube-api-access-vqnd9") pod "29a7cdfe-bc26-4687-9359-e829d21b4137" (UID: "29a7cdfe-bc26-4687-9359-e829d21b4137"). InnerVolumeSpecName "kube-api-access-vqnd9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:55:08 crc kubenswrapper[4669]: I1008 20:55:08.698666 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29a7cdfe-bc26-4687-9359-e829d21b4137-util" (OuterVolumeSpecName: "util") pod "29a7cdfe-bc26-4687-9359-e829d21b4137" (UID: "29a7cdfe-bc26-4687-9359-e829d21b4137"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 20:55:08 crc kubenswrapper[4669]: I1008 20:55:08.779059 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqnd9\" (UniqueName: \"kubernetes.io/projected/29a7cdfe-bc26-4687-9359-e829d21b4137-kube-api-access-vqnd9\") on node \"crc\" DevicePath \"\"" Oct 08 20:55:08 crc kubenswrapper[4669]: I1008 20:55:08.779256 4669 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/29a7cdfe-bc26-4687-9359-e829d21b4137-util\") on node \"crc\" DevicePath \"\"" Oct 08 20:55:08 crc kubenswrapper[4669]: I1008 20:55:08.779331 4669 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/29a7cdfe-bc26-4687-9359-e829d21b4137-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 20:55:09 crc kubenswrapper[4669]: I1008 20:55:09.390728 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2hbpl" event={"ID":"29a7cdfe-bc26-4687-9359-e829d21b4137","Type":"ContainerDied","Data":"a98cc173c5da2877b875c4d0dabb4ef43e18b78711514ac89603e56f7da00789"} Oct 08 20:55:09 crc kubenswrapper[4669]: I1008 20:55:09.390774 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a98cc173c5da2877b875c4d0dabb4ef43e18b78711514ac89603e56f7da00789" Oct 08 20:55:09 crc kubenswrapper[4669]: I1008 20:55:09.390996 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2hbpl" Oct 08 20:55:14 crc kubenswrapper[4669]: I1008 20:55:14.128996 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-bmpqk"] Oct 08 20:55:14 crc kubenswrapper[4669]: E1008 20:55:14.129540 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29a7cdfe-bc26-4687-9359-e829d21b4137" containerName="extract" Oct 08 20:55:14 crc kubenswrapper[4669]: I1008 20:55:14.129555 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="29a7cdfe-bc26-4687-9359-e829d21b4137" containerName="extract" Oct 08 20:55:14 crc kubenswrapper[4669]: E1008 20:55:14.129580 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29a7cdfe-bc26-4687-9359-e829d21b4137" containerName="util" Oct 08 20:55:14 crc kubenswrapper[4669]: I1008 20:55:14.129587 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="29a7cdfe-bc26-4687-9359-e829d21b4137" containerName="util" Oct 08 20:55:14 crc kubenswrapper[4669]: E1008 20:55:14.129598 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29a7cdfe-bc26-4687-9359-e829d21b4137" containerName="pull" Oct 08 20:55:14 crc kubenswrapper[4669]: I1008 20:55:14.129605 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="29a7cdfe-bc26-4687-9359-e829d21b4137" containerName="pull" Oct 08 20:55:14 crc kubenswrapper[4669]: I1008 20:55:14.129712 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="29a7cdfe-bc26-4687-9359-e829d21b4137" containerName="extract" Oct 08 20:55:14 crc kubenswrapper[4669]: I1008 20:55:14.130116 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-bmpqk" Oct 08 20:55:14 crc kubenswrapper[4669]: I1008 20:55:14.132494 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-tjbph" Oct 08 20:55:14 crc kubenswrapper[4669]: I1008 20:55:14.135849 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 08 20:55:14 crc kubenswrapper[4669]: I1008 20:55:14.136211 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 08 20:55:14 crc kubenswrapper[4669]: I1008 20:55:14.151129 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-bmpqk"] Oct 08 20:55:14 crc kubenswrapper[4669]: I1008 20:55:14.255890 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j74ft\" (UniqueName: \"kubernetes.io/projected/e3e2cb2d-8c68-46d0-a639-fd839c30a680-kube-api-access-j74ft\") pod \"nmstate-operator-858ddd8f98-bmpqk\" (UID: \"e3e2cb2d-8c68-46d0-a639-fd839c30a680\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-bmpqk" Oct 08 20:55:14 crc kubenswrapper[4669]: I1008 20:55:14.357028 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j74ft\" (UniqueName: \"kubernetes.io/projected/e3e2cb2d-8c68-46d0-a639-fd839c30a680-kube-api-access-j74ft\") pod \"nmstate-operator-858ddd8f98-bmpqk\" (UID: \"e3e2cb2d-8c68-46d0-a639-fd839c30a680\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-bmpqk" Oct 08 20:55:14 crc kubenswrapper[4669]: I1008 20:55:14.378058 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j74ft\" (UniqueName: \"kubernetes.io/projected/e3e2cb2d-8c68-46d0-a639-fd839c30a680-kube-api-access-j74ft\") pod \"nmstate-operator-858ddd8f98-bmpqk\" (UID: 
\"e3e2cb2d-8c68-46d0-a639-fd839c30a680\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-bmpqk" Oct 08 20:55:14 crc kubenswrapper[4669]: I1008 20:55:14.449288 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-bmpqk" Oct 08 20:55:14 crc kubenswrapper[4669]: I1008 20:55:14.669631 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-bmpqk"] Oct 08 20:55:15 crc kubenswrapper[4669]: I1008 20:55:15.427261 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-bmpqk" event={"ID":"e3e2cb2d-8c68-46d0-a639-fd839c30a680","Type":"ContainerStarted","Data":"d3946a4839ae019648b7f9460749ecd3d0d96486a820bfc0e658fa52ca80609a"} Oct 08 20:55:17 crc kubenswrapper[4669]: I1008 20:55:17.439141 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-bmpqk" event={"ID":"e3e2cb2d-8c68-46d0-a639-fd839c30a680","Type":"ContainerStarted","Data":"c356fe35d2856128a047da0b322ec73088cdb3674795db03c5186afbb9dfa863"} Oct 08 20:55:17 crc kubenswrapper[4669]: I1008 20:55:17.463482 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-bmpqk" podStartSLOduration=1.284618454 podStartE2EDuration="3.463461616s" podCreationTimestamp="2025-10-08 20:55:14 +0000 UTC" firstStartedPulling="2025-10-08 20:55:14.675128054 +0000 UTC m=+634.367938727" lastFinishedPulling="2025-10-08 20:55:16.853971216 +0000 UTC m=+636.546781889" observedRunningTime="2025-10-08 20:55:17.460843354 +0000 UTC m=+637.153654037" watchObservedRunningTime="2025-10-08 20:55:17.463461616 +0000 UTC m=+637.156272299" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.066905 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-nqf8s"] Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 
20:55:23.068568 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-nqf8s" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.071565 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-nvvjw" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.078759 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-57cwq"] Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.079865 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-57cwq" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.082023 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.094768 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-nqf8s"] Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.097437 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-57cwq"] Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.139276 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-dr78d"] Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.140124 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-dr78d" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.202359 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/393d89be-66bc-40a8-99e2-a145ec3eebe8-nmstate-lock\") pod \"nmstate-handler-dr78d\" (UID: \"393d89be-66bc-40a8-99e2-a145ec3eebe8\") " pod="openshift-nmstate/nmstate-handler-dr78d" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.202413 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/393d89be-66bc-40a8-99e2-a145ec3eebe8-ovs-socket\") pod \"nmstate-handler-dr78d\" (UID: \"393d89be-66bc-40a8-99e2-a145ec3eebe8\") " pod="openshift-nmstate/nmstate-handler-dr78d" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.202440 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn88w\" (UniqueName: \"kubernetes.io/projected/e975e2d7-d104-4d94-a624-e2bd680e2e23-kube-api-access-vn88w\") pod \"nmstate-metrics-fdff9cb8d-nqf8s\" (UID: \"e975e2d7-d104-4d94-a624-e2bd680e2e23\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-nqf8s" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.202482 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqvn9\" (UniqueName: \"kubernetes.io/projected/393d89be-66bc-40a8-99e2-a145ec3eebe8-kube-api-access-hqvn9\") pod \"nmstate-handler-dr78d\" (UID: \"393d89be-66bc-40a8-99e2-a145ec3eebe8\") " pod="openshift-nmstate/nmstate-handler-dr78d" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.202497 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2n4jb\" (UniqueName: 
\"kubernetes.io/projected/d67c051f-00e6-45c7-aa07-401695d5798f-kube-api-access-2n4jb\") pod \"nmstate-webhook-6cdbc54649-57cwq\" (UID: \"d67c051f-00e6-45c7-aa07-401695d5798f\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-57cwq" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.202512 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/393d89be-66bc-40a8-99e2-a145ec3eebe8-dbus-socket\") pod \"nmstate-handler-dr78d\" (UID: \"393d89be-66bc-40a8-99e2-a145ec3eebe8\") " pod="openshift-nmstate/nmstate-handler-dr78d" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.202541 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/d67c051f-00e6-45c7-aa07-401695d5798f-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-57cwq\" (UID: \"d67c051f-00e6-45c7-aa07-401695d5798f\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-57cwq" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.287845 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-4ddnj"] Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.290302 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-4ddnj" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.295325 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-jj8ml" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.295348 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.295569 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.304674 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-4ddnj"] Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.305321 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/393d89be-66bc-40a8-99e2-a145ec3eebe8-nmstate-lock\") pod \"nmstate-handler-dr78d\" (UID: \"393d89be-66bc-40a8-99e2-a145ec3eebe8\") " pod="openshift-nmstate/nmstate-handler-dr78d" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.305387 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/393d89be-66bc-40a8-99e2-a145ec3eebe8-ovs-socket\") pod \"nmstate-handler-dr78d\" (UID: \"393d89be-66bc-40a8-99e2-a145ec3eebe8\") " pod="openshift-nmstate/nmstate-handler-dr78d" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.305430 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vn88w\" (UniqueName: \"kubernetes.io/projected/e975e2d7-d104-4d94-a624-e2bd680e2e23-kube-api-access-vn88w\") pod \"nmstate-metrics-fdff9cb8d-nqf8s\" (UID: \"e975e2d7-d104-4d94-a624-e2bd680e2e23\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-nqf8s" Oct 08 20:55:23 crc kubenswrapper[4669]: 
I1008 20:55:23.305496 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqvn9\" (UniqueName: \"kubernetes.io/projected/393d89be-66bc-40a8-99e2-a145ec3eebe8-kube-api-access-hqvn9\") pod \"nmstate-handler-dr78d\" (UID: \"393d89be-66bc-40a8-99e2-a145ec3eebe8\") " pod="openshift-nmstate/nmstate-handler-dr78d" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.305522 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2n4jb\" (UniqueName: \"kubernetes.io/projected/d67c051f-00e6-45c7-aa07-401695d5798f-kube-api-access-2n4jb\") pod \"nmstate-webhook-6cdbc54649-57cwq\" (UID: \"d67c051f-00e6-45c7-aa07-401695d5798f\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-57cwq" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.305566 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/393d89be-66bc-40a8-99e2-a145ec3eebe8-dbus-socket\") pod \"nmstate-handler-dr78d\" (UID: \"393d89be-66bc-40a8-99e2-a145ec3eebe8\") " pod="openshift-nmstate/nmstate-handler-dr78d" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.305587 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/d67c051f-00e6-45c7-aa07-401695d5798f-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-57cwq\" (UID: \"d67c051f-00e6-45c7-aa07-401695d5798f\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-57cwq" Oct 08 20:55:23 crc kubenswrapper[4669]: E1008 20:55:23.305750 4669 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Oct 08 20:55:23 crc kubenswrapper[4669]: E1008 20:55:23.305820 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d67c051f-00e6-45c7-aa07-401695d5798f-tls-key-pair podName:d67c051f-00e6-45c7-aa07-401695d5798f nodeName:}" 
failed. No retries permitted until 2025-10-08 20:55:23.805798308 +0000 UTC m=+643.498608981 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/d67c051f-00e6-45c7-aa07-401695d5798f-tls-key-pair") pod "nmstate-webhook-6cdbc54649-57cwq" (UID: "d67c051f-00e6-45c7-aa07-401695d5798f") : secret "openshift-nmstate-webhook" not found Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.306341 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/393d89be-66bc-40a8-99e2-a145ec3eebe8-nmstate-lock\") pod \"nmstate-handler-dr78d\" (UID: \"393d89be-66bc-40a8-99e2-a145ec3eebe8\") " pod="openshift-nmstate/nmstate-handler-dr78d" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.306388 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/393d89be-66bc-40a8-99e2-a145ec3eebe8-ovs-socket\") pod \"nmstate-handler-dr78d\" (UID: \"393d89be-66bc-40a8-99e2-a145ec3eebe8\") " pod="openshift-nmstate/nmstate-handler-dr78d" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.309782 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/393d89be-66bc-40a8-99e2-a145ec3eebe8-dbus-socket\") pod \"nmstate-handler-dr78d\" (UID: \"393d89be-66bc-40a8-99e2-a145ec3eebe8\") " pod="openshift-nmstate/nmstate-handler-dr78d" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.327470 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqvn9\" (UniqueName: \"kubernetes.io/projected/393d89be-66bc-40a8-99e2-a145ec3eebe8-kube-api-access-hqvn9\") pod \"nmstate-handler-dr78d\" (UID: \"393d89be-66bc-40a8-99e2-a145ec3eebe8\") " pod="openshift-nmstate/nmstate-handler-dr78d" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.327605 4669 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-2n4jb\" (UniqueName: \"kubernetes.io/projected/d67c051f-00e6-45c7-aa07-401695d5798f-kube-api-access-2n4jb\") pod \"nmstate-webhook-6cdbc54649-57cwq\" (UID: \"d67c051f-00e6-45c7-aa07-401695d5798f\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-57cwq" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.327610 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn88w\" (UniqueName: \"kubernetes.io/projected/e975e2d7-d104-4d94-a624-e2bd680e2e23-kube-api-access-vn88w\") pod \"nmstate-metrics-fdff9cb8d-nqf8s\" (UID: \"e975e2d7-d104-4d94-a624-e2bd680e2e23\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-nqf8s" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.396128 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-nqf8s" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.407221 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e4bf4b08-2e16-4d0a-8cc0-47cff74f5e53-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-4ddnj\" (UID: \"e4bf4b08-2e16-4d0a-8cc0-47cff74f5e53\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-4ddnj" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.407312 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e4bf4b08-2e16-4d0a-8cc0-47cff74f5e53-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-4ddnj\" (UID: \"e4bf4b08-2e16-4d0a-8cc0-47cff74f5e53\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-4ddnj" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.407335 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj4jb\" (UniqueName: 
\"kubernetes.io/projected/e4bf4b08-2e16-4d0a-8cc0-47cff74f5e53-kube-api-access-sj4jb\") pod \"nmstate-console-plugin-6b874cbd85-4ddnj\" (UID: \"e4bf4b08-2e16-4d0a-8cc0-47cff74f5e53\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-4ddnj" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.462169 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-69487d6487-zt7w9"] Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.463233 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-69487d6487-zt7w9" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.464422 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-dr78d" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.522608 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e4bf4b08-2e16-4d0a-8cc0-47cff74f5e53-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-4ddnj\" (UID: \"e4bf4b08-2e16-4d0a-8cc0-47cff74f5e53\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-4ddnj" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.522697 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e4bf4b08-2e16-4d0a-8cc0-47cff74f5e53-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-4ddnj\" (UID: \"e4bf4b08-2e16-4d0a-8cc0-47cff74f5e53\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-4ddnj" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.522727 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj4jb\" (UniqueName: \"kubernetes.io/projected/e4bf4b08-2e16-4d0a-8cc0-47cff74f5e53-kube-api-access-sj4jb\") pod \"nmstate-console-plugin-6b874cbd85-4ddnj\" (UID: \"e4bf4b08-2e16-4d0a-8cc0-47cff74f5e53\") " 
pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-4ddnj" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.524591 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e4bf4b08-2e16-4d0a-8cc0-47cff74f5e53-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-4ddnj\" (UID: \"e4bf4b08-2e16-4d0a-8cc0-47cff74f5e53\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-4ddnj" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.541096 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e4bf4b08-2e16-4d0a-8cc0-47cff74f5e53-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-4ddnj\" (UID: \"e4bf4b08-2e16-4d0a-8cc0-47cff74f5e53\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-4ddnj" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.542996 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj4jb\" (UniqueName: \"kubernetes.io/projected/e4bf4b08-2e16-4d0a-8cc0-47cff74f5e53-kube-api-access-sj4jb\") pod \"nmstate-console-plugin-6b874cbd85-4ddnj\" (UID: \"e4bf4b08-2e16-4d0a-8cc0-47cff74f5e53\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-4ddnj" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.550627 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69487d6487-zt7w9"] Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.625307 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ffe65b93-6fcd-4813-864d-e94d73941213-console-config\") pod \"console-69487d6487-zt7w9\" (UID: \"ffe65b93-6fcd-4813-864d-e94d73941213\") " pod="openshift-console/console-69487d6487-zt7w9" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.625366 4669 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ffe65b93-6fcd-4813-864d-e94d73941213-console-oauth-config\") pod \"console-69487d6487-zt7w9\" (UID: \"ffe65b93-6fcd-4813-864d-e94d73941213\") " pod="openshift-console/console-69487d6487-zt7w9" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.625391 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ffe65b93-6fcd-4813-864d-e94d73941213-trusted-ca-bundle\") pod \"console-69487d6487-zt7w9\" (UID: \"ffe65b93-6fcd-4813-864d-e94d73941213\") " pod="openshift-console/console-69487d6487-zt7w9" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.625413 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ffe65b93-6fcd-4813-864d-e94d73941213-console-serving-cert\") pod \"console-69487d6487-zt7w9\" (UID: \"ffe65b93-6fcd-4813-864d-e94d73941213\") " pod="openshift-console/console-69487d6487-zt7w9" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.625442 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ffe65b93-6fcd-4813-864d-e94d73941213-oauth-serving-cert\") pod \"console-69487d6487-zt7w9\" (UID: \"ffe65b93-6fcd-4813-864d-e94d73941213\") " pod="openshift-console/console-69487d6487-zt7w9" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.625510 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ffe65b93-6fcd-4813-864d-e94d73941213-service-ca\") pod \"console-69487d6487-zt7w9\" (UID: \"ffe65b93-6fcd-4813-864d-e94d73941213\") " pod="openshift-console/console-69487d6487-zt7w9" Oct 08 20:55:23 crc 
kubenswrapper[4669]: I1008 20:55:23.625549 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlkfz\" (UniqueName: \"kubernetes.io/projected/ffe65b93-6fcd-4813-864d-e94d73941213-kube-api-access-jlkfz\") pod \"console-69487d6487-zt7w9\" (UID: \"ffe65b93-6fcd-4813-864d-e94d73941213\") " pod="openshift-console/console-69487d6487-zt7w9" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.627828 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-4ddnj" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.730380 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ffe65b93-6fcd-4813-864d-e94d73941213-console-config\") pod \"console-69487d6487-zt7w9\" (UID: \"ffe65b93-6fcd-4813-864d-e94d73941213\") " pod="openshift-console/console-69487d6487-zt7w9" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.730679 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ffe65b93-6fcd-4813-864d-e94d73941213-console-oauth-config\") pod \"console-69487d6487-zt7w9\" (UID: \"ffe65b93-6fcd-4813-864d-e94d73941213\") " pod="openshift-console/console-69487d6487-zt7w9" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.730701 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ffe65b93-6fcd-4813-864d-e94d73941213-trusted-ca-bundle\") pod \"console-69487d6487-zt7w9\" (UID: \"ffe65b93-6fcd-4813-864d-e94d73941213\") " pod="openshift-console/console-69487d6487-zt7w9" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.730729 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ffe65b93-6fcd-4813-864d-e94d73941213-console-serving-cert\") pod \"console-69487d6487-zt7w9\" (UID: \"ffe65b93-6fcd-4813-864d-e94d73941213\") " pod="openshift-console/console-69487d6487-zt7w9" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.730772 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ffe65b93-6fcd-4813-864d-e94d73941213-oauth-serving-cert\") pod \"console-69487d6487-zt7w9\" (UID: \"ffe65b93-6fcd-4813-864d-e94d73941213\") " pod="openshift-console/console-69487d6487-zt7w9" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.730799 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ffe65b93-6fcd-4813-864d-e94d73941213-service-ca\") pod \"console-69487d6487-zt7w9\" (UID: \"ffe65b93-6fcd-4813-864d-e94d73941213\") " pod="openshift-console/console-69487d6487-zt7w9" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.730842 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlkfz\" (UniqueName: \"kubernetes.io/projected/ffe65b93-6fcd-4813-864d-e94d73941213-kube-api-access-jlkfz\") pod \"console-69487d6487-zt7w9\" (UID: \"ffe65b93-6fcd-4813-864d-e94d73941213\") " pod="openshift-console/console-69487d6487-zt7w9" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.731992 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ffe65b93-6fcd-4813-864d-e94d73941213-console-config\") pod \"console-69487d6487-zt7w9\" (UID: \"ffe65b93-6fcd-4813-864d-e94d73941213\") " pod="openshift-console/console-69487d6487-zt7w9" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.732519 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/ffe65b93-6fcd-4813-864d-e94d73941213-oauth-serving-cert\") pod \"console-69487d6487-zt7w9\" (UID: \"ffe65b93-6fcd-4813-864d-e94d73941213\") " pod="openshift-console/console-69487d6487-zt7w9" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.732900 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ffe65b93-6fcd-4813-864d-e94d73941213-service-ca\") pod \"console-69487d6487-zt7w9\" (UID: \"ffe65b93-6fcd-4813-864d-e94d73941213\") " pod="openshift-console/console-69487d6487-zt7w9" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.733661 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ffe65b93-6fcd-4813-864d-e94d73941213-trusted-ca-bundle\") pod \"console-69487d6487-zt7w9\" (UID: \"ffe65b93-6fcd-4813-864d-e94d73941213\") " pod="openshift-console/console-69487d6487-zt7w9" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.738185 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ffe65b93-6fcd-4813-864d-e94d73941213-console-serving-cert\") pod \"console-69487d6487-zt7w9\" (UID: \"ffe65b93-6fcd-4813-864d-e94d73941213\") " pod="openshift-console/console-69487d6487-zt7w9" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.738813 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ffe65b93-6fcd-4813-864d-e94d73941213-console-oauth-config\") pod \"console-69487d6487-zt7w9\" (UID: \"ffe65b93-6fcd-4813-864d-e94d73941213\") " pod="openshift-console/console-69487d6487-zt7w9" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.748827 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlkfz\" (UniqueName: 
\"kubernetes.io/projected/ffe65b93-6fcd-4813-864d-e94d73941213-kube-api-access-jlkfz\") pod \"console-69487d6487-zt7w9\" (UID: \"ffe65b93-6fcd-4813-864d-e94d73941213\") " pod="openshift-console/console-69487d6487-zt7w9" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.822263 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-4ddnj"] Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.831656 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/d67c051f-00e6-45c7-aa07-401695d5798f-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-57cwq\" (UID: \"d67c051f-00e6-45c7-aa07-401695d5798f\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-57cwq" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.836363 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/d67c051f-00e6-45c7-aa07-401695d5798f-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-57cwq\" (UID: \"d67c051f-00e6-45c7-aa07-401695d5798f\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-57cwq" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.851201 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-69487d6487-zt7w9" Oct 08 20:55:23 crc kubenswrapper[4669]: I1008 20:55:23.886300 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-nqf8s"] Oct 08 20:55:23 crc kubenswrapper[4669]: W1008 20:55:23.893010 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode975e2d7_d104_4d94_a624_e2bd680e2e23.slice/crio-d8c335f91302de557112b35e5c7987eff50bf777ce229dfcc0c03721d186bbfc WatchSource:0}: Error finding container d8c335f91302de557112b35e5c7987eff50bf777ce229dfcc0c03721d186bbfc: Status 404 returned error can't find the container with id d8c335f91302de557112b35e5c7987eff50bf777ce229dfcc0c03721d186bbfc Oct 08 20:55:24 crc kubenswrapper[4669]: I1008 20:55:24.008407 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-57cwq" Oct 08 20:55:24 crc kubenswrapper[4669]: I1008 20:55:24.060264 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69487d6487-zt7w9"] Oct 08 20:55:24 crc kubenswrapper[4669]: W1008 20:55:24.080053 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffe65b93_6fcd_4813_864d_e94d73941213.slice/crio-5fccb730c906c82e80791519e59e5b87192ffe0472ac16c65777e5146e315c8f WatchSource:0}: Error finding container 5fccb730c906c82e80791519e59e5b87192ffe0472ac16c65777e5146e315c8f: Status 404 returned error can't find the container with id 5fccb730c906c82e80791519e59e5b87192ffe0472ac16c65777e5146e315c8f Oct 08 20:55:24 crc kubenswrapper[4669]: I1008 20:55:24.181092 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-57cwq"] Oct 08 20:55:24 crc kubenswrapper[4669]: W1008 20:55:24.190165 4669 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd67c051f_00e6_45c7_aa07_401695d5798f.slice/crio-f80c6ef6d6c08c000fe2774822285f4807454b5d8b684b92ad9902441e1bd78c WatchSource:0}: Error finding container f80c6ef6d6c08c000fe2774822285f4807454b5d8b684b92ad9902441e1bd78c: Status 404 returned error can't find the container with id f80c6ef6d6c08c000fe2774822285f4807454b5d8b684b92ad9902441e1bd78c Oct 08 20:55:24 crc kubenswrapper[4669]: I1008 20:55:24.528172 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-dr78d" event={"ID":"393d89be-66bc-40a8-99e2-a145ec3eebe8","Type":"ContainerStarted","Data":"e3bf727c4fe836ab66af7c8b115bcfb9b1778f057e67b9a873d8127dc6f3281c"} Oct 08 20:55:24 crc kubenswrapper[4669]: I1008 20:55:24.529480 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69487d6487-zt7w9" event={"ID":"ffe65b93-6fcd-4813-864d-e94d73941213","Type":"ContainerStarted","Data":"aee401fc126b563be9ce4951ef676aacb5f3f7149a123ead5285f3590442d199"} Oct 08 20:55:24 crc kubenswrapper[4669]: I1008 20:55:24.529504 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69487d6487-zt7w9" event={"ID":"ffe65b93-6fcd-4813-864d-e94d73941213","Type":"ContainerStarted","Data":"5fccb730c906c82e80791519e59e5b87192ffe0472ac16c65777e5146e315c8f"} Oct 08 20:55:24 crc kubenswrapper[4669]: I1008 20:55:24.530909 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-57cwq" event={"ID":"d67c051f-00e6-45c7-aa07-401695d5798f","Type":"ContainerStarted","Data":"f80c6ef6d6c08c000fe2774822285f4807454b5d8b684b92ad9902441e1bd78c"} Oct 08 20:55:24 crc kubenswrapper[4669]: I1008 20:55:24.531632 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-nqf8s" 
event={"ID":"e975e2d7-d104-4d94-a624-e2bd680e2e23","Type":"ContainerStarted","Data":"d8c335f91302de557112b35e5c7987eff50bf777ce229dfcc0c03721d186bbfc"} Oct 08 20:55:24 crc kubenswrapper[4669]: I1008 20:55:24.532299 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-4ddnj" event={"ID":"e4bf4b08-2e16-4d0a-8cc0-47cff74f5e53","Type":"ContainerStarted","Data":"c8737501f2b398cfde7b2b683d05a794a8156105c580b34a05bfabea7a922a1a"} Oct 08 20:55:24 crc kubenswrapper[4669]: I1008 20:55:24.551491 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-69487d6487-zt7w9" podStartSLOduration=1.55147332 podStartE2EDuration="1.55147332s" podCreationTimestamp="2025-10-08 20:55:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:55:24.549598989 +0000 UTC m=+644.242409662" watchObservedRunningTime="2025-10-08 20:55:24.55147332 +0000 UTC m=+644.244283993" Oct 08 20:55:29 crc kubenswrapper[4669]: I1008 20:55:29.591097 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-4ddnj" event={"ID":"e4bf4b08-2e16-4d0a-8cc0-47cff74f5e53","Type":"ContainerStarted","Data":"c7077f45b9f3d78bcd441a9fb25279bdade41519fc2e8b384d3dbcbe95ef5e30"} Oct 08 20:55:29 crc kubenswrapper[4669]: I1008 20:55:29.594270 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-dr78d" event={"ID":"393d89be-66bc-40a8-99e2-a145ec3eebe8","Type":"ContainerStarted","Data":"a1b7521f021ba34a73a63cf73ea62f2fc918e1a691e2e244fdd0a3fa2b27a244"} Oct 08 20:55:29 crc kubenswrapper[4669]: I1008 20:55:29.595829 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-57cwq" 
event={"ID":"d67c051f-00e6-45c7-aa07-401695d5798f","Type":"ContainerStarted","Data":"c98aa22b24e9f6ae152b5e1f38bad92a311b5ba172d3ea544dfed0ac33965a77"} Oct 08 20:55:29 crc kubenswrapper[4669]: I1008 20:55:29.597323 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-nqf8s" event={"ID":"e975e2d7-d104-4d94-a624-e2bd680e2e23","Type":"ContainerStarted","Data":"bbe61ace197ee4f60f6a50c70114ef695951d8ed79524a01b5a7f512ef58fe0b"} Oct 08 20:55:30 crc kubenswrapper[4669]: I1008 20:55:30.601944 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-dr78d" Oct 08 20:55:30 crc kubenswrapper[4669]: I1008 20:55:30.619321 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-57cwq" podStartSLOduration=2.96140863 podStartE2EDuration="7.619297739s" podCreationTimestamp="2025-10-08 20:55:23 +0000 UTC" firstStartedPulling="2025-10-08 20:55:24.192866683 +0000 UTC m=+643.885677356" lastFinishedPulling="2025-10-08 20:55:28.850755772 +0000 UTC m=+648.543566465" observedRunningTime="2025-10-08 20:55:30.619030023 +0000 UTC m=+650.311840736" watchObservedRunningTime="2025-10-08 20:55:30.619297739 +0000 UTC m=+650.312108412" Oct 08 20:55:30 crc kubenswrapper[4669]: I1008 20:55:30.643318 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-dr78d" podStartSLOduration=2.3402546859999998 podStartE2EDuration="7.643294824s" podCreationTimestamp="2025-10-08 20:55:23 +0000 UTC" firstStartedPulling="2025-10-08 20:55:23.543412517 +0000 UTC m=+643.236223190" lastFinishedPulling="2025-10-08 20:55:28.846452655 +0000 UTC m=+648.539263328" observedRunningTime="2025-10-08 20:55:30.642353824 +0000 UTC m=+650.335164497" watchObservedRunningTime="2025-10-08 20:55:30.643294824 +0000 UTC m=+650.336105517" Oct 08 20:55:31 crc kubenswrapper[4669]: I1008 20:55:31.352506 4669 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-4ddnj" podStartSLOduration=4.078930957 podStartE2EDuration="8.352480954s" podCreationTimestamp="2025-10-08 20:55:23 +0000 UTC" firstStartedPulling="2025-10-08 20:55:23.831728375 +0000 UTC m=+643.524539048" lastFinishedPulling="2025-10-08 20:55:28.105278362 +0000 UTC m=+647.798089045" observedRunningTime="2025-10-08 20:55:30.660008637 +0000 UTC m=+650.352819310" watchObservedRunningTime="2025-10-08 20:55:31.352480954 +0000 UTC m=+651.045291667" Oct 08 20:55:33 crc kubenswrapper[4669]: I1008 20:55:33.851799 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-69487d6487-zt7w9" Oct 08 20:55:33 crc kubenswrapper[4669]: I1008 20:55:33.853194 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-69487d6487-zt7w9" Oct 08 20:55:33 crc kubenswrapper[4669]: I1008 20:55:33.858256 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-69487d6487-zt7w9" Oct 08 20:55:34 crc kubenswrapper[4669]: I1008 20:55:34.008968 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-57cwq" Oct 08 20:55:34 crc kubenswrapper[4669]: I1008 20:55:34.622196 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-nqf8s" event={"ID":"e975e2d7-d104-4d94-a624-e2bd680e2e23","Type":"ContainerStarted","Data":"0908be4180a9731c33524a77a8fd50935e1d5a7fb9ce9e8241337c3832ff9b0a"} Oct 08 20:55:34 crc kubenswrapper[4669]: I1008 20:55:34.625284 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-69487d6487-zt7w9" Oct 08 20:55:34 crc kubenswrapper[4669]: I1008 20:55:34.637222 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-nqf8s" podStartSLOduration=1.8895937219999999 podStartE2EDuration="11.63720303s" podCreationTimestamp="2025-10-08 20:55:23 +0000 UTC" firstStartedPulling="2025-10-08 20:55:23.894767015 +0000 UTC m=+643.587577688" lastFinishedPulling="2025-10-08 20:55:33.642376293 +0000 UTC m=+653.335186996" observedRunningTime="2025-10-08 20:55:34.636912613 +0000 UTC m=+654.329723286" watchObservedRunningTime="2025-10-08 20:55:34.63720303 +0000 UTC m=+654.330013693" Oct 08 20:55:34 crc kubenswrapper[4669]: I1008 20:55:34.698932 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-4m5k5"] Oct 08 20:55:38 crc kubenswrapper[4669]: I1008 20:55:38.491950 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-dr78d" Oct 08 20:55:44 crc kubenswrapper[4669]: I1008 20:55:44.016782 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-57cwq" Oct 08 20:55:59 crc kubenswrapper[4669]: I1008 20:55:59.748426 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-4m5k5" podUID="4adb69f9-4f03-46c0-bd24-fec72c2b1fd9" containerName="console" containerID="cri-o://4c312135f9d7149d696c5d46d27f12498d71672f5d5fdc44eea3ba35a3284272" gracePeriod=15 Oct 08 20:56:00 crc kubenswrapper[4669]: I1008 20:56:00.395008 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-4m5k5_4adb69f9-4f03-46c0-bd24-fec72c2b1fd9/console/0.log" Oct 08 20:56:00 crc kubenswrapper[4669]: I1008 20:56:00.395354 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-4m5k5" Oct 08 20:56:00 crc kubenswrapper[4669]: I1008 20:56:00.490174 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2n94nt"] Oct 08 20:56:00 crc kubenswrapper[4669]: E1008 20:56:00.491072 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4adb69f9-4f03-46c0-bd24-fec72c2b1fd9" containerName="console" Oct 08 20:56:00 crc kubenswrapper[4669]: I1008 20:56:00.491198 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="4adb69f9-4f03-46c0-bd24-fec72c2b1fd9" containerName="console" Oct 08 20:56:00 crc kubenswrapper[4669]: I1008 20:56:00.491807 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="4adb69f9-4f03-46c0-bd24-fec72c2b1fd9" containerName="console" Oct 08 20:56:00 crc kubenswrapper[4669]: I1008 20:56:00.494265 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2n94nt" Oct 08 20:56:00 crc kubenswrapper[4669]: I1008 20:56:00.499080 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2n94nt"] Oct 08 20:56:00 crc kubenswrapper[4669]: I1008 20:56:00.499440 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 08 20:56:00 crc kubenswrapper[4669]: I1008 20:56:00.505081 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4adb69f9-4f03-46c0-bd24-fec72c2b1fd9-console-oauth-config\") pod \"4adb69f9-4f03-46c0-bd24-fec72c2b1fd9\" (UID: \"4adb69f9-4f03-46c0-bd24-fec72c2b1fd9\") " Oct 08 20:56:00 crc kubenswrapper[4669]: I1008 20:56:00.505153 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4adb69f9-4f03-46c0-bd24-fec72c2b1fd9-oauth-serving-cert\") pod \"4adb69f9-4f03-46c0-bd24-fec72c2b1fd9\" (UID: \"4adb69f9-4f03-46c0-bd24-fec72c2b1fd9\") " Oct 08 20:56:00 crc kubenswrapper[4669]: I1008 20:56:00.505191 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4adb69f9-4f03-46c0-bd24-fec72c2b1fd9-console-config\") pod \"4adb69f9-4f03-46c0-bd24-fec72c2b1fd9\" (UID: \"4adb69f9-4f03-46c0-bd24-fec72c2b1fd9\") " Oct 08 20:56:00 crc kubenswrapper[4669]: I1008 20:56:00.505236 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4adb69f9-4f03-46c0-bd24-fec72c2b1fd9-console-serving-cert\") pod \"4adb69f9-4f03-46c0-bd24-fec72c2b1fd9\" (UID: \"4adb69f9-4f03-46c0-bd24-fec72c2b1fd9\") " Oct 08 20:56:00 crc kubenswrapper[4669]: I1008 20:56:00.505265 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4adb69f9-4f03-46c0-bd24-fec72c2b1fd9-service-ca\") pod \"4adb69f9-4f03-46c0-bd24-fec72c2b1fd9\" (UID: \"4adb69f9-4f03-46c0-bd24-fec72c2b1fd9\") " Oct 08 20:56:00 crc kubenswrapper[4669]: I1008 20:56:00.505286 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8krmk\" (UniqueName: \"kubernetes.io/projected/4adb69f9-4f03-46c0-bd24-fec72c2b1fd9-kube-api-access-8krmk\") pod \"4adb69f9-4f03-46c0-bd24-fec72c2b1fd9\" (UID: \"4adb69f9-4f03-46c0-bd24-fec72c2b1fd9\") " Oct 08 20:56:00 crc kubenswrapper[4669]: I1008 20:56:00.505303 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4adb69f9-4f03-46c0-bd24-fec72c2b1fd9-trusted-ca-bundle\") pod \"4adb69f9-4f03-46c0-bd24-fec72c2b1fd9\" (UID: 
\"4adb69f9-4f03-46c0-bd24-fec72c2b1fd9\") " Oct 08 20:56:00 crc kubenswrapper[4669]: I1008 20:56:00.506208 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4adb69f9-4f03-46c0-bd24-fec72c2b1fd9-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "4adb69f9-4f03-46c0-bd24-fec72c2b1fd9" (UID: "4adb69f9-4f03-46c0-bd24-fec72c2b1fd9"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:56:00 crc kubenswrapper[4669]: I1008 20:56:00.506231 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4adb69f9-4f03-46c0-bd24-fec72c2b1fd9-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "4adb69f9-4f03-46c0-bd24-fec72c2b1fd9" (UID: "4adb69f9-4f03-46c0-bd24-fec72c2b1fd9"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:56:00 crc kubenswrapper[4669]: I1008 20:56:00.506220 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4adb69f9-4f03-46c0-bd24-fec72c2b1fd9-service-ca" (OuterVolumeSpecName: "service-ca") pod "4adb69f9-4f03-46c0-bd24-fec72c2b1fd9" (UID: "4adb69f9-4f03-46c0-bd24-fec72c2b1fd9"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:56:00 crc kubenswrapper[4669]: I1008 20:56:00.506249 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4adb69f9-4f03-46c0-bd24-fec72c2b1fd9-console-config" (OuterVolumeSpecName: "console-config") pod "4adb69f9-4f03-46c0-bd24-fec72c2b1fd9" (UID: "4adb69f9-4f03-46c0-bd24-fec72c2b1fd9"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:56:00 crc kubenswrapper[4669]: I1008 20:56:00.510968 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4adb69f9-4f03-46c0-bd24-fec72c2b1fd9-kube-api-access-8krmk" (OuterVolumeSpecName: "kube-api-access-8krmk") pod "4adb69f9-4f03-46c0-bd24-fec72c2b1fd9" (UID: "4adb69f9-4f03-46c0-bd24-fec72c2b1fd9"). InnerVolumeSpecName "kube-api-access-8krmk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:56:00 crc kubenswrapper[4669]: I1008 20:56:00.511393 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4adb69f9-4f03-46c0-bd24-fec72c2b1fd9-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "4adb69f9-4f03-46c0-bd24-fec72c2b1fd9" (UID: "4adb69f9-4f03-46c0-bd24-fec72c2b1fd9"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:56:00 crc kubenswrapper[4669]: I1008 20:56:00.512829 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4adb69f9-4f03-46c0-bd24-fec72c2b1fd9-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "4adb69f9-4f03-46c0-bd24-fec72c2b1fd9" (UID: "4adb69f9-4f03-46c0-bd24-fec72c2b1fd9"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:56:00 crc kubenswrapper[4669]: I1008 20:56:00.606558 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a10fc4a6-e10c-481f-8547-cf5d9669d34d-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2n94nt\" (UID: \"a10fc4a6-e10c-481f-8547-cf5d9669d34d\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2n94nt" Oct 08 20:56:00 crc kubenswrapper[4669]: I1008 20:56:00.606600 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a10fc4a6-e10c-481f-8547-cf5d9669d34d-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2n94nt\" (UID: \"a10fc4a6-e10c-481f-8547-cf5d9669d34d\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2n94nt" Oct 08 20:56:00 crc kubenswrapper[4669]: I1008 20:56:00.606625 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqnqp\" (UniqueName: \"kubernetes.io/projected/a10fc4a6-e10c-481f-8547-cf5d9669d34d-kube-api-access-zqnqp\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2n94nt\" (UID: \"a10fc4a6-e10c-481f-8547-cf5d9669d34d\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2n94nt" Oct 08 20:56:00 crc kubenswrapper[4669]: I1008 20:56:00.606691 4669 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4adb69f9-4f03-46c0-bd24-fec72c2b1fd9-service-ca\") on node \"crc\" DevicePath \"\"" Oct 08 20:56:00 crc kubenswrapper[4669]: I1008 20:56:00.606710 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8krmk\" (UniqueName: 
\"kubernetes.io/projected/4adb69f9-4f03-46c0-bd24-fec72c2b1fd9-kube-api-access-8krmk\") on node \"crc\" DevicePath \"\"" Oct 08 20:56:00 crc kubenswrapper[4669]: I1008 20:56:00.606720 4669 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4adb69f9-4f03-46c0-bd24-fec72c2b1fd9-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 20:56:00 crc kubenswrapper[4669]: I1008 20:56:00.606729 4669 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4adb69f9-4f03-46c0-bd24-fec72c2b1fd9-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 08 20:56:00 crc kubenswrapper[4669]: I1008 20:56:00.606738 4669 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4adb69f9-4f03-46c0-bd24-fec72c2b1fd9-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 20:56:00 crc kubenswrapper[4669]: I1008 20:56:00.606749 4669 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4adb69f9-4f03-46c0-bd24-fec72c2b1fd9-console-config\") on node \"crc\" DevicePath \"\"" Oct 08 20:56:00 crc kubenswrapper[4669]: I1008 20:56:00.606758 4669 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4adb69f9-4f03-46c0-bd24-fec72c2b1fd9-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 20:56:00 crc kubenswrapper[4669]: I1008 20:56:00.708220 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a10fc4a6-e10c-481f-8547-cf5d9669d34d-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2n94nt\" (UID: \"a10fc4a6-e10c-481f-8547-cf5d9669d34d\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2n94nt" Oct 08 20:56:00 crc kubenswrapper[4669]: I1008 
20:56:00.708279 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a10fc4a6-e10c-481f-8547-cf5d9669d34d-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2n94nt\" (UID: \"a10fc4a6-e10c-481f-8547-cf5d9669d34d\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2n94nt" Oct 08 20:56:00 crc kubenswrapper[4669]: I1008 20:56:00.708317 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqnqp\" (UniqueName: \"kubernetes.io/projected/a10fc4a6-e10c-481f-8547-cf5d9669d34d-kube-api-access-zqnqp\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2n94nt\" (UID: \"a10fc4a6-e10c-481f-8547-cf5d9669d34d\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2n94nt" Oct 08 20:56:00 crc kubenswrapper[4669]: I1008 20:56:00.708851 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a10fc4a6-e10c-481f-8547-cf5d9669d34d-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2n94nt\" (UID: \"a10fc4a6-e10c-481f-8547-cf5d9669d34d\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2n94nt" Oct 08 20:56:00 crc kubenswrapper[4669]: I1008 20:56:00.709187 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a10fc4a6-e10c-481f-8547-cf5d9669d34d-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2n94nt\" (UID: \"a10fc4a6-e10c-481f-8547-cf5d9669d34d\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2n94nt" Oct 08 20:56:00 crc kubenswrapper[4669]: I1008 20:56:00.727936 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqnqp\" (UniqueName: 
\"kubernetes.io/projected/a10fc4a6-e10c-481f-8547-cf5d9669d34d-kube-api-access-zqnqp\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2n94nt\" (UID: \"a10fc4a6-e10c-481f-8547-cf5d9669d34d\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2n94nt" Oct 08 20:56:00 crc kubenswrapper[4669]: I1008 20:56:00.802603 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-4m5k5_4adb69f9-4f03-46c0-bd24-fec72c2b1fd9/console/0.log" Oct 08 20:56:00 crc kubenswrapper[4669]: I1008 20:56:00.803710 4669 generic.go:334] "Generic (PLEG): container finished" podID="4adb69f9-4f03-46c0-bd24-fec72c2b1fd9" containerID="4c312135f9d7149d696c5d46d27f12498d71672f5d5fdc44eea3ba35a3284272" exitCode=2 Oct 08 20:56:00 crc kubenswrapper[4669]: I1008 20:56:00.803842 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-4m5k5" Oct 08 20:56:00 crc kubenswrapper[4669]: I1008 20:56:00.803839 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-4m5k5" event={"ID":"4adb69f9-4f03-46c0-bd24-fec72c2b1fd9","Type":"ContainerDied","Data":"4c312135f9d7149d696c5d46d27f12498d71672f5d5fdc44eea3ba35a3284272"} Oct 08 20:56:00 crc kubenswrapper[4669]: I1008 20:56:00.804016 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-4m5k5" event={"ID":"4adb69f9-4f03-46c0-bd24-fec72c2b1fd9","Type":"ContainerDied","Data":"876acf8b4b48298878be1a417e99d2ff1ad43768c47dbd52266802959c086994"} Oct 08 20:56:00 crc kubenswrapper[4669]: I1008 20:56:00.804051 4669 scope.go:117] "RemoveContainer" containerID="4c312135f9d7149d696c5d46d27f12498d71672f5d5fdc44eea3ba35a3284272" Oct 08 20:56:00 crc kubenswrapper[4669]: I1008 20:56:00.815717 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2n94nt"
Oct 08 20:56:00 crc kubenswrapper[4669]: I1008 20:56:00.832443 4669 scope.go:117] "RemoveContainer" containerID="4c312135f9d7149d696c5d46d27f12498d71672f5d5fdc44eea3ba35a3284272"
Oct 08 20:56:00 crc kubenswrapper[4669]: E1008 20:56:00.833476 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c312135f9d7149d696c5d46d27f12498d71672f5d5fdc44eea3ba35a3284272\": container with ID starting with 4c312135f9d7149d696c5d46d27f12498d71672f5d5fdc44eea3ba35a3284272 not found: ID does not exist" containerID="4c312135f9d7149d696c5d46d27f12498d71672f5d5fdc44eea3ba35a3284272"
Oct 08 20:56:00 crc kubenswrapper[4669]: I1008 20:56:00.833505 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c312135f9d7149d696c5d46d27f12498d71672f5d5fdc44eea3ba35a3284272"} err="failed to get container status \"4c312135f9d7149d696c5d46d27f12498d71672f5d5fdc44eea3ba35a3284272\": rpc error: code = NotFound desc = could not find container \"4c312135f9d7149d696c5d46d27f12498d71672f5d5fdc44eea3ba35a3284272\": container with ID starting with 4c312135f9d7149d696c5d46d27f12498d71672f5d5fdc44eea3ba35a3284272 not found: ID does not exist"
Oct 08 20:56:00 crc kubenswrapper[4669]: I1008 20:56:00.851686 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-4m5k5"]
Oct 08 20:56:00 crc kubenswrapper[4669]: I1008 20:56:00.854135 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-4m5k5"]
Oct 08 20:56:01 crc kubenswrapper[4669]: I1008 20:56:01.126708 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2n94nt"]
Oct 08 20:56:01 crc kubenswrapper[4669]: I1008 20:56:01.339175 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4adb69f9-4f03-46c0-bd24-fec72c2b1fd9" path="/var/lib/kubelet/pods/4adb69f9-4f03-46c0-bd24-fec72c2b1fd9/volumes"
Oct 08 20:56:01 crc kubenswrapper[4669]: I1008 20:56:01.818811 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2n94nt" event={"ID":"a10fc4a6-e10c-481f-8547-cf5d9669d34d","Type":"ContainerStarted","Data":"282f71a2722b52553c8bcf7858bf462a78a68ae61c861e1788fc60e6f44e5010"}
Oct 08 20:56:03 crc kubenswrapper[4669]: I1008 20:56:03.844827 4669 generic.go:334] "Generic (PLEG): container finished" podID="a10fc4a6-e10c-481f-8547-cf5d9669d34d" containerID="1bcfdc83f37e6cce1be3d67a02b42e7271604dcb1887a59779f27e06c11d3fed" exitCode=0
Oct 08 20:56:03 crc kubenswrapper[4669]: I1008 20:56:03.844900 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2n94nt" event={"ID":"a10fc4a6-e10c-481f-8547-cf5d9669d34d","Type":"ContainerDied","Data":"1bcfdc83f37e6cce1be3d67a02b42e7271604dcb1887a59779f27e06c11d3fed"}
Oct 08 20:56:06 crc kubenswrapper[4669]: I1008 20:56:06.869848 4669 generic.go:334] "Generic (PLEG): container finished" podID="a10fc4a6-e10c-481f-8547-cf5d9669d34d" containerID="3d5ddb8541e3bd7e73ed7b1232db8e8325e3533db6dba2cbfb4d4f5bbd5c201d" exitCode=0
Oct 08 20:56:06 crc kubenswrapper[4669]: I1008 20:56:06.870090 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2n94nt" event={"ID":"a10fc4a6-e10c-481f-8547-cf5d9669d34d","Type":"ContainerDied","Data":"3d5ddb8541e3bd7e73ed7b1232db8e8325e3533db6dba2cbfb4d4f5bbd5c201d"}
Oct 08 20:56:07 crc kubenswrapper[4669]: I1008 20:56:07.881188 4669 generic.go:334] "Generic (PLEG): container finished" podID="a10fc4a6-e10c-481f-8547-cf5d9669d34d" containerID="0a2fdbd798cbd0ade9c27f9a5a8abcbef6a54b1aec36314da610c01b21ac9db5" exitCode=0
Oct 08 20:56:07 crc kubenswrapper[4669]: I1008 20:56:07.881268 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2n94nt" event={"ID":"a10fc4a6-e10c-481f-8547-cf5d9669d34d","Type":"ContainerDied","Data":"0a2fdbd798cbd0ade9c27f9a5a8abcbef6a54b1aec36314da610c01b21ac9db5"}
Oct 08 20:56:09 crc kubenswrapper[4669]: I1008 20:56:09.131505 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2n94nt"
Oct 08 20:56:09 crc kubenswrapper[4669]: I1008 20:56:09.237604 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqnqp\" (UniqueName: \"kubernetes.io/projected/a10fc4a6-e10c-481f-8547-cf5d9669d34d-kube-api-access-zqnqp\") pod \"a10fc4a6-e10c-481f-8547-cf5d9669d34d\" (UID: \"a10fc4a6-e10c-481f-8547-cf5d9669d34d\") "
Oct 08 20:56:09 crc kubenswrapper[4669]: I1008 20:56:09.237675 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a10fc4a6-e10c-481f-8547-cf5d9669d34d-bundle\") pod \"a10fc4a6-e10c-481f-8547-cf5d9669d34d\" (UID: \"a10fc4a6-e10c-481f-8547-cf5d9669d34d\") "
Oct 08 20:56:09 crc kubenswrapper[4669]: I1008 20:56:09.237733 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a10fc4a6-e10c-481f-8547-cf5d9669d34d-util\") pod \"a10fc4a6-e10c-481f-8547-cf5d9669d34d\" (UID: \"a10fc4a6-e10c-481f-8547-cf5d9669d34d\") "
Oct 08 20:56:09 crc kubenswrapper[4669]: I1008 20:56:09.238764 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a10fc4a6-e10c-481f-8547-cf5d9669d34d-bundle" (OuterVolumeSpecName: "bundle") pod "a10fc4a6-e10c-481f-8547-cf5d9669d34d" (UID: "a10fc4a6-e10c-481f-8547-cf5d9669d34d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 20:56:09 crc kubenswrapper[4669]: I1008 20:56:09.247083 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a10fc4a6-e10c-481f-8547-cf5d9669d34d-kube-api-access-zqnqp" (OuterVolumeSpecName: "kube-api-access-zqnqp") pod "a10fc4a6-e10c-481f-8547-cf5d9669d34d" (UID: "a10fc4a6-e10c-481f-8547-cf5d9669d34d"). InnerVolumeSpecName "kube-api-access-zqnqp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 20:56:09 crc kubenswrapper[4669]: I1008 20:56:09.265254 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a10fc4a6-e10c-481f-8547-cf5d9669d34d-util" (OuterVolumeSpecName: "util") pod "a10fc4a6-e10c-481f-8547-cf5d9669d34d" (UID: "a10fc4a6-e10c-481f-8547-cf5d9669d34d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 20:56:09 crc kubenswrapper[4669]: I1008 20:56:09.339336 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqnqp\" (UniqueName: \"kubernetes.io/projected/a10fc4a6-e10c-481f-8547-cf5d9669d34d-kube-api-access-zqnqp\") on node \"crc\" DevicePath \"\""
Oct 08 20:56:09 crc kubenswrapper[4669]: I1008 20:56:09.339835 4669 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a10fc4a6-e10c-481f-8547-cf5d9669d34d-bundle\") on node \"crc\" DevicePath \"\""
Oct 08 20:56:09 crc kubenswrapper[4669]: I1008 20:56:09.339851 4669 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a10fc4a6-e10c-481f-8547-cf5d9669d34d-util\") on node \"crc\" DevicePath \"\""
Oct 08 20:56:09 crc kubenswrapper[4669]: I1008 20:56:09.899891 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2n94nt" event={"ID":"a10fc4a6-e10c-481f-8547-cf5d9669d34d","Type":"ContainerDied","Data":"282f71a2722b52553c8bcf7858bf462a78a68ae61c861e1788fc60e6f44e5010"}
Oct 08 20:56:09 crc kubenswrapper[4669]: I1008 20:56:09.899958 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="282f71a2722b52553c8bcf7858bf462a78a68ae61c861e1788fc60e6f44e5010"
Oct 08 20:56:09 crc kubenswrapper[4669]: I1008 20:56:09.900292 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2n94nt"
Oct 08 20:56:13 crc kubenswrapper[4669]: I1008 20:56:13.186676 4669 patch_prober.go:28] interesting pod/machine-config-daemon-hw2kf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 08 20:56:13 crc kubenswrapper[4669]: I1008 20:56:13.186961 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 08 20:56:21 crc kubenswrapper[4669]: I1008 20:56:20.999641 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6f6598d79f-8tfml"]
Oct 08 20:56:21 crc kubenswrapper[4669]: E1008 20:56:21.000274 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a10fc4a6-e10c-481f-8547-cf5d9669d34d" containerName="pull"
Oct 08 20:56:21 crc kubenswrapper[4669]: I1008 20:56:21.000285 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="a10fc4a6-e10c-481f-8547-cf5d9669d34d" containerName="pull"
Oct 08 20:56:21 crc kubenswrapper[4669]: E1008 20:56:21.000301 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a10fc4a6-e10c-481f-8547-cf5d9669d34d" containerName="extract"
Oct 08 20:56:21 crc kubenswrapper[4669]: I1008 20:56:21.000307 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="a10fc4a6-e10c-481f-8547-cf5d9669d34d" containerName="extract"
Oct 08 20:56:21 crc kubenswrapper[4669]: E1008 20:56:21.000317 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a10fc4a6-e10c-481f-8547-cf5d9669d34d" containerName="util"
Oct 08 20:56:21 crc kubenswrapper[4669]: I1008 20:56:21.000322 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="a10fc4a6-e10c-481f-8547-cf5d9669d34d" containerName="util"
Oct 08 20:56:21 crc kubenswrapper[4669]: I1008 20:56:21.000422 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="a10fc4a6-e10c-481f-8547-cf5d9669d34d" containerName="extract"
Oct 08 20:56:21 crc kubenswrapper[4669]: I1008 20:56:21.000762 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6f6598d79f-8tfml"
Oct 08 20:56:21 crc kubenswrapper[4669]: I1008 20:56:21.005914 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Oct 08 20:56:21 crc kubenswrapper[4669]: I1008 20:56:21.006025 4669 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Oct 08 20:56:21 crc kubenswrapper[4669]: I1008 20:56:21.006052 4669 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Oct 08 20:56:21 crc kubenswrapper[4669]: I1008 20:56:21.006329 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Oct 08 20:56:21 crc kubenswrapper[4669]: I1008 20:56:21.012595 4669 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-b8vnk"
Oct 08 20:56:21 crc kubenswrapper[4669]: I1008 20:56:21.017763 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6f6598d79f-8tfml"]
Oct 08 20:56:21 crc kubenswrapper[4669]: I1008 20:56:21.083161 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqhv7\" (UniqueName: \"kubernetes.io/projected/0304e30c-72d1-4544-9f05-fb7acc1c3c61-kube-api-access-gqhv7\") pod \"metallb-operator-controller-manager-6f6598d79f-8tfml\" (UID: \"0304e30c-72d1-4544-9f05-fb7acc1c3c61\") " pod="metallb-system/metallb-operator-controller-manager-6f6598d79f-8tfml"
Oct 08 20:56:21 crc kubenswrapper[4669]: I1008 20:56:21.083204 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0304e30c-72d1-4544-9f05-fb7acc1c3c61-webhook-cert\") pod \"metallb-operator-controller-manager-6f6598d79f-8tfml\" (UID: \"0304e30c-72d1-4544-9f05-fb7acc1c3c61\") " pod="metallb-system/metallb-operator-controller-manager-6f6598d79f-8tfml"
Oct 08 20:56:21 crc kubenswrapper[4669]: I1008 20:56:21.083228 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0304e30c-72d1-4544-9f05-fb7acc1c3c61-apiservice-cert\") pod \"metallb-operator-controller-manager-6f6598d79f-8tfml\" (UID: \"0304e30c-72d1-4544-9f05-fb7acc1c3c61\") " pod="metallb-system/metallb-operator-controller-manager-6f6598d79f-8tfml"
Oct 08 20:56:21 crc kubenswrapper[4669]: I1008 20:56:21.185032 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqhv7\" (UniqueName: \"kubernetes.io/projected/0304e30c-72d1-4544-9f05-fb7acc1c3c61-kube-api-access-gqhv7\") pod \"metallb-operator-controller-manager-6f6598d79f-8tfml\" (UID: \"0304e30c-72d1-4544-9f05-fb7acc1c3c61\") " pod="metallb-system/metallb-operator-controller-manager-6f6598d79f-8tfml"
Oct 08 20:56:21 crc kubenswrapper[4669]: I1008 20:56:21.185257 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0304e30c-72d1-4544-9f05-fb7acc1c3c61-webhook-cert\") pod \"metallb-operator-controller-manager-6f6598d79f-8tfml\" (UID: \"0304e30c-72d1-4544-9f05-fb7acc1c3c61\") " pod="metallb-system/metallb-operator-controller-manager-6f6598d79f-8tfml"
Oct 08 20:56:21 crc kubenswrapper[4669]: I1008 20:56:21.185333 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0304e30c-72d1-4544-9f05-fb7acc1c3c61-apiservice-cert\") pod \"metallb-operator-controller-manager-6f6598d79f-8tfml\" (UID: \"0304e30c-72d1-4544-9f05-fb7acc1c3c61\") " pod="metallb-system/metallb-operator-controller-manager-6f6598d79f-8tfml"
Oct 08 20:56:21 crc kubenswrapper[4669]: I1008 20:56:21.192327 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0304e30c-72d1-4544-9f05-fb7acc1c3c61-apiservice-cert\") pod \"metallb-operator-controller-manager-6f6598d79f-8tfml\" (UID: \"0304e30c-72d1-4544-9f05-fb7acc1c3c61\") " pod="metallb-system/metallb-operator-controller-manager-6f6598d79f-8tfml"
Oct 08 20:56:21 crc kubenswrapper[4669]: I1008 20:56:21.195991 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0304e30c-72d1-4544-9f05-fb7acc1c3c61-webhook-cert\") pod \"metallb-operator-controller-manager-6f6598d79f-8tfml\" (UID: \"0304e30c-72d1-4544-9f05-fb7acc1c3c61\") " pod="metallb-system/metallb-operator-controller-manager-6f6598d79f-8tfml"
Oct 08 20:56:21 crc kubenswrapper[4669]: I1008 20:56:21.200948 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqhv7\" (UniqueName: \"kubernetes.io/projected/0304e30c-72d1-4544-9f05-fb7acc1c3c61-kube-api-access-gqhv7\") pod \"metallb-operator-controller-manager-6f6598d79f-8tfml\" (UID: \"0304e30c-72d1-4544-9f05-fb7acc1c3c61\") " pod="metallb-system/metallb-operator-controller-manager-6f6598d79f-8tfml"
Oct 08 20:56:21 crc kubenswrapper[4669]: I1008 20:56:21.240968 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-5b6f8f8d99-l255w"]
Oct 08 20:56:21 crc kubenswrapper[4669]: I1008 20:56:21.241660 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5b6f8f8d99-l255w"
Oct 08 20:56:21 crc kubenswrapper[4669]: I1008 20:56:21.243588 4669 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Oct 08 20:56:21 crc kubenswrapper[4669]: I1008 20:56:21.243626 4669 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert"
Oct 08 20:56:21 crc kubenswrapper[4669]: I1008 20:56:21.244073 4669 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-5zcgc"
Oct 08 20:56:21 crc kubenswrapper[4669]: I1008 20:56:21.255056 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5b6f8f8d99-l255w"]
Oct 08 20:56:21 crc kubenswrapper[4669]: I1008 20:56:21.286962 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dbe12456-371e-4274-911b-264b27260e4e-apiservice-cert\") pod \"metallb-operator-webhook-server-5b6f8f8d99-l255w\" (UID: \"dbe12456-371e-4274-911b-264b27260e4e\") " pod="metallb-system/metallb-operator-webhook-server-5b6f8f8d99-l255w"
Oct 08 20:56:21 crc kubenswrapper[4669]: I1008 20:56:21.287033 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlpsv\" (UniqueName: \"kubernetes.io/projected/dbe12456-371e-4274-911b-264b27260e4e-kube-api-access-tlpsv\") pod \"metallb-operator-webhook-server-5b6f8f8d99-l255w\" (UID: \"dbe12456-371e-4274-911b-264b27260e4e\") " pod="metallb-system/metallb-operator-webhook-server-5b6f8f8d99-l255w"
Oct 08 20:56:21 crc kubenswrapper[4669]: I1008 20:56:21.287076 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dbe12456-371e-4274-911b-264b27260e4e-webhook-cert\") pod \"metallb-operator-webhook-server-5b6f8f8d99-l255w\" (UID: \"dbe12456-371e-4274-911b-264b27260e4e\") " pod="metallb-system/metallb-operator-webhook-server-5b6f8f8d99-l255w"
Oct 08 20:56:21 crc kubenswrapper[4669]: I1008 20:56:21.315710 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6f6598d79f-8tfml"
Oct 08 20:56:21 crc kubenswrapper[4669]: I1008 20:56:21.388125 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dbe12456-371e-4274-911b-264b27260e4e-apiservice-cert\") pod \"metallb-operator-webhook-server-5b6f8f8d99-l255w\" (UID: \"dbe12456-371e-4274-911b-264b27260e4e\") " pod="metallb-system/metallb-operator-webhook-server-5b6f8f8d99-l255w"
Oct 08 20:56:21 crc kubenswrapper[4669]: I1008 20:56:21.388401 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlpsv\" (UniqueName: \"kubernetes.io/projected/dbe12456-371e-4274-911b-264b27260e4e-kube-api-access-tlpsv\") pod \"metallb-operator-webhook-server-5b6f8f8d99-l255w\" (UID: \"dbe12456-371e-4274-911b-264b27260e4e\") " pod="metallb-system/metallb-operator-webhook-server-5b6f8f8d99-l255w"
Oct 08 20:56:21 crc kubenswrapper[4669]: I1008 20:56:21.388452 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dbe12456-371e-4274-911b-264b27260e4e-webhook-cert\") pod \"metallb-operator-webhook-server-5b6f8f8d99-l255w\" (UID: \"dbe12456-371e-4274-911b-264b27260e4e\") " pod="metallb-system/metallb-operator-webhook-server-5b6f8f8d99-l255w"
Oct 08 20:56:21 crc kubenswrapper[4669]: I1008 20:56:21.393508 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dbe12456-371e-4274-911b-264b27260e4e-webhook-cert\") pod \"metallb-operator-webhook-server-5b6f8f8d99-l255w\" (UID: \"dbe12456-371e-4274-911b-264b27260e4e\") " pod="metallb-system/metallb-operator-webhook-server-5b6f8f8d99-l255w"
Oct 08 20:56:21 crc kubenswrapper[4669]: I1008 20:56:21.395747 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dbe12456-371e-4274-911b-264b27260e4e-apiservice-cert\") pod \"metallb-operator-webhook-server-5b6f8f8d99-l255w\" (UID: \"dbe12456-371e-4274-911b-264b27260e4e\") " pod="metallb-system/metallb-operator-webhook-server-5b6f8f8d99-l255w"
Oct 08 20:56:21 crc kubenswrapper[4669]: I1008 20:56:21.409693 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlpsv\" (UniqueName: \"kubernetes.io/projected/dbe12456-371e-4274-911b-264b27260e4e-kube-api-access-tlpsv\") pod \"metallb-operator-webhook-server-5b6f8f8d99-l255w\" (UID: \"dbe12456-371e-4274-911b-264b27260e4e\") " pod="metallb-system/metallb-operator-webhook-server-5b6f8f8d99-l255w"
Oct 08 20:56:21 crc kubenswrapper[4669]: I1008 20:56:21.557079 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5b6f8f8d99-l255w"
Oct 08 20:56:21 crc kubenswrapper[4669]: I1008 20:56:21.742919 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6f6598d79f-8tfml"]
Oct 08 20:56:21 crc kubenswrapper[4669]: W1008 20:56:21.752013 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0304e30c_72d1_4544_9f05_fb7acc1c3c61.slice/crio-62938d1df841467b46f4239c4f3868eb3cbf2969425f8c964a433c93915f1a8d WatchSource:0}: Error finding container 62938d1df841467b46f4239c4f3868eb3cbf2969425f8c964a433c93915f1a8d: Status 404 returned error can't find the container with id 62938d1df841467b46f4239c4f3868eb3cbf2969425f8c964a433c93915f1a8d
Oct 08 20:56:21 crc kubenswrapper[4669]: I1008 20:56:21.960991 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6f6598d79f-8tfml" event={"ID":"0304e30c-72d1-4544-9f05-fb7acc1c3c61","Type":"ContainerStarted","Data":"62938d1df841467b46f4239c4f3868eb3cbf2969425f8c964a433c93915f1a8d"}
Oct 08 20:56:21 crc kubenswrapper[4669]: I1008 20:56:21.974402 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5b6f8f8d99-l255w"]
Oct 08 20:56:21 crc kubenswrapper[4669]: W1008 20:56:21.983815 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddbe12456_371e_4274_911b_264b27260e4e.slice/crio-7923e2ac4d9b5db5611b9949ba2ed8536d211201791b478a7713103c69141c5e WatchSource:0}: Error finding container 7923e2ac4d9b5db5611b9949ba2ed8536d211201791b478a7713103c69141c5e: Status 404 returned error can't find the container with id 7923e2ac4d9b5db5611b9949ba2ed8536d211201791b478a7713103c69141c5e
Oct 08 20:56:22 crc kubenswrapper[4669]: I1008 20:56:22.972349 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5b6f8f8d99-l255w" event={"ID":"dbe12456-371e-4274-911b-264b27260e4e","Type":"ContainerStarted","Data":"7923e2ac4d9b5db5611b9949ba2ed8536d211201791b478a7713103c69141c5e"}
Oct 08 20:56:27 crc kubenswrapper[4669]: I1008 20:56:27.007413 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6f6598d79f-8tfml" event={"ID":"0304e30c-72d1-4544-9f05-fb7acc1c3c61","Type":"ContainerStarted","Data":"276594c9fbb0e5cade3ccaea09a73986eb0260d873fc3abdf340bf6fb6157b6e"}
Oct 08 20:56:27 crc kubenswrapper[4669]: I1008 20:56:27.008026 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-6f6598d79f-8tfml"
Oct 08 20:56:27 crc kubenswrapper[4669]: I1008 20:56:27.033990 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6f6598d79f-8tfml" podStartSLOduration=2.823309672 podStartE2EDuration="7.033969115s" podCreationTimestamp="2025-10-08 20:56:20 +0000 UTC" firstStartedPulling="2025-10-08 20:56:21.755472195 +0000 UTC m=+701.448282868" lastFinishedPulling="2025-10-08 20:56:25.966131638 +0000 UTC m=+705.658942311" observedRunningTime="2025-10-08 20:56:27.031167218 +0000 UTC m=+706.723977881" watchObservedRunningTime="2025-10-08 20:56:27.033969115 +0000 UTC m=+706.726779788"
Oct 08 20:56:29 crc kubenswrapper[4669]: I1008 20:56:29.021518 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5b6f8f8d99-l255w" event={"ID":"dbe12456-371e-4274-911b-264b27260e4e","Type":"ContainerStarted","Data":"ef28ccb3a1e9e321371950a737e33f541278fe57861cf2756b0a505a03af7418"}
Oct 08 20:56:29 crc kubenswrapper[4669]: I1008 20:56:29.021980 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-5b6f8f8d99-l255w"
Oct 08 20:56:29 crc kubenswrapper[4669]: I1008 20:56:29.055302 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-5b6f8f8d99-l255w" podStartSLOduration=1.9795427829999999 podStartE2EDuration="8.055271702s" podCreationTimestamp="2025-10-08 20:56:21 +0000 UTC" firstStartedPulling="2025-10-08 20:56:21.988150449 +0000 UTC m=+701.680961122" lastFinishedPulling="2025-10-08 20:56:28.063879368 +0000 UTC m=+707.756690041" observedRunningTime="2025-10-08 20:56:29.05338984 +0000 UTC m=+708.746200553" watchObservedRunningTime="2025-10-08 20:56:29.055271702 +0000 UTC m=+708.748082415"
Oct 08 20:56:41 crc kubenswrapper[4669]: I1008 20:56:41.565148 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-5b6f8f8d99-l255w"
Oct 08 20:56:43 crc kubenswrapper[4669]: I1008 20:56:43.188140 4669 patch_prober.go:28] interesting pod/machine-config-daemon-hw2kf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 08 20:56:43 crc kubenswrapper[4669]: I1008 20:56:43.188495 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 08 20:56:59 crc kubenswrapper[4669]: I1008 20:56:59.456924 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dtqw9"]
Oct 08 20:56:59 crc kubenswrapper[4669]: I1008 20:56:59.457778 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-dtqw9" podUID="0858a203-42e8-4108-a0bd-48ba190bf420" containerName="controller-manager" containerID="cri-o://7560c91b54a48a9683aa74416d42255d4fcf428ec3d56f18577233efcf41e0cb" gracePeriod=30
Oct 08 20:56:59 crc kubenswrapper[4669]: I1008 20:56:59.551466 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4xg9t"]
Oct 08 20:56:59 crc kubenswrapper[4669]: I1008 20:56:59.551712 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4xg9t" podUID="2f8ab351-ab50-4289-8d0f-aae3ade74644" containerName="route-controller-manager" containerID="cri-o://9385c8489f9a56f4bcc4d4297cc60610f9749cb657ec6388b553392f3215dc46" gracePeriod=30
Oct 08 20:57:00 crc kubenswrapper[4669]: I1008 20:57:00.205155 4669 generic.go:334] "Generic (PLEG): container finished" podID="2f8ab351-ab50-4289-8d0f-aae3ade74644" containerID="9385c8489f9a56f4bcc4d4297cc60610f9749cb657ec6388b553392f3215dc46" exitCode=0
Oct 08 20:57:00 crc kubenswrapper[4669]: I1008 20:57:00.205573 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4xg9t" event={"ID":"2f8ab351-ab50-4289-8d0f-aae3ade74644","Type":"ContainerDied","Data":"9385c8489f9a56f4bcc4d4297cc60610f9749cb657ec6388b553392f3215dc46"}
Oct 08 20:57:00 crc kubenswrapper[4669]: I1008 20:57:00.207248 4669 generic.go:334] "Generic (PLEG): container finished" podID="0858a203-42e8-4108-a0bd-48ba190bf420" containerID="7560c91b54a48a9683aa74416d42255d4fcf428ec3d56f18577233efcf41e0cb" exitCode=0
Oct 08 20:57:00 crc kubenswrapper[4669]: I1008 20:57:00.207284 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-dtqw9" event={"ID":"0858a203-42e8-4108-a0bd-48ba190bf420","Type":"ContainerDied","Data":"7560c91b54a48a9683aa74416d42255d4fcf428ec3d56f18577233efcf41e0cb"}
Oct 08 20:57:00 crc kubenswrapper[4669]: I1008 20:57:00.320330 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-dtqw9"
Oct 08 20:57:00 crc kubenswrapper[4669]: I1008 20:57:00.392763 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0858a203-42e8-4108-a0bd-48ba190bf420-serving-cert\") pod \"0858a203-42e8-4108-a0bd-48ba190bf420\" (UID: \"0858a203-42e8-4108-a0bd-48ba190bf420\") "
Oct 08 20:57:00 crc kubenswrapper[4669]: I1008 20:57:00.392851 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0858a203-42e8-4108-a0bd-48ba190bf420-config\") pod \"0858a203-42e8-4108-a0bd-48ba190bf420\" (UID: \"0858a203-42e8-4108-a0bd-48ba190bf420\") "
Oct 08 20:57:00 crc kubenswrapper[4669]: I1008 20:57:00.392915 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2pxl\" (UniqueName: \"kubernetes.io/projected/0858a203-42e8-4108-a0bd-48ba190bf420-kube-api-access-z2pxl\") pod \"0858a203-42e8-4108-a0bd-48ba190bf420\" (UID: \"0858a203-42e8-4108-a0bd-48ba190bf420\") "
Oct 08 20:57:00 crc kubenswrapper[4669]: I1008 20:57:00.392955 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0858a203-42e8-4108-a0bd-48ba190bf420-proxy-ca-bundles\") pod \"0858a203-42e8-4108-a0bd-48ba190bf420\" (UID: \"0858a203-42e8-4108-a0bd-48ba190bf420\") "
Oct 08 20:57:00 crc kubenswrapper[4669]: I1008 20:57:00.392984 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0858a203-42e8-4108-a0bd-48ba190bf420-client-ca\") pod \"0858a203-42e8-4108-a0bd-48ba190bf420\" (UID: \"0858a203-42e8-4108-a0bd-48ba190bf420\") "
Oct 08 20:57:00 crc kubenswrapper[4669]: I1008 20:57:00.396047 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0858a203-42e8-4108-a0bd-48ba190bf420-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "0858a203-42e8-4108-a0bd-48ba190bf420" (UID: "0858a203-42e8-4108-a0bd-48ba190bf420"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 20:57:00 crc kubenswrapper[4669]: I1008 20:57:00.396155 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0858a203-42e8-4108-a0bd-48ba190bf420-config" (OuterVolumeSpecName: "config") pod "0858a203-42e8-4108-a0bd-48ba190bf420" (UID: "0858a203-42e8-4108-a0bd-48ba190bf420"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 20:57:00 crc kubenswrapper[4669]: I1008 20:57:00.396209 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0858a203-42e8-4108-a0bd-48ba190bf420-client-ca" (OuterVolumeSpecName: "client-ca") pod "0858a203-42e8-4108-a0bd-48ba190bf420" (UID: "0858a203-42e8-4108-a0bd-48ba190bf420"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 20:57:00 crc kubenswrapper[4669]: I1008 20:57:00.412292 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0858a203-42e8-4108-a0bd-48ba190bf420-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0858a203-42e8-4108-a0bd-48ba190bf420" (UID: "0858a203-42e8-4108-a0bd-48ba190bf420"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 20:57:00 crc kubenswrapper[4669]: I1008 20:57:00.412843 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0858a203-42e8-4108-a0bd-48ba190bf420-kube-api-access-z2pxl" (OuterVolumeSpecName: "kube-api-access-z2pxl") pod "0858a203-42e8-4108-a0bd-48ba190bf420" (UID: "0858a203-42e8-4108-a0bd-48ba190bf420"). InnerVolumeSpecName "kube-api-access-z2pxl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 20:57:00 crc kubenswrapper[4669]: I1008 20:57:00.453355 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4xg9t"
Oct 08 20:57:00 crc kubenswrapper[4669]: I1008 20:57:00.494071 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-st97g\" (UniqueName: \"kubernetes.io/projected/2f8ab351-ab50-4289-8d0f-aae3ade74644-kube-api-access-st97g\") pod \"2f8ab351-ab50-4289-8d0f-aae3ade74644\" (UID: \"2f8ab351-ab50-4289-8d0f-aae3ade74644\") "
Oct 08 20:57:00 crc kubenswrapper[4669]: I1008 20:57:00.494130 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f8ab351-ab50-4289-8d0f-aae3ade74644-client-ca\") pod \"2f8ab351-ab50-4289-8d0f-aae3ade74644\" (UID: \"2f8ab351-ab50-4289-8d0f-aae3ade74644\") "
Oct 08 20:57:00 crc kubenswrapper[4669]: I1008 20:57:00.494157 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f8ab351-ab50-4289-8d0f-aae3ade74644-config\") pod \"2f8ab351-ab50-4289-8d0f-aae3ade74644\" (UID: \"2f8ab351-ab50-4289-8d0f-aae3ade74644\") "
Oct 08 20:57:00 crc kubenswrapper[4669]: I1008 20:57:00.494201 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f8ab351-ab50-4289-8d0f-aae3ade74644-serving-cert\") pod \"2f8ab351-ab50-4289-8d0f-aae3ade74644\" (UID: \"2f8ab351-ab50-4289-8d0f-aae3ade74644\") "
Oct 08 20:57:00 crc kubenswrapper[4669]: I1008 20:57:00.494442 4669 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0858a203-42e8-4108-a0bd-48ba190bf420-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 08 20:57:00 crc kubenswrapper[4669]: I1008 20:57:00.494456 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0858a203-42e8-4108-a0bd-48ba190bf420-config\") on node \"crc\" DevicePath \"\""
Oct 08 20:57:00 crc kubenswrapper[4669]: I1008 20:57:00.494469 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2pxl\" (UniqueName: \"kubernetes.io/projected/0858a203-42e8-4108-a0bd-48ba190bf420-kube-api-access-z2pxl\") on node \"crc\" DevicePath \"\""
Oct 08 20:57:00 crc kubenswrapper[4669]: I1008 20:57:00.494482 4669 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0858a203-42e8-4108-a0bd-48ba190bf420-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Oct 08 20:57:00 crc kubenswrapper[4669]: I1008 20:57:00.494495 4669 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0858a203-42e8-4108-a0bd-48ba190bf420-client-ca\") on node \"crc\" DevicePath \"\""
Oct 08 20:57:00 crc kubenswrapper[4669]: I1008 20:57:00.495478 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f8ab351-ab50-4289-8d0f-aae3ade74644-client-ca" (OuterVolumeSpecName: "client-ca") pod "2f8ab351-ab50-4289-8d0f-aae3ade74644" (UID: "2f8ab351-ab50-4289-8d0f-aae3ade74644"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 20:57:00 crc kubenswrapper[4669]: I1008 20:57:00.496041 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f8ab351-ab50-4289-8d0f-aae3ade74644-config" (OuterVolumeSpecName: "config") pod "2f8ab351-ab50-4289-8d0f-aae3ade74644" (UID: "2f8ab351-ab50-4289-8d0f-aae3ade74644"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 20:57:00 crc kubenswrapper[4669]: I1008 20:57:00.498157 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f8ab351-ab50-4289-8d0f-aae3ade74644-kube-api-access-st97g" (OuterVolumeSpecName: "kube-api-access-st97g") pod "2f8ab351-ab50-4289-8d0f-aae3ade74644" (UID: "2f8ab351-ab50-4289-8d0f-aae3ade74644"). InnerVolumeSpecName "kube-api-access-st97g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 20:57:00 crc kubenswrapper[4669]: I1008 20:57:00.500742 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f8ab351-ab50-4289-8d0f-aae3ade74644-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2f8ab351-ab50-4289-8d0f-aae3ade74644" (UID: "2f8ab351-ab50-4289-8d0f-aae3ade74644"). InnerVolumeSpecName "serving-cert".
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:57:00 crc kubenswrapper[4669]: I1008 20:57:00.595713 4669 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f8ab351-ab50-4289-8d0f-aae3ade74644-client-ca\") on node \"crc\" DevicePath \"\"" Oct 08 20:57:00 crc kubenswrapper[4669]: I1008 20:57:00.595758 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f8ab351-ab50-4289-8d0f-aae3ade74644-config\") on node \"crc\" DevicePath \"\"" Oct 08 20:57:00 crc kubenswrapper[4669]: I1008 20:57:00.595768 4669 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f8ab351-ab50-4289-8d0f-aae3ade74644-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 20:57:00 crc kubenswrapper[4669]: I1008 20:57:00.595777 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-st97g\" (UniqueName: \"kubernetes.io/projected/2f8ab351-ab50-4289-8d0f-aae3ade74644-kube-api-access-st97g\") on node \"crc\" DevicePath \"\"" Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.127556 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5cc4dcdb6d-k98th"] Oct 08 20:57:01 crc kubenswrapper[4669]: E1008 20:57:01.128295 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f8ab351-ab50-4289-8d0f-aae3ade74644" containerName="route-controller-manager" Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.128327 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f8ab351-ab50-4289-8d0f-aae3ade74644" containerName="route-controller-manager" Oct 08 20:57:01 crc kubenswrapper[4669]: E1008 20:57:01.128353 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0858a203-42e8-4108-a0bd-48ba190bf420" containerName="controller-manager" Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.128367 4669 
state_mem.go:107] "Deleted CPUSet assignment" podUID="0858a203-42e8-4108-a0bd-48ba190bf420" containerName="controller-manager" Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.128630 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f8ab351-ab50-4289-8d0f-aae3ade74644" containerName="route-controller-manager" Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.128668 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="0858a203-42e8-4108-a0bd-48ba190bf420" containerName="controller-manager" Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.129341 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5cc4dcdb6d-k98th" Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.131631 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86b9ff5c44-6fw5w"] Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.132463 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86b9ff5c44-6fw5w" Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.148322 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5cc4dcdb6d-k98th"] Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.151749 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86b9ff5c44-6fw5w"] Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.210317 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffbf4522-d8d9-4f90-8f26-695ae8779aec-serving-cert\") pod \"controller-manager-5cc4dcdb6d-k98th\" (UID: \"ffbf4522-d8d9-4f90-8f26-695ae8779aec\") " pod="openshift-controller-manager/controller-manager-5cc4dcdb6d-k98th" Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.210410 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9fac04b-e2db-4d4a-9e6b-e0a0eb832731-config\") pod \"route-controller-manager-86b9ff5c44-6fw5w\" (UID: \"f9fac04b-e2db-4d4a-9e6b-e0a0eb832731\") " pod="openshift-route-controller-manager/route-controller-manager-86b9ff5c44-6fw5w" Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.210617 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f9fac04b-e2db-4d4a-9e6b-e0a0eb832731-client-ca\") pod \"route-controller-manager-86b9ff5c44-6fw5w\" (UID: \"f9fac04b-e2db-4d4a-9e6b-e0a0eb832731\") " pod="openshift-route-controller-manager/route-controller-manager-86b9ff5c44-6fw5w" Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.213244 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/ffbf4522-d8d9-4f90-8f26-695ae8779aec-config\") pod \"controller-manager-5cc4dcdb6d-k98th\" (UID: \"ffbf4522-d8d9-4f90-8f26-695ae8779aec\") " pod="openshift-controller-manager/controller-manager-5cc4dcdb6d-k98th" Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.213343 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg55h\" (UniqueName: \"kubernetes.io/projected/f9fac04b-e2db-4d4a-9e6b-e0a0eb832731-kube-api-access-wg55h\") pod \"route-controller-manager-86b9ff5c44-6fw5w\" (UID: \"f9fac04b-e2db-4d4a-9e6b-e0a0eb832731\") " pod="openshift-route-controller-manager/route-controller-manager-86b9ff5c44-6fw5w" Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.213416 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9fac04b-e2db-4d4a-9e6b-e0a0eb832731-serving-cert\") pod \"route-controller-manager-86b9ff5c44-6fw5w\" (UID: \"f9fac04b-e2db-4d4a-9e6b-e0a0eb832731\") " pod="openshift-route-controller-manager/route-controller-manager-86b9ff5c44-6fw5w" Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.213505 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ffbf4522-d8d9-4f90-8f26-695ae8779aec-client-ca\") pod \"controller-manager-5cc4dcdb6d-k98th\" (UID: \"ffbf4522-d8d9-4f90-8f26-695ae8779aec\") " pod="openshift-controller-manager/controller-manager-5cc4dcdb6d-k98th" Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.213614 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ffbf4522-d8d9-4f90-8f26-695ae8779aec-proxy-ca-bundles\") pod \"controller-manager-5cc4dcdb6d-k98th\" (UID: \"ffbf4522-d8d9-4f90-8f26-695ae8779aec\") " 
pod="openshift-controller-manager/controller-manager-5cc4dcdb6d-k98th" Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.213692 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cj5kt\" (UniqueName: \"kubernetes.io/projected/ffbf4522-d8d9-4f90-8f26-695ae8779aec-kube-api-access-cj5kt\") pod \"controller-manager-5cc4dcdb6d-k98th\" (UID: \"ffbf4522-d8d9-4f90-8f26-695ae8779aec\") " pod="openshift-controller-manager/controller-manager-5cc4dcdb6d-k98th" Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.218283 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-dtqw9" event={"ID":"0858a203-42e8-4108-a0bd-48ba190bf420","Type":"ContainerDied","Data":"a100fb722eaa96761f6d3ec375efe2257b0f6fa9715c17d4be90cb48925c92ea"} Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.218354 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-dtqw9" Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.218361 4669 scope.go:117] "RemoveContainer" containerID="7560c91b54a48a9683aa74416d42255d4fcf428ec3d56f18577233efcf41e0cb" Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.220058 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4xg9t" event={"ID":"2f8ab351-ab50-4289-8d0f-aae3ade74644","Type":"ContainerDied","Data":"48c0792bc5aa5f2942c84f69ac8e7fd1856442b2ccda97ef080ce73dcc2cda33"} Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.220171 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4xg9t" Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.244445 4669 scope.go:117] "RemoveContainer" containerID="9385c8489f9a56f4bcc4d4297cc60610f9749cb657ec6388b553392f3215dc46" Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.252002 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dtqw9"] Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.259043 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dtqw9"] Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.290753 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4xg9t"] Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.296816 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4xg9t"] Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.314440 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ffbf4522-d8d9-4f90-8f26-695ae8779aec-client-ca\") pod \"controller-manager-5cc4dcdb6d-k98th\" (UID: \"ffbf4522-d8d9-4f90-8f26-695ae8779aec\") " pod="openshift-controller-manager/controller-manager-5cc4dcdb6d-k98th" Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.314488 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ffbf4522-d8d9-4f90-8f26-695ae8779aec-proxy-ca-bundles\") pod \"controller-manager-5cc4dcdb6d-k98th\" (UID: \"ffbf4522-d8d9-4f90-8f26-695ae8779aec\") " pod="openshift-controller-manager/controller-manager-5cc4dcdb6d-k98th" Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.314509 4669 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cj5kt\" (UniqueName: \"kubernetes.io/projected/ffbf4522-d8d9-4f90-8f26-695ae8779aec-kube-api-access-cj5kt\") pod \"controller-manager-5cc4dcdb6d-k98th\" (UID: \"ffbf4522-d8d9-4f90-8f26-695ae8779aec\") " pod="openshift-controller-manager/controller-manager-5cc4dcdb6d-k98th" Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.314548 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffbf4522-d8d9-4f90-8f26-695ae8779aec-serving-cert\") pod \"controller-manager-5cc4dcdb6d-k98th\" (UID: \"ffbf4522-d8d9-4f90-8f26-695ae8779aec\") " pod="openshift-controller-manager/controller-manager-5cc4dcdb6d-k98th" Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.314569 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9fac04b-e2db-4d4a-9e6b-e0a0eb832731-config\") pod \"route-controller-manager-86b9ff5c44-6fw5w\" (UID: \"f9fac04b-e2db-4d4a-9e6b-e0a0eb832731\") " pod="openshift-route-controller-manager/route-controller-manager-86b9ff5c44-6fw5w" Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.314629 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f9fac04b-e2db-4d4a-9e6b-e0a0eb832731-client-ca\") pod \"route-controller-manager-86b9ff5c44-6fw5w\" (UID: \"f9fac04b-e2db-4d4a-9e6b-e0a0eb832731\") " pod="openshift-route-controller-manager/route-controller-manager-86b9ff5c44-6fw5w" Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.314655 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffbf4522-d8d9-4f90-8f26-695ae8779aec-config\") pod \"controller-manager-5cc4dcdb6d-k98th\" (UID: \"ffbf4522-d8d9-4f90-8f26-695ae8779aec\") " 
pod="openshift-controller-manager/controller-manager-5cc4dcdb6d-k98th" Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.314693 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wg55h\" (UniqueName: \"kubernetes.io/projected/f9fac04b-e2db-4d4a-9e6b-e0a0eb832731-kube-api-access-wg55h\") pod \"route-controller-manager-86b9ff5c44-6fw5w\" (UID: \"f9fac04b-e2db-4d4a-9e6b-e0a0eb832731\") " pod="openshift-route-controller-manager/route-controller-manager-86b9ff5c44-6fw5w" Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.314714 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9fac04b-e2db-4d4a-9e6b-e0a0eb832731-serving-cert\") pod \"route-controller-manager-86b9ff5c44-6fw5w\" (UID: \"f9fac04b-e2db-4d4a-9e6b-e0a0eb832731\") " pod="openshift-route-controller-manager/route-controller-manager-86b9ff5c44-6fw5w" Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.315468 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f9fac04b-e2db-4d4a-9e6b-e0a0eb832731-client-ca\") pod \"route-controller-manager-86b9ff5c44-6fw5w\" (UID: \"f9fac04b-e2db-4d4a-9e6b-e0a0eb832731\") " pod="openshift-route-controller-manager/route-controller-manager-86b9ff5c44-6fw5w" Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.315606 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ffbf4522-d8d9-4f90-8f26-695ae8779aec-client-ca\") pod \"controller-manager-5cc4dcdb6d-k98th\" (UID: \"ffbf4522-d8d9-4f90-8f26-695ae8779aec\") " pod="openshift-controller-manager/controller-manager-5cc4dcdb6d-k98th" Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.316038 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ffbf4522-d8d9-4f90-8f26-695ae8779aec-config\") pod \"controller-manager-5cc4dcdb6d-k98th\" (UID: \"ffbf4522-d8d9-4f90-8f26-695ae8779aec\") " pod="openshift-controller-manager/controller-manager-5cc4dcdb6d-k98th" Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.318162 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9fac04b-e2db-4d4a-9e6b-e0a0eb832731-config\") pod \"route-controller-manager-86b9ff5c44-6fw5w\" (UID: \"f9fac04b-e2db-4d4a-9e6b-e0a0eb832731\") " pod="openshift-route-controller-manager/route-controller-manager-86b9ff5c44-6fw5w" Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.318585 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6f6598d79f-8tfml" Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.318854 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ffbf4522-d8d9-4f90-8f26-695ae8779aec-proxy-ca-bundles\") pod \"controller-manager-5cc4dcdb6d-k98th\" (UID: \"ffbf4522-d8d9-4f90-8f26-695ae8779aec\") " pod="openshift-controller-manager/controller-manager-5cc4dcdb6d-k98th" Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.327492 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffbf4522-d8d9-4f90-8f26-695ae8779aec-serving-cert\") pod \"controller-manager-5cc4dcdb6d-k98th\" (UID: \"ffbf4522-d8d9-4f90-8f26-695ae8779aec\") " pod="openshift-controller-manager/controller-manager-5cc4dcdb6d-k98th" Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.332431 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9fac04b-e2db-4d4a-9e6b-e0a0eb832731-serving-cert\") pod \"route-controller-manager-86b9ff5c44-6fw5w\" (UID: 
\"f9fac04b-e2db-4d4a-9e6b-e0a0eb832731\") " pod="openshift-route-controller-manager/route-controller-manager-86b9ff5c44-6fw5w" Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.335256 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg55h\" (UniqueName: \"kubernetes.io/projected/f9fac04b-e2db-4d4a-9e6b-e0a0eb832731-kube-api-access-wg55h\") pod \"route-controller-manager-86b9ff5c44-6fw5w\" (UID: \"f9fac04b-e2db-4d4a-9e6b-e0a0eb832731\") " pod="openshift-route-controller-manager/route-controller-manager-86b9ff5c44-6fw5w" Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.336258 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cj5kt\" (UniqueName: \"kubernetes.io/projected/ffbf4522-d8d9-4f90-8f26-695ae8779aec-kube-api-access-cj5kt\") pod \"controller-manager-5cc4dcdb6d-k98th\" (UID: \"ffbf4522-d8d9-4f90-8f26-695ae8779aec\") " pod="openshift-controller-manager/controller-manager-5cc4dcdb6d-k98th" Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.348294 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0858a203-42e8-4108-a0bd-48ba190bf420" path="/var/lib/kubelet/pods/0858a203-42e8-4108-a0bd-48ba190bf420/volumes" Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.349014 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f8ab351-ab50-4289-8d0f-aae3ade74644" path="/var/lib/kubelet/pods/2f8ab351-ab50-4289-8d0f-aae3ade74644/volumes" Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.464928 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5cc4dcdb6d-k98th" Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.490510 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86b9ff5c44-6fw5w" Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.689998 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5cc4dcdb6d-k98th"] Oct 08 20:57:01 crc kubenswrapper[4669]: W1008 20:57:01.699293 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffbf4522_d8d9_4f90_8f26_695ae8779aec.slice/crio-54ecbcb9534bc25e3da85378caa2bc6bc98d717ce49f59a2cd510b0fcc9f7623 WatchSource:0}: Error finding container 54ecbcb9534bc25e3da85378caa2bc6bc98d717ce49f59a2cd510b0fcc9f7623: Status 404 returned error can't find the container with id 54ecbcb9534bc25e3da85378caa2bc6bc98d717ce49f59a2cd510b0fcc9f7623 Oct 08 20:57:01 crc kubenswrapper[4669]: I1008 20:57:01.771159 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86b9ff5c44-6fw5w"] Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.034928 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-jq9jv"] Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.038102 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-jq9jv" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.039057 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-gwhr7"] Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.039713 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-gwhr7" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.040113 4669 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.042591 4669 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.042909 4669 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-95ptt" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.052789 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.057420 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-gwhr7"] Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.124103 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1de0da7b-1591-4af2-bac9-241428020fe9-metrics-certs\") pod \"frr-k8s-jq9jv\" (UID: \"1de0da7b-1591-4af2-bac9-241428020fe9\") " pod="metallb-system/frr-k8s-jq9jv" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.124146 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npnvd\" (UniqueName: \"kubernetes.io/projected/c74dd47d-12ee-4626-a02f-e8dc07f26791-kube-api-access-npnvd\") pod \"frr-k8s-webhook-server-64bf5d555-gwhr7\" (UID: \"c74dd47d-12ee-4626-a02f-e8dc07f26791\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-gwhr7" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.124348 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: 
\"kubernetes.io/empty-dir/1de0da7b-1591-4af2-bac9-241428020fe9-metrics\") pod \"frr-k8s-jq9jv\" (UID: \"1de0da7b-1591-4af2-bac9-241428020fe9\") " pod="metallb-system/frr-k8s-jq9jv" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.124574 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvvhr\" (UniqueName: \"kubernetes.io/projected/1de0da7b-1591-4af2-bac9-241428020fe9-kube-api-access-gvvhr\") pod \"frr-k8s-jq9jv\" (UID: \"1de0da7b-1591-4af2-bac9-241428020fe9\") " pod="metallb-system/frr-k8s-jq9jv" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.124637 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/1de0da7b-1591-4af2-bac9-241428020fe9-frr-sockets\") pod \"frr-k8s-jq9jv\" (UID: \"1de0da7b-1591-4af2-bac9-241428020fe9\") " pod="metallb-system/frr-k8s-jq9jv" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.124666 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/1de0da7b-1591-4af2-bac9-241428020fe9-frr-conf\") pod \"frr-k8s-jq9jv\" (UID: \"1de0da7b-1591-4af2-bac9-241428020fe9\") " pod="metallb-system/frr-k8s-jq9jv" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.124763 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/1de0da7b-1591-4af2-bac9-241428020fe9-reloader\") pod \"frr-k8s-jq9jv\" (UID: \"1de0da7b-1591-4af2-bac9-241428020fe9\") " pod="metallb-system/frr-k8s-jq9jv" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.124809 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c74dd47d-12ee-4626-a02f-e8dc07f26791-cert\") pod \"frr-k8s-webhook-server-64bf5d555-gwhr7\" 
(UID: \"c74dd47d-12ee-4626-a02f-e8dc07f26791\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-gwhr7" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.124849 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/1de0da7b-1591-4af2-bac9-241428020fe9-frr-startup\") pod \"frr-k8s-jq9jv\" (UID: \"1de0da7b-1591-4af2-bac9-241428020fe9\") " pod="metallb-system/frr-k8s-jq9jv" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.198971 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-x4jbl"] Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.199761 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-x4jbl" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.203305 4669 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.203361 4669 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-vwcqg" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.204440 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.205074 4669 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.225714 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28ph9\" (UniqueName: \"kubernetes.io/projected/91c83ac7-3c70-4fca-85b1-9ee4d9dd4568-kube-api-access-28ph9\") pod \"speaker-x4jbl\" (UID: \"91c83ac7-3c70-4fca-85b1-9ee4d9dd4568\") " pod="metallb-system/speaker-x4jbl" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.225777 4669 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/1de0da7b-1591-4af2-bac9-241428020fe9-reloader\") pod \"frr-k8s-jq9jv\" (UID: \"1de0da7b-1591-4af2-bac9-241428020fe9\") " pod="metallb-system/frr-k8s-jq9jv" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.225813 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c74dd47d-12ee-4626-a02f-e8dc07f26791-cert\") pod \"frr-k8s-webhook-server-64bf5d555-gwhr7\" (UID: \"c74dd47d-12ee-4626-a02f-e8dc07f26791\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-gwhr7" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.225848 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/1de0da7b-1591-4af2-bac9-241428020fe9-frr-startup\") pod \"frr-k8s-jq9jv\" (UID: \"1de0da7b-1591-4af2-bac9-241428020fe9\") " pod="metallb-system/frr-k8s-jq9jv" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.225897 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1de0da7b-1591-4af2-bac9-241428020fe9-metrics-certs\") pod \"frr-k8s-jq9jv\" (UID: \"1de0da7b-1591-4af2-bac9-241428020fe9\") " pod="metallb-system/frr-k8s-jq9jv" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.225925 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npnvd\" (UniqueName: \"kubernetes.io/projected/c74dd47d-12ee-4626-a02f-e8dc07f26791-kube-api-access-npnvd\") pod \"frr-k8s-webhook-server-64bf5d555-gwhr7\" (UID: \"c74dd47d-12ee-4626-a02f-e8dc07f26791\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-gwhr7" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.225952 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: 
\"kubernetes.io/empty-dir/1de0da7b-1591-4af2-bac9-241428020fe9-metrics\") pod \"frr-k8s-jq9jv\" (UID: \"1de0da7b-1591-4af2-bac9-241428020fe9\") " pod="metallb-system/frr-k8s-jq9jv" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.226010 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/91c83ac7-3c70-4fca-85b1-9ee4d9dd4568-metrics-certs\") pod \"speaker-x4jbl\" (UID: \"91c83ac7-3c70-4fca-85b1-9ee4d9dd4568\") " pod="metallb-system/speaker-x4jbl" Oct 08 20:57:02 crc kubenswrapper[4669]: E1008 20:57:02.226037 4669 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Oct 08 20:57:02 crc kubenswrapper[4669]: E1008 20:57:02.226090 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1de0da7b-1591-4af2-bac9-241428020fe9-metrics-certs podName:1de0da7b-1591-4af2-bac9-241428020fe9 nodeName:}" failed. No retries permitted until 2025-10-08 20:57:02.726072565 +0000 UTC m=+742.418883238 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1de0da7b-1591-4af2-bac9-241428020fe9-metrics-certs") pod "frr-k8s-jq9jv" (UID: "1de0da7b-1591-4af2-bac9-241428020fe9") : secret "frr-k8s-certs-secret" not found Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.226045 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvvhr\" (UniqueName: \"kubernetes.io/projected/1de0da7b-1591-4af2-bac9-241428020fe9-kube-api-access-gvvhr\") pod \"frr-k8s-jq9jv\" (UID: \"1de0da7b-1591-4af2-bac9-241428020fe9\") " pod="metallb-system/frr-k8s-jq9jv" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.226149 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/91c83ac7-3c70-4fca-85b1-9ee4d9dd4568-memberlist\") pod \"speaker-x4jbl\" (UID: \"91c83ac7-3c70-4fca-85b1-9ee4d9dd4568\") " pod="metallb-system/speaker-x4jbl" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.226179 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/91c83ac7-3c70-4fca-85b1-9ee4d9dd4568-metallb-excludel2\") pod \"speaker-x4jbl\" (UID: \"91c83ac7-3c70-4fca-85b1-9ee4d9dd4568\") " pod="metallb-system/speaker-x4jbl" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.226199 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/1de0da7b-1591-4af2-bac9-241428020fe9-frr-sockets\") pod \"frr-k8s-jq9jv\" (UID: \"1de0da7b-1591-4af2-bac9-241428020fe9\") " pod="metallb-system/frr-k8s-jq9jv" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.226216 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/1de0da7b-1591-4af2-bac9-241428020fe9-frr-conf\") 
pod \"frr-k8s-jq9jv\" (UID: \"1de0da7b-1591-4af2-bac9-241428020fe9\") " pod="metallb-system/frr-k8s-jq9jv" Oct 08 20:57:02 crc kubenswrapper[4669]: E1008 20:57:02.226352 4669 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.226385 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/1de0da7b-1591-4af2-bac9-241428020fe9-reloader\") pod \"frr-k8s-jq9jv\" (UID: \"1de0da7b-1591-4af2-bac9-241428020fe9\") " pod="metallb-system/frr-k8s-jq9jv" Oct 08 20:57:02 crc kubenswrapper[4669]: E1008 20:57:02.226401 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c74dd47d-12ee-4626-a02f-e8dc07f26791-cert podName:c74dd47d-12ee-4626-a02f-e8dc07f26791 nodeName:}" failed. No retries permitted until 2025-10-08 20:57:02.726386243 +0000 UTC m=+742.419196916 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c74dd47d-12ee-4626-a02f-e8dc07f26791-cert") pod "frr-k8s-webhook-server-64bf5d555-gwhr7" (UID: "c74dd47d-12ee-4626-a02f-e8dc07f26791") : secret "frr-k8s-webhook-server-cert" not found Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.226385 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/1de0da7b-1591-4af2-bac9-241428020fe9-metrics\") pod \"frr-k8s-jq9jv\" (UID: \"1de0da7b-1591-4af2-bac9-241428020fe9\") " pod="metallb-system/frr-k8s-jq9jv" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.226577 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/1de0da7b-1591-4af2-bac9-241428020fe9-frr-conf\") pod \"frr-k8s-jq9jv\" (UID: \"1de0da7b-1591-4af2-bac9-241428020fe9\") " pod="metallb-system/frr-k8s-jq9jv" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.226678 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/1de0da7b-1591-4af2-bac9-241428020fe9-frr-sockets\") pod \"frr-k8s-jq9jv\" (UID: \"1de0da7b-1591-4af2-bac9-241428020fe9\") " pod="metallb-system/frr-k8s-jq9jv" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.226721 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86b9ff5c44-6fw5w" event={"ID":"f9fac04b-e2db-4d4a-9e6b-e0a0eb832731","Type":"ContainerStarted","Data":"e2164f16ce0714dc0b39ae70c3b813280765c826fde763629ff99bffdc137763"} Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.226771 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86b9ff5c44-6fw5w" 
event={"ID":"f9fac04b-e2db-4d4a-9e6b-e0a0eb832731","Type":"ContainerStarted","Data":"9d9f5c2323ae9ebca6bbf1783e60400d484d9f604c0f10b83c4d6bfd3ba39ad6"} Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.226987 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-86b9ff5c44-6fw5w" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.227011 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/1de0da7b-1591-4af2-bac9-241428020fe9-frr-startup\") pod \"frr-k8s-jq9jv\" (UID: \"1de0da7b-1591-4af2-bac9-241428020fe9\") " pod="metallb-system/frr-k8s-jq9jv" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.228647 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5cc4dcdb6d-k98th" event={"ID":"ffbf4522-d8d9-4f90-8f26-695ae8779aec","Type":"ContainerStarted","Data":"93704754199e3c5263079e7972af723e98b800bed380219c7d041174dd21f52a"} Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.228691 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5cc4dcdb6d-k98th" event={"ID":"ffbf4522-d8d9-4f90-8f26-695ae8779aec","Type":"ContainerStarted","Data":"54ecbcb9534bc25e3da85378caa2bc6bc98d717ce49f59a2cd510b0fcc9f7623"} Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.228836 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5cc4dcdb6d-k98th" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.261054 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-jlbrt"] Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.261950 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-jlbrt" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.270807 4669 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.279373 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvvhr\" (UniqueName: \"kubernetes.io/projected/1de0da7b-1591-4af2-bac9-241428020fe9-kube-api-access-gvvhr\") pod \"frr-k8s-jq9jv\" (UID: \"1de0da7b-1591-4af2-bac9-241428020fe9\") " pod="metallb-system/frr-k8s-jq9jv" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.288650 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-jlbrt"] Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.289202 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npnvd\" (UniqueName: \"kubernetes.io/projected/c74dd47d-12ee-4626-a02f-e8dc07f26791-kube-api-access-npnvd\") pod \"frr-k8s-webhook-server-64bf5d555-gwhr7\" (UID: \"c74dd47d-12ee-4626-a02f-e8dc07f26791\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-gwhr7" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.318608 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-86b9ff5c44-6fw5w" podStartSLOduration=3.3185937770000002 podStartE2EDuration="3.318593777s" podCreationTimestamp="2025-10-08 20:56:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:57:02.317892138 +0000 UTC m=+742.010702811" watchObservedRunningTime="2025-10-08 20:57:02.318593777 +0000 UTC m=+742.011404450" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.319762 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-controller-manager/controller-manager-5cc4dcdb6d-k98th" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.326979 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/91c83ac7-3c70-4fca-85b1-9ee4d9dd4568-memberlist\") pod \"speaker-x4jbl\" (UID: \"91c83ac7-3c70-4fca-85b1-9ee4d9dd4568\") " pod="metallb-system/speaker-x4jbl" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.327018 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/91c83ac7-3c70-4fca-85b1-9ee4d9dd4568-metallb-excludel2\") pod \"speaker-x4jbl\" (UID: \"91c83ac7-3c70-4fca-85b1-9ee4d9dd4568\") " pod="metallb-system/speaker-x4jbl" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.327084 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28ph9\" (UniqueName: \"kubernetes.io/projected/91c83ac7-3c70-4fca-85b1-9ee4d9dd4568-kube-api-access-28ph9\") pod \"speaker-x4jbl\" (UID: \"91c83ac7-3c70-4fca-85b1-9ee4d9dd4568\") " pod="metallb-system/speaker-x4jbl" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.327105 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80c87dc9-d3cb-455c-b0bc-c9ae5a0cead6-cert\") pod \"controller-68d546b9d8-jlbrt\" (UID: \"80c87dc9-d3cb-455c-b0bc-c9ae5a0cead6\") " pod="metallb-system/controller-68d546b9d8-jlbrt" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.327142 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf922\" (UniqueName: \"kubernetes.io/projected/80c87dc9-d3cb-455c-b0bc-c9ae5a0cead6-kube-api-access-zf922\") pod \"controller-68d546b9d8-jlbrt\" (UID: \"80c87dc9-d3cb-455c-b0bc-c9ae5a0cead6\") " pod="metallb-system/controller-68d546b9d8-jlbrt" Oct 08 
20:57:02 crc kubenswrapper[4669]: E1008 20:57:02.327155 4669 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.327176 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80c87dc9-d3cb-455c-b0bc-c9ae5a0cead6-metrics-certs\") pod \"controller-68d546b9d8-jlbrt\" (UID: \"80c87dc9-d3cb-455c-b0bc-c9ae5a0cead6\") " pod="metallb-system/controller-68d546b9d8-jlbrt" Oct 08 20:57:02 crc kubenswrapper[4669]: E1008 20:57:02.327238 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91c83ac7-3c70-4fca-85b1-9ee4d9dd4568-memberlist podName:91c83ac7-3c70-4fca-85b1-9ee4d9dd4568 nodeName:}" failed. No retries permitted until 2025-10-08 20:57:02.827209084 +0000 UTC m=+742.520019757 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/91c83ac7-3c70-4fca-85b1-9ee4d9dd4568-memberlist") pod "speaker-x4jbl" (UID: "91c83ac7-3c70-4fca-85b1-9ee4d9dd4568") : secret "metallb-memberlist" not found Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.327332 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/91c83ac7-3c70-4fca-85b1-9ee4d9dd4568-metrics-certs\") pod \"speaker-x4jbl\" (UID: \"91c83ac7-3c70-4fca-85b1-9ee4d9dd4568\") " pod="metallb-system/speaker-x4jbl" Oct 08 20:57:02 crc kubenswrapper[4669]: E1008 20:57:02.327451 4669 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Oct 08 20:57:02 crc kubenswrapper[4669]: E1008 20:57:02.327485 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91c83ac7-3c70-4fca-85b1-9ee4d9dd4568-metrics-certs podName:91c83ac7-3c70-4fca-85b1-9ee4d9dd4568 nodeName:}" failed. 
No retries permitted until 2025-10-08 20:57:02.827476631 +0000 UTC m=+742.520287294 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/91c83ac7-3c70-4fca-85b1-9ee4d9dd4568-metrics-certs") pod "speaker-x4jbl" (UID: "91c83ac7-3c70-4fca-85b1-9ee4d9dd4568") : secret "speaker-certs-secret" not found Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.327885 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/91c83ac7-3c70-4fca-85b1-9ee4d9dd4568-metallb-excludel2\") pod \"speaker-x4jbl\" (UID: \"91c83ac7-3c70-4fca-85b1-9ee4d9dd4568\") " pod="metallb-system/speaker-x4jbl" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.344280 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5cc4dcdb6d-k98th" podStartSLOduration=3.344240942 podStartE2EDuration="3.344240942s" podCreationTimestamp="2025-10-08 20:56:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:57:02.339300386 +0000 UTC m=+742.032111059" watchObservedRunningTime="2025-10-08 20:57:02.344240942 +0000 UTC m=+742.037051615" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.358963 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28ph9\" (UniqueName: \"kubernetes.io/projected/91c83ac7-3c70-4fca-85b1-9ee4d9dd4568-kube-api-access-28ph9\") pod \"speaker-x4jbl\" (UID: \"91c83ac7-3c70-4fca-85b1-9ee4d9dd4568\") " pod="metallb-system/speaker-x4jbl" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.428318 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80c87dc9-d3cb-455c-b0bc-c9ae5a0cead6-cert\") pod \"controller-68d546b9d8-jlbrt\" (UID: 
\"80c87dc9-d3cb-455c-b0bc-c9ae5a0cead6\") " pod="metallb-system/controller-68d546b9d8-jlbrt" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.428381 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zf922\" (UniqueName: \"kubernetes.io/projected/80c87dc9-d3cb-455c-b0bc-c9ae5a0cead6-kube-api-access-zf922\") pod \"controller-68d546b9d8-jlbrt\" (UID: \"80c87dc9-d3cb-455c-b0bc-c9ae5a0cead6\") " pod="metallb-system/controller-68d546b9d8-jlbrt" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.428423 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80c87dc9-d3cb-455c-b0bc-c9ae5a0cead6-metrics-certs\") pod \"controller-68d546b9d8-jlbrt\" (UID: \"80c87dc9-d3cb-455c-b0bc-c9ae5a0cead6\") " pod="metallb-system/controller-68d546b9d8-jlbrt" Oct 08 20:57:02 crc kubenswrapper[4669]: E1008 20:57:02.429361 4669 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Oct 08 20:57:02 crc kubenswrapper[4669]: E1008 20:57:02.429440 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80c87dc9-d3cb-455c-b0bc-c9ae5a0cead6-metrics-certs podName:80c87dc9-d3cb-455c-b0bc-c9ae5a0cead6 nodeName:}" failed. No retries permitted until 2025-10-08 20:57:02.929421864 +0000 UTC m=+742.622232527 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/80c87dc9-d3cb-455c-b0bc-c9ae5a0cead6-metrics-certs") pod "controller-68d546b9d8-jlbrt" (UID: "80c87dc9-d3cb-455c-b0bc-c9ae5a0cead6") : secret "controller-certs-secret" not found Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.431513 4669 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.446957 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80c87dc9-d3cb-455c-b0bc-c9ae5a0cead6-cert\") pod \"controller-68d546b9d8-jlbrt\" (UID: \"80c87dc9-d3cb-455c-b0bc-c9ae5a0cead6\") " pod="metallb-system/controller-68d546b9d8-jlbrt" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.462121 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf922\" (UniqueName: \"kubernetes.io/projected/80c87dc9-d3cb-455c-b0bc-c9ae5a0cead6-kube-api-access-zf922\") pod \"controller-68d546b9d8-jlbrt\" (UID: \"80c87dc9-d3cb-455c-b0bc-c9ae5a0cead6\") " pod="metallb-system/controller-68d546b9d8-jlbrt" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.635090 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-86b9ff5c44-6fw5w" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.732764 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c74dd47d-12ee-4626-a02f-e8dc07f26791-cert\") pod \"frr-k8s-webhook-server-64bf5d555-gwhr7\" (UID: \"c74dd47d-12ee-4626-a02f-e8dc07f26791\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-gwhr7" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.732818 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/1de0da7b-1591-4af2-bac9-241428020fe9-metrics-certs\") pod \"frr-k8s-jq9jv\" (UID: \"1de0da7b-1591-4af2-bac9-241428020fe9\") " pod="metallb-system/frr-k8s-jq9jv" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.736784 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c74dd47d-12ee-4626-a02f-e8dc07f26791-cert\") pod \"frr-k8s-webhook-server-64bf5d555-gwhr7\" (UID: \"c74dd47d-12ee-4626-a02f-e8dc07f26791\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-gwhr7" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.736870 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1de0da7b-1591-4af2-bac9-241428020fe9-metrics-certs\") pod \"frr-k8s-jq9jv\" (UID: \"1de0da7b-1591-4af2-bac9-241428020fe9\") " pod="metallb-system/frr-k8s-jq9jv" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.833807 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/91c83ac7-3c70-4fca-85b1-9ee4d9dd4568-metrics-certs\") pod \"speaker-x4jbl\" (UID: \"91c83ac7-3c70-4fca-85b1-9ee4d9dd4568\") " pod="metallb-system/speaker-x4jbl" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.834042 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/91c83ac7-3c70-4fca-85b1-9ee4d9dd4568-memberlist\") pod \"speaker-x4jbl\" (UID: \"91c83ac7-3c70-4fca-85b1-9ee4d9dd4568\") " pod="metallb-system/speaker-x4jbl" Oct 08 20:57:02 crc kubenswrapper[4669]: E1008 20:57:02.834179 4669 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 08 20:57:02 crc kubenswrapper[4669]: E1008 20:57:02.834255 4669 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/91c83ac7-3c70-4fca-85b1-9ee4d9dd4568-memberlist podName:91c83ac7-3c70-4fca-85b1-9ee4d9dd4568 nodeName:}" failed. No retries permitted until 2025-10-08 20:57:03.834238968 +0000 UTC m=+743.527049641 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/91c83ac7-3c70-4fca-85b1-9ee4d9dd4568-memberlist") pod "speaker-x4jbl" (UID: "91c83ac7-3c70-4fca-85b1-9ee4d9dd4568") : secret "metallb-memberlist" not found Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.836723 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/91c83ac7-3c70-4fca-85b1-9ee4d9dd4568-metrics-certs\") pod \"speaker-x4jbl\" (UID: \"91c83ac7-3c70-4fca-85b1-9ee4d9dd4568\") " pod="metallb-system/speaker-x4jbl" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.936041 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80c87dc9-d3cb-455c-b0bc-c9ae5a0cead6-metrics-certs\") pod \"controller-68d546b9d8-jlbrt\" (UID: \"80c87dc9-d3cb-455c-b0bc-c9ae5a0cead6\") " pod="metallb-system/controller-68d546b9d8-jlbrt" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.939906 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80c87dc9-d3cb-455c-b0bc-c9ae5a0cead6-metrics-certs\") pod \"controller-68d546b9d8-jlbrt\" (UID: \"80c87dc9-d3cb-455c-b0bc-c9ae5a0cead6\") " pod="metallb-system/controller-68d546b9d8-jlbrt" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.951812 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-jq9jv" Oct 08 20:57:02 crc kubenswrapper[4669]: I1008 20:57:02.962658 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-gwhr7" Oct 08 20:57:03 crc kubenswrapper[4669]: I1008 20:57:03.176871 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-68d546b9d8-jlbrt" Oct 08 20:57:03 crc kubenswrapper[4669]: I1008 20:57:03.228592 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-gwhr7"] Oct 08 20:57:03 crc kubenswrapper[4669]: I1008 20:57:03.235082 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jq9jv" event={"ID":"1de0da7b-1591-4af2-bac9-241428020fe9","Type":"ContainerStarted","Data":"ac8020c9a8508a7831b72bb4733a34166a38b28d02e38435412049520301fa90"} Oct 08 20:57:03 crc kubenswrapper[4669]: W1008 20:57:03.238750 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc74dd47d_12ee_4626_a02f_e8dc07f26791.slice/crio-f9c61677ba5c4fc0db945f2e3210a181b7189c134d5c2dbd7274243221861a10 WatchSource:0}: Error finding container f9c61677ba5c4fc0db945f2e3210a181b7189c134d5c2dbd7274243221861a10: Status 404 returned error can't find the container with id f9c61677ba5c4fc0db945f2e3210a181b7189c134d5c2dbd7274243221861a10 Oct 08 20:57:03 crc kubenswrapper[4669]: I1008 20:57:03.366772 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-jlbrt"] Oct 08 20:57:03 crc kubenswrapper[4669]: W1008 20:57:03.373648 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80c87dc9_d3cb_455c_b0bc_c9ae5a0cead6.slice/crio-638ac9fad3ba0ae6d27647e482934aa6d01dd6245986551b8a645b612729adf9 WatchSource:0}: Error finding container 638ac9fad3ba0ae6d27647e482934aa6d01dd6245986551b8a645b612729adf9: Status 404 returned error can't find the container with id 
638ac9fad3ba0ae6d27647e482934aa6d01dd6245986551b8a645b612729adf9 Oct 08 20:57:03 crc kubenswrapper[4669]: I1008 20:57:03.849415 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/91c83ac7-3c70-4fca-85b1-9ee4d9dd4568-memberlist\") pod \"speaker-x4jbl\" (UID: \"91c83ac7-3c70-4fca-85b1-9ee4d9dd4568\") " pod="metallb-system/speaker-x4jbl" Oct 08 20:57:03 crc kubenswrapper[4669]: I1008 20:57:03.856977 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/91c83ac7-3c70-4fca-85b1-9ee4d9dd4568-memberlist\") pod \"speaker-x4jbl\" (UID: \"91c83ac7-3c70-4fca-85b1-9ee4d9dd4568\") " pod="metallb-system/speaker-x4jbl" Oct 08 20:57:04 crc kubenswrapper[4669]: I1008 20:57:04.013695 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-x4jbl" Oct 08 20:57:04 crc kubenswrapper[4669]: W1008 20:57:04.036783 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91c83ac7_3c70_4fca_85b1_9ee4d9dd4568.slice/crio-da4e4639ea8c7cd9fcb084d6b53e63f6d23456cca141b83a6e9aa6c258c0f8f3 WatchSource:0}: Error finding container da4e4639ea8c7cd9fcb084d6b53e63f6d23456cca141b83a6e9aa6c258c0f8f3: Status 404 returned error can't find the container with id da4e4639ea8c7cd9fcb084d6b53e63f6d23456cca141b83a6e9aa6c258c0f8f3 Oct 08 20:57:04 crc kubenswrapper[4669]: I1008 20:57:04.241219 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-x4jbl" event={"ID":"91c83ac7-3c70-4fca-85b1-9ee4d9dd4568","Type":"ContainerStarted","Data":"da4e4639ea8c7cd9fcb084d6b53e63f6d23456cca141b83a6e9aa6c258c0f8f3"} Oct 08 20:57:04 crc kubenswrapper[4669]: I1008 20:57:04.242041 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-gwhr7" 
event={"ID":"c74dd47d-12ee-4626-a02f-e8dc07f26791","Type":"ContainerStarted","Data":"f9c61677ba5c4fc0db945f2e3210a181b7189c134d5c2dbd7274243221861a10"} Oct 08 20:57:04 crc kubenswrapper[4669]: I1008 20:57:04.243337 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-jlbrt" event={"ID":"80c87dc9-d3cb-455c-b0bc-c9ae5a0cead6","Type":"ContainerStarted","Data":"6593354d92e87962abf99e3d11e3a57b8d4c5ddaf4ecb3f62ab76d31e1ade141"} Oct 08 20:57:04 crc kubenswrapper[4669]: I1008 20:57:04.243365 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-jlbrt" event={"ID":"80c87dc9-d3cb-455c-b0bc-c9ae5a0cead6","Type":"ContainerStarted","Data":"aedeae6c5f0bb90045d24ba04279b9188f2c5327b9fb89c4c502d4548ccb6493"} Oct 08 20:57:04 crc kubenswrapper[4669]: I1008 20:57:04.243379 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-jlbrt" event={"ID":"80c87dc9-d3cb-455c-b0bc-c9ae5a0cead6","Type":"ContainerStarted","Data":"638ac9fad3ba0ae6d27647e482934aa6d01dd6245986551b8a645b612729adf9"} Oct 08 20:57:04 crc kubenswrapper[4669]: I1008 20:57:04.277513 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-jlbrt" podStartSLOduration=2.27749288 podStartE2EDuration="2.27749288s" podCreationTimestamp="2025-10-08 20:57:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:57:04.27418715 +0000 UTC m=+743.966997833" watchObservedRunningTime="2025-10-08 20:57:04.27749288 +0000 UTC m=+743.970303553" Oct 08 20:57:05 crc kubenswrapper[4669]: I1008 20:57:05.273434 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-x4jbl" event={"ID":"91c83ac7-3c70-4fca-85b1-9ee4d9dd4568","Type":"ContainerStarted","Data":"447c435093dc5fd2d37c582a3bab0c06d81df5a252741bfeb748298afea48db2"} Oct 08 20:57:05 
crc kubenswrapper[4669]: I1008 20:57:05.273703 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-jlbrt" Oct 08 20:57:05 crc kubenswrapper[4669]: I1008 20:57:05.273716 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-x4jbl" event={"ID":"91c83ac7-3c70-4fca-85b1-9ee4d9dd4568","Type":"ContainerStarted","Data":"f5e8a2bbb8e689d0ea0b733c88c29f5a50c77ac0028e0985846ec0cae20c3355"} Oct 08 20:57:05 crc kubenswrapper[4669]: I1008 20:57:05.273745 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-x4jbl" Oct 08 20:57:07 crc kubenswrapper[4669]: I1008 20:57:07.283962 4669 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 08 20:57:11 crc kubenswrapper[4669]: I1008 20:57:11.312751 4669 generic.go:334] "Generic (PLEG): container finished" podID="1de0da7b-1591-4af2-bac9-241428020fe9" containerID="3de73eb45fc9e6912c7fd28eacaf56f3407fa89a3d2211f35a6d17fc2739c3eb" exitCode=0 Oct 08 20:57:11 crc kubenswrapper[4669]: I1008 20:57:11.312875 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jq9jv" event={"ID":"1de0da7b-1591-4af2-bac9-241428020fe9","Type":"ContainerDied","Data":"3de73eb45fc9e6912c7fd28eacaf56f3407fa89a3d2211f35a6d17fc2739c3eb"} Oct 08 20:57:11 crc kubenswrapper[4669]: I1008 20:57:11.316000 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-gwhr7" event={"ID":"c74dd47d-12ee-4626-a02f-e8dc07f26791","Type":"ContainerStarted","Data":"2a75c26a275b370ea576ae2723b555db3535d90053a2a07146d37c4e12bcba97"} Oct 08 20:57:11 crc kubenswrapper[4669]: I1008 20:57:11.316229 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-gwhr7" Oct 08 20:57:11 crc kubenswrapper[4669]: I1008 20:57:11.352871 4669 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-x4jbl" podStartSLOduration=9.35284818 podStartE2EDuration="9.35284818s" podCreationTimestamp="2025-10-08 20:57:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:57:05.293216774 +0000 UTC m=+744.986027467" watchObservedRunningTime="2025-10-08 20:57:11.35284818 +0000 UTC m=+751.045658883" Oct 08 20:57:11 crc kubenswrapper[4669]: I1008 20:57:11.401931 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-gwhr7" podStartSLOduration=1.9891060760000001 podStartE2EDuration="9.401902248s" podCreationTimestamp="2025-10-08 20:57:02 +0000 UTC" firstStartedPulling="2025-10-08 20:57:03.242083987 +0000 UTC m=+742.934894660" lastFinishedPulling="2025-10-08 20:57:10.654880129 +0000 UTC m=+750.347690832" observedRunningTime="2025-10-08 20:57:11.39505657 +0000 UTC m=+751.087867283" watchObservedRunningTime="2025-10-08 20:57:11.401902248 +0000 UTC m=+751.094712971" Oct 08 20:57:12 crc kubenswrapper[4669]: I1008 20:57:12.328858 4669 generic.go:334] "Generic (PLEG): container finished" podID="1de0da7b-1591-4af2-bac9-241428020fe9" containerID="01191a63d4de5c68814b1ed9e2bd7d0ee3a05e4c85d1e78b1b3dd165b04f34a1" exitCode=0 Oct 08 20:57:12 crc kubenswrapper[4669]: I1008 20:57:12.329452 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jq9jv" event={"ID":"1de0da7b-1591-4af2-bac9-241428020fe9","Type":"ContainerDied","Data":"01191a63d4de5c68814b1ed9e2bd7d0ee3a05e4c85d1e78b1b3dd165b04f34a1"} Oct 08 20:57:13 crc kubenswrapper[4669]: I1008 20:57:13.181147 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-jlbrt" Oct 08 20:57:13 crc kubenswrapper[4669]: I1008 20:57:13.186095 4669 patch_prober.go:28] interesting 
pod/machine-config-daemon-hw2kf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 20:57:13 crc kubenswrapper[4669]: I1008 20:57:13.186178 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 20:57:13 crc kubenswrapper[4669]: I1008 20:57:13.186253 4669 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" Oct 08 20:57:13 crc kubenswrapper[4669]: I1008 20:57:13.187004 4669 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4988e2f99ae9422660aeb112dbeb7f72ef85e0a64c0c7a60db05121da7a422d0"} pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 20:57:13 crc kubenswrapper[4669]: I1008 20:57:13.187099 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" containerID="cri-o://4988e2f99ae9422660aeb112dbeb7f72ef85e0a64c0c7a60db05121da7a422d0" gracePeriod=600 Oct 08 20:57:13 crc kubenswrapper[4669]: I1008 20:57:13.344474 4669 generic.go:334] "Generic (PLEG): container finished" podID="1de0da7b-1591-4af2-bac9-241428020fe9" containerID="381204e4a7d0f8732ad7952e833622bd262108db50b5be1fff27f1a45a677b08" exitCode=0 Oct 08 20:57:13 crc kubenswrapper[4669]: I1008 20:57:13.344509 
4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jq9jv" event={"ID":"1de0da7b-1591-4af2-bac9-241428020fe9","Type":"ContainerDied","Data":"381204e4a7d0f8732ad7952e833622bd262108db50b5be1fff27f1a45a677b08"} Oct 08 20:57:14 crc kubenswrapper[4669]: I1008 20:57:14.019008 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-x4jbl" Oct 08 20:57:14 crc kubenswrapper[4669]: I1008 20:57:14.358440 4669 generic.go:334] "Generic (PLEG): container finished" podID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerID="4988e2f99ae9422660aeb112dbeb7f72ef85e0a64c0c7a60db05121da7a422d0" exitCode=0 Oct 08 20:57:14 crc kubenswrapper[4669]: I1008 20:57:14.358514 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" event={"ID":"39c9bcf2-9580-4534-8c7e-886bd4aff469","Type":"ContainerDied","Data":"4988e2f99ae9422660aeb112dbeb7f72ef85e0a64c0c7a60db05121da7a422d0"} Oct 08 20:57:14 crc kubenswrapper[4669]: I1008 20:57:14.358701 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" event={"ID":"39c9bcf2-9580-4534-8c7e-886bd4aff469","Type":"ContainerStarted","Data":"e212469b959f799f6dd101756cbc798d4bd5c61d90207df29dcf3db6ccbd05d1"} Oct 08 20:57:14 crc kubenswrapper[4669]: I1008 20:57:14.358723 4669 scope.go:117] "RemoveContainer" containerID="c6d89efc3b8d912824669f2434ad38318f78ba91caa1db76769d7947e2583b0f" Oct 08 20:57:14 crc kubenswrapper[4669]: I1008 20:57:14.367247 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jq9jv" event={"ID":"1de0da7b-1591-4af2-bac9-241428020fe9","Type":"ContainerStarted","Data":"df0d73a76bc99de4e789511fc2334ee55c5b8c2a4d863eb3936f89aa1623eb9c"} Oct 08 20:57:14 crc kubenswrapper[4669]: I1008 20:57:14.367280 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jq9jv" 
event={"ID":"1de0da7b-1591-4af2-bac9-241428020fe9","Type":"ContainerStarted","Data":"1cd52c976c69ad76b47acf39e2fba582fdb1530cbc083294e957cff85a9c7beb"} Oct 08 20:57:14 crc kubenswrapper[4669]: I1008 20:57:14.367288 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jq9jv" event={"ID":"1de0da7b-1591-4af2-bac9-241428020fe9","Type":"ContainerStarted","Data":"4f6ae8b3abc9482a95c6af6c50d7c591fb6a1bdc52911e57a267922fe4e2f6f9"} Oct 08 20:57:14 crc kubenswrapper[4669]: I1008 20:57:14.367298 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jq9jv" event={"ID":"1de0da7b-1591-4af2-bac9-241428020fe9","Type":"ContainerStarted","Data":"644b6681fa38c02dffe6452bee186ca6d94d3ef44c3cc95ebf46a63498d8cb4a"} Oct 08 20:57:14 crc kubenswrapper[4669]: I1008 20:57:14.367307 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jq9jv" event={"ID":"1de0da7b-1591-4af2-bac9-241428020fe9","Type":"ContainerStarted","Data":"670d971ec17c1220d472e5f2d55f51bc94f9ee816c3f31b791187896aaca2f57"} Oct 08 20:57:15 crc kubenswrapper[4669]: I1008 20:57:15.400247 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jq9jv" event={"ID":"1de0da7b-1591-4af2-bac9-241428020fe9","Type":"ContainerStarted","Data":"1235df847764d3634bbf0911ead79894448a6bd76c2f368cb2957ec1a946e325"} Oct 08 20:57:15 crc kubenswrapper[4669]: I1008 20:57:15.400817 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-jq9jv" Oct 08 20:57:15 crc kubenswrapper[4669]: I1008 20:57:15.429120 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-jq9jv" podStartSLOduration=5.969733608 podStartE2EDuration="13.429103641s" podCreationTimestamp="2025-10-08 20:57:02 +0000 UTC" firstStartedPulling="2025-10-08 20:57:03.214705784 +0000 UTC m=+742.907516457" lastFinishedPulling="2025-10-08 20:57:10.674075777 +0000 UTC m=+750.366886490" 
observedRunningTime="2025-10-08 20:57:15.42728794 +0000 UTC m=+755.120098653" watchObservedRunningTime="2025-10-08 20:57:15.429103641 +0000 UTC m=+755.121914324" Oct 08 20:57:16 crc kubenswrapper[4669]: I1008 20:57:16.871100 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-txfxj"] Oct 08 20:57:16 crc kubenswrapper[4669]: I1008 20:57:16.872921 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-txfxj" Oct 08 20:57:16 crc kubenswrapper[4669]: I1008 20:57:16.876561 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 08 20:57:16 crc kubenswrapper[4669]: I1008 20:57:16.876679 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 08 20:57:16 crc kubenswrapper[4669]: I1008 20:57:16.923280 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-txfxj"] Oct 08 20:57:16 crc kubenswrapper[4669]: I1008 20:57:16.929291 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgpjl\" (UniqueName: \"kubernetes.io/projected/9cb27ddd-5007-40bd-9bef-7d80a6d44de8-kube-api-access-wgpjl\") pod \"openstack-operator-index-txfxj\" (UID: \"9cb27ddd-5007-40bd-9bef-7d80a6d44de8\") " pod="openstack-operators/openstack-operator-index-txfxj" Oct 08 20:57:17 crc kubenswrapper[4669]: I1008 20:57:17.030101 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgpjl\" (UniqueName: \"kubernetes.io/projected/9cb27ddd-5007-40bd-9bef-7d80a6d44de8-kube-api-access-wgpjl\") pod \"openstack-operator-index-txfxj\" (UID: \"9cb27ddd-5007-40bd-9bef-7d80a6d44de8\") " pod="openstack-operators/openstack-operator-index-txfxj" Oct 08 20:57:17 crc kubenswrapper[4669]: I1008 20:57:17.055987 4669 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgpjl\" (UniqueName: \"kubernetes.io/projected/9cb27ddd-5007-40bd-9bef-7d80a6d44de8-kube-api-access-wgpjl\") pod \"openstack-operator-index-txfxj\" (UID: \"9cb27ddd-5007-40bd-9bef-7d80a6d44de8\") " pod="openstack-operators/openstack-operator-index-txfxj" Oct 08 20:57:17 crc kubenswrapper[4669]: I1008 20:57:17.191790 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-txfxj" Oct 08 20:57:17 crc kubenswrapper[4669]: I1008 20:57:17.633643 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-txfxj"] Oct 08 20:57:17 crc kubenswrapper[4669]: I1008 20:57:17.953470 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-jq9jv" Oct 08 20:57:18 crc kubenswrapper[4669]: I1008 20:57:18.047637 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-jq9jv" Oct 08 20:57:18 crc kubenswrapper[4669]: I1008 20:57:18.419362 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-txfxj" event={"ID":"9cb27ddd-5007-40bd-9bef-7d80a6d44de8","Type":"ContainerStarted","Data":"10a018fdfaa79de59f499ecb7d5eb3e874ff7d6ca9c321b96a2038de7b311fca"} Oct 08 20:57:21 crc kubenswrapper[4669]: I1008 20:57:21.218810 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-txfxj"] Oct 08 20:57:21 crc kubenswrapper[4669]: I1008 20:57:21.443437 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-txfxj" event={"ID":"9cb27ddd-5007-40bd-9bef-7d80a6d44de8","Type":"ContainerStarted","Data":"d5035bdf32c2fb192387e197b217d66be4b83cbbf1c2ce060303fb5659701295"} Oct 08 20:57:21 crc kubenswrapper[4669]: I1008 20:57:21.469340 4669 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack-operators/openstack-operator-index-txfxj" podStartSLOduration=2.32740746 podStartE2EDuration="5.469288982s" podCreationTimestamp="2025-10-08 20:57:16 +0000 UTC" firstStartedPulling="2025-10-08 20:57:17.633099669 +0000 UTC m=+757.325910352" lastFinishedPulling="2025-10-08 20:57:20.774981191 +0000 UTC m=+760.467791874" observedRunningTime="2025-10-08 20:57:21.463653117 +0000 UTC m=+761.156463880" watchObservedRunningTime="2025-10-08 20:57:21.469288982 +0000 UTC m=+761.162099695" Oct 08 20:57:21 crc kubenswrapper[4669]: I1008 20:57:21.840260 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pzvc5"] Oct 08 20:57:21 crc kubenswrapper[4669]: I1008 20:57:21.842312 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pzvc5" Oct 08 20:57:21 crc kubenswrapper[4669]: I1008 20:57:21.859350 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pzvc5"] Oct 08 20:57:21 crc kubenswrapper[4669]: I1008 20:57:21.997762 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d035d7e-48d7-4662-b62e-2ae23d54c8db-utilities\") pod \"community-operators-pzvc5\" (UID: \"9d035d7e-48d7-4662-b62e-2ae23d54c8db\") " pod="openshift-marketplace/community-operators-pzvc5" Oct 08 20:57:21 crc kubenswrapper[4669]: I1008 20:57:21.997822 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j628\" (UniqueName: \"kubernetes.io/projected/9d035d7e-48d7-4662-b62e-2ae23d54c8db-kube-api-access-8j628\") pod \"community-operators-pzvc5\" (UID: \"9d035d7e-48d7-4662-b62e-2ae23d54c8db\") " pod="openshift-marketplace/community-operators-pzvc5" Oct 08 20:57:21 crc kubenswrapper[4669]: I1008 20:57:21.997914 4669 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d035d7e-48d7-4662-b62e-2ae23d54c8db-catalog-content\") pod \"community-operators-pzvc5\" (UID: \"9d035d7e-48d7-4662-b62e-2ae23d54c8db\") " pod="openshift-marketplace/community-operators-pzvc5" Oct 08 20:57:22 crc kubenswrapper[4669]: I1008 20:57:22.024029 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-hd2hf"] Oct 08 20:57:22 crc kubenswrapper[4669]: I1008 20:57:22.025116 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-hd2hf" Oct 08 20:57:22 crc kubenswrapper[4669]: I1008 20:57:22.027395 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-q9bdg" Oct 08 20:57:22 crc kubenswrapper[4669]: I1008 20:57:22.033062 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-hd2hf"] Oct 08 20:57:22 crc kubenswrapper[4669]: I1008 20:57:22.098865 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d035d7e-48d7-4662-b62e-2ae23d54c8db-utilities\") pod \"community-operators-pzvc5\" (UID: \"9d035d7e-48d7-4662-b62e-2ae23d54c8db\") " pod="openshift-marketplace/community-operators-pzvc5" Oct 08 20:57:22 crc kubenswrapper[4669]: I1008 20:57:22.098927 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8j628\" (UniqueName: \"kubernetes.io/projected/9d035d7e-48d7-4662-b62e-2ae23d54c8db-kube-api-access-8j628\") pod \"community-operators-pzvc5\" (UID: \"9d035d7e-48d7-4662-b62e-2ae23d54c8db\") " pod="openshift-marketplace/community-operators-pzvc5" Oct 08 20:57:22 crc kubenswrapper[4669]: I1008 20:57:22.098979 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-2565l\" (UniqueName: \"kubernetes.io/projected/40ec54d4-3186-4a11-a533-b7edc48914b3-kube-api-access-2565l\") pod \"openstack-operator-index-hd2hf\" (UID: \"40ec54d4-3186-4a11-a533-b7edc48914b3\") " pod="openstack-operators/openstack-operator-index-hd2hf" Oct 08 20:57:22 crc kubenswrapper[4669]: I1008 20:57:22.099043 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d035d7e-48d7-4662-b62e-2ae23d54c8db-catalog-content\") pod \"community-operators-pzvc5\" (UID: \"9d035d7e-48d7-4662-b62e-2ae23d54c8db\") " pod="openshift-marketplace/community-operators-pzvc5" Oct 08 20:57:22 crc kubenswrapper[4669]: I1008 20:57:22.099303 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d035d7e-48d7-4662-b62e-2ae23d54c8db-utilities\") pod \"community-operators-pzvc5\" (UID: \"9d035d7e-48d7-4662-b62e-2ae23d54c8db\") " pod="openshift-marketplace/community-operators-pzvc5" Oct 08 20:57:22 crc kubenswrapper[4669]: I1008 20:57:22.099421 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d035d7e-48d7-4662-b62e-2ae23d54c8db-catalog-content\") pod \"community-operators-pzvc5\" (UID: \"9d035d7e-48d7-4662-b62e-2ae23d54c8db\") " pod="openshift-marketplace/community-operators-pzvc5" Oct 08 20:57:22 crc kubenswrapper[4669]: I1008 20:57:22.124100 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8j628\" (UniqueName: \"kubernetes.io/projected/9d035d7e-48d7-4662-b62e-2ae23d54c8db-kube-api-access-8j628\") pod \"community-operators-pzvc5\" (UID: \"9d035d7e-48d7-4662-b62e-2ae23d54c8db\") " pod="openshift-marketplace/community-operators-pzvc5" Oct 08 20:57:22 crc kubenswrapper[4669]: I1008 20:57:22.185956 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pzvc5" Oct 08 20:57:22 crc kubenswrapper[4669]: I1008 20:57:22.199923 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2565l\" (UniqueName: \"kubernetes.io/projected/40ec54d4-3186-4a11-a533-b7edc48914b3-kube-api-access-2565l\") pod \"openstack-operator-index-hd2hf\" (UID: \"40ec54d4-3186-4a11-a533-b7edc48914b3\") " pod="openstack-operators/openstack-operator-index-hd2hf" Oct 08 20:57:22 crc kubenswrapper[4669]: I1008 20:57:22.216463 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2565l\" (UniqueName: \"kubernetes.io/projected/40ec54d4-3186-4a11-a533-b7edc48914b3-kube-api-access-2565l\") pod \"openstack-operator-index-hd2hf\" (UID: \"40ec54d4-3186-4a11-a533-b7edc48914b3\") " pod="openstack-operators/openstack-operator-index-hd2hf" Oct 08 20:57:22 crc kubenswrapper[4669]: I1008 20:57:22.342321 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-hd2hf" Oct 08 20:57:22 crc kubenswrapper[4669]: I1008 20:57:22.450733 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-txfxj" podUID="9cb27ddd-5007-40bd-9bef-7d80a6d44de8" containerName="registry-server" containerID="cri-o://d5035bdf32c2fb192387e197b217d66be4b83cbbf1c2ce060303fb5659701295" gracePeriod=2 Oct 08 20:57:22 crc kubenswrapper[4669]: I1008 20:57:22.672932 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pzvc5"] Oct 08 20:57:22 crc kubenswrapper[4669]: W1008 20:57:22.674807 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d035d7e_48d7_4662_b62e_2ae23d54c8db.slice/crio-5aaa10db7f349205a5d78941e001d7530da8c0f7d96ed2a9607dae6aa4693bd4 WatchSource:0}: Error finding container 5aaa10db7f349205a5d78941e001d7530da8c0f7d96ed2a9607dae6aa4693bd4: Status 404 returned error can't find the container with id 5aaa10db7f349205a5d78941e001d7530da8c0f7d96ed2a9607dae6aa4693bd4 Oct 08 20:57:22 crc kubenswrapper[4669]: I1008 20:57:22.761509 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-hd2hf"] Oct 08 20:57:22 crc kubenswrapper[4669]: W1008 20:57:22.767821 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40ec54d4_3186_4a11_a533_b7edc48914b3.slice/crio-70d7222162eb57295446315e5bd2b4b9af8843ec4f00332402d04f166c8c36e8 WatchSource:0}: Error finding container 70d7222162eb57295446315e5bd2b4b9af8843ec4f00332402d04f166c8c36e8: Status 404 returned error can't find the container with id 70d7222162eb57295446315e5bd2b4b9af8843ec4f00332402d04f166c8c36e8 Oct 08 20:57:22 crc kubenswrapper[4669]: I1008 20:57:22.970271 4669 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-gwhr7" Oct 08 20:57:22 crc kubenswrapper[4669]: I1008 20:57:22.998725 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-txfxj" Oct 08 20:57:23 crc kubenswrapper[4669]: I1008 20:57:23.016660 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgpjl\" (UniqueName: \"kubernetes.io/projected/9cb27ddd-5007-40bd-9bef-7d80a6d44de8-kube-api-access-wgpjl\") pod \"9cb27ddd-5007-40bd-9bef-7d80a6d44de8\" (UID: \"9cb27ddd-5007-40bd-9bef-7d80a6d44de8\") " Oct 08 20:57:23 crc kubenswrapper[4669]: I1008 20:57:23.034688 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cb27ddd-5007-40bd-9bef-7d80a6d44de8-kube-api-access-wgpjl" (OuterVolumeSpecName: "kube-api-access-wgpjl") pod "9cb27ddd-5007-40bd-9bef-7d80a6d44de8" (UID: "9cb27ddd-5007-40bd-9bef-7d80a6d44de8"). InnerVolumeSpecName "kube-api-access-wgpjl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:57:23 crc kubenswrapper[4669]: I1008 20:57:23.117803 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgpjl\" (UniqueName: \"kubernetes.io/projected/9cb27ddd-5007-40bd-9bef-7d80a6d44de8-kube-api-access-wgpjl\") on node \"crc\" DevicePath \"\"" Oct 08 20:57:23 crc kubenswrapper[4669]: I1008 20:57:23.459748 4669 generic.go:334] "Generic (PLEG): container finished" podID="9cb27ddd-5007-40bd-9bef-7d80a6d44de8" containerID="d5035bdf32c2fb192387e197b217d66be4b83cbbf1c2ce060303fb5659701295" exitCode=0 Oct 08 20:57:23 crc kubenswrapper[4669]: I1008 20:57:23.459857 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-txfxj" Oct 08 20:57:23 crc kubenswrapper[4669]: I1008 20:57:23.459874 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-txfxj" event={"ID":"9cb27ddd-5007-40bd-9bef-7d80a6d44de8","Type":"ContainerDied","Data":"d5035bdf32c2fb192387e197b217d66be4b83cbbf1c2ce060303fb5659701295"} Oct 08 20:57:23 crc kubenswrapper[4669]: I1008 20:57:23.460512 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-txfxj" event={"ID":"9cb27ddd-5007-40bd-9bef-7d80a6d44de8","Type":"ContainerDied","Data":"10a018fdfaa79de59f499ecb7d5eb3e874ff7d6ca9c321b96a2038de7b311fca"} Oct 08 20:57:23 crc kubenswrapper[4669]: I1008 20:57:23.460568 4669 scope.go:117] "RemoveContainer" containerID="d5035bdf32c2fb192387e197b217d66be4b83cbbf1c2ce060303fb5659701295" Oct 08 20:57:23 crc kubenswrapper[4669]: I1008 20:57:23.466152 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hd2hf" event={"ID":"40ec54d4-3186-4a11-a533-b7edc48914b3","Type":"ContainerStarted","Data":"228a952ad03249af40439bf2f3bc22b137a953f2dcab46d87f1883aabd48b0ac"} Oct 08 20:57:23 crc kubenswrapper[4669]: I1008 20:57:23.466216 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hd2hf" event={"ID":"40ec54d4-3186-4a11-a533-b7edc48914b3","Type":"ContainerStarted","Data":"70d7222162eb57295446315e5bd2b4b9af8843ec4f00332402d04f166c8c36e8"} Oct 08 20:57:23 crc kubenswrapper[4669]: I1008 20:57:23.471894 4669 generic.go:334] "Generic (PLEG): container finished" podID="9d035d7e-48d7-4662-b62e-2ae23d54c8db" containerID="2816ae880fa482ec9e84a34be5dba068052ad53555d86eae1b0448ec624e8517" exitCode=0 Oct 08 20:57:23 crc kubenswrapper[4669]: I1008 20:57:23.472559 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzvc5" 
event={"ID":"9d035d7e-48d7-4662-b62e-2ae23d54c8db","Type":"ContainerDied","Data":"2816ae880fa482ec9e84a34be5dba068052ad53555d86eae1b0448ec624e8517"} Oct 08 20:57:23 crc kubenswrapper[4669]: I1008 20:57:23.472632 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzvc5" event={"ID":"9d035d7e-48d7-4662-b62e-2ae23d54c8db","Type":"ContainerStarted","Data":"5aaa10db7f349205a5d78941e001d7530da8c0f7d96ed2a9607dae6aa4693bd4"} Oct 08 20:57:23 crc kubenswrapper[4669]: I1008 20:57:23.503157 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-txfxj"] Oct 08 20:57:23 crc kubenswrapper[4669]: I1008 20:57:23.506060 4669 scope.go:117] "RemoveContainer" containerID="d5035bdf32c2fb192387e197b217d66be4b83cbbf1c2ce060303fb5659701295" Oct 08 20:57:23 crc kubenswrapper[4669]: E1008 20:57:23.507014 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5035bdf32c2fb192387e197b217d66be4b83cbbf1c2ce060303fb5659701295\": container with ID starting with d5035bdf32c2fb192387e197b217d66be4b83cbbf1c2ce060303fb5659701295 not found: ID does not exist" containerID="d5035bdf32c2fb192387e197b217d66be4b83cbbf1c2ce060303fb5659701295" Oct 08 20:57:23 crc kubenswrapper[4669]: I1008 20:57:23.507097 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5035bdf32c2fb192387e197b217d66be4b83cbbf1c2ce060303fb5659701295"} err="failed to get container status \"d5035bdf32c2fb192387e197b217d66be4b83cbbf1c2ce060303fb5659701295\": rpc error: code = NotFound desc = could not find container \"d5035bdf32c2fb192387e197b217d66be4b83cbbf1c2ce060303fb5659701295\": container with ID starting with d5035bdf32c2fb192387e197b217d66be4b83cbbf1c2ce060303fb5659701295 not found: ID does not exist" Oct 08 20:57:23 crc kubenswrapper[4669]: I1008 20:57:23.512595 4669 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack-operators/openstack-operator-index-txfxj"] Oct 08 20:57:23 crc kubenswrapper[4669]: I1008 20:57:23.515571 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-hd2hf" podStartSLOduration=1.461735547 podStartE2EDuration="1.515551736s" podCreationTimestamp="2025-10-08 20:57:22 +0000 UTC" firstStartedPulling="2025-10-08 20:57:22.773201805 +0000 UTC m=+762.466012488" lastFinishedPulling="2025-10-08 20:57:22.827017994 +0000 UTC m=+762.519828677" observedRunningTime="2025-10-08 20:57:23.511259057 +0000 UTC m=+763.204069730" watchObservedRunningTime="2025-10-08 20:57:23.515551736 +0000 UTC m=+763.208362429" Oct 08 20:57:25 crc kubenswrapper[4669]: I1008 20:57:25.343949 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cb27ddd-5007-40bd-9bef-7d80a6d44de8" path="/var/lib/kubelet/pods/9cb27ddd-5007-40bd-9bef-7d80a6d44de8/volumes" Oct 08 20:57:25 crc kubenswrapper[4669]: I1008 20:57:25.830969 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jjfnk"] Oct 08 20:57:25 crc kubenswrapper[4669]: E1008 20:57:25.831217 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cb27ddd-5007-40bd-9bef-7d80a6d44de8" containerName="registry-server" Oct 08 20:57:25 crc kubenswrapper[4669]: I1008 20:57:25.831228 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cb27ddd-5007-40bd-9bef-7d80a6d44de8" containerName="registry-server" Oct 08 20:57:25 crc kubenswrapper[4669]: I1008 20:57:25.831350 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cb27ddd-5007-40bd-9bef-7d80a6d44de8" containerName="registry-server" Oct 08 20:57:25 crc kubenswrapper[4669]: I1008 20:57:25.832073 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jjfnk" Oct 08 20:57:25 crc kubenswrapper[4669]: I1008 20:57:25.844290 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jjfnk"] Oct 08 20:57:25 crc kubenswrapper[4669]: I1008 20:57:25.857149 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa6b1b66-95ad-45a2-a0ea-ab15dde11143-utilities\") pod \"redhat-operators-jjfnk\" (UID: \"fa6b1b66-95ad-45a2-a0ea-ab15dde11143\") " pod="openshift-marketplace/redhat-operators-jjfnk" Oct 08 20:57:25 crc kubenswrapper[4669]: I1008 20:57:25.857422 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r9nw\" (UniqueName: \"kubernetes.io/projected/fa6b1b66-95ad-45a2-a0ea-ab15dde11143-kube-api-access-7r9nw\") pod \"redhat-operators-jjfnk\" (UID: \"fa6b1b66-95ad-45a2-a0ea-ab15dde11143\") " pod="openshift-marketplace/redhat-operators-jjfnk" Oct 08 20:57:25 crc kubenswrapper[4669]: I1008 20:57:25.857574 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa6b1b66-95ad-45a2-a0ea-ab15dde11143-catalog-content\") pod \"redhat-operators-jjfnk\" (UID: \"fa6b1b66-95ad-45a2-a0ea-ab15dde11143\") " pod="openshift-marketplace/redhat-operators-jjfnk" Oct 08 20:57:25 crc kubenswrapper[4669]: I1008 20:57:25.958429 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa6b1b66-95ad-45a2-a0ea-ab15dde11143-catalog-content\") pod \"redhat-operators-jjfnk\" (UID: \"fa6b1b66-95ad-45a2-a0ea-ab15dde11143\") " pod="openshift-marketplace/redhat-operators-jjfnk" Oct 08 20:57:25 crc kubenswrapper[4669]: I1008 20:57:25.958518 4669 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa6b1b66-95ad-45a2-a0ea-ab15dde11143-utilities\") pod \"redhat-operators-jjfnk\" (UID: \"fa6b1b66-95ad-45a2-a0ea-ab15dde11143\") " pod="openshift-marketplace/redhat-operators-jjfnk" Oct 08 20:57:25 crc kubenswrapper[4669]: I1008 20:57:25.958604 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7r9nw\" (UniqueName: \"kubernetes.io/projected/fa6b1b66-95ad-45a2-a0ea-ab15dde11143-kube-api-access-7r9nw\") pod \"redhat-operators-jjfnk\" (UID: \"fa6b1b66-95ad-45a2-a0ea-ab15dde11143\") " pod="openshift-marketplace/redhat-operators-jjfnk" Oct 08 20:57:25 crc kubenswrapper[4669]: I1008 20:57:25.958995 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa6b1b66-95ad-45a2-a0ea-ab15dde11143-catalog-content\") pod \"redhat-operators-jjfnk\" (UID: \"fa6b1b66-95ad-45a2-a0ea-ab15dde11143\") " pod="openshift-marketplace/redhat-operators-jjfnk" Oct 08 20:57:25 crc kubenswrapper[4669]: I1008 20:57:25.959285 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa6b1b66-95ad-45a2-a0ea-ab15dde11143-utilities\") pod \"redhat-operators-jjfnk\" (UID: \"fa6b1b66-95ad-45a2-a0ea-ab15dde11143\") " pod="openshift-marketplace/redhat-operators-jjfnk" Oct 08 20:57:25 crc kubenswrapper[4669]: I1008 20:57:25.983814 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r9nw\" (UniqueName: \"kubernetes.io/projected/fa6b1b66-95ad-45a2-a0ea-ab15dde11143-kube-api-access-7r9nw\") pod \"redhat-operators-jjfnk\" (UID: \"fa6b1b66-95ad-45a2-a0ea-ab15dde11143\") " pod="openshift-marketplace/redhat-operators-jjfnk" Oct 08 20:57:26 crc kubenswrapper[4669]: I1008 20:57:26.152234 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jjfnk"
Oct 08 20:57:26 crc kubenswrapper[4669]: I1008 20:57:26.490945 4669 generic.go:334] "Generic (PLEG): container finished" podID="9d035d7e-48d7-4662-b62e-2ae23d54c8db" containerID="a80de1c62fdebe0dec773a6c6760b3b4981cd3936902e376142a69dd2c09d56d" exitCode=0
Oct 08 20:57:26 crc kubenswrapper[4669]: I1008 20:57:26.490998 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzvc5" event={"ID":"9d035d7e-48d7-4662-b62e-2ae23d54c8db","Type":"ContainerDied","Data":"a80de1c62fdebe0dec773a6c6760b3b4981cd3936902e376142a69dd2c09d56d"}
Oct 08 20:57:26 crc kubenswrapper[4669]: I1008 20:57:26.557595 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jjfnk"]
Oct 08 20:57:26 crc kubenswrapper[4669]: W1008 20:57:26.561579 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa6b1b66_95ad_45a2_a0ea_ab15dde11143.slice/crio-baae500655bcffbb1b7f50c128cfc1883868f6fdb784840c23f0e6f6be23acd1 WatchSource:0}: Error finding container baae500655bcffbb1b7f50c128cfc1883868f6fdb784840c23f0e6f6be23acd1: Status 404 returned error can't find the container with id baae500655bcffbb1b7f50c128cfc1883868f6fdb784840c23f0e6f6be23acd1
Oct 08 20:57:27 crc kubenswrapper[4669]: I1008 20:57:27.499614 4669 generic.go:334] "Generic (PLEG): container finished" podID="fa6b1b66-95ad-45a2-a0ea-ab15dde11143" containerID="ede1a814402bde7cb7b670f0c322f299963e022ec3dd9a446ce0713453cbdb0c" exitCode=0
Oct 08 20:57:27 crc kubenswrapper[4669]: I1008 20:57:27.499685 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jjfnk" event={"ID":"fa6b1b66-95ad-45a2-a0ea-ab15dde11143","Type":"ContainerDied","Data":"ede1a814402bde7cb7b670f0c322f299963e022ec3dd9a446ce0713453cbdb0c"}
Oct 08 20:57:27 crc kubenswrapper[4669]: I1008 20:57:27.499999 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jjfnk" event={"ID":"fa6b1b66-95ad-45a2-a0ea-ab15dde11143","Type":"ContainerStarted","Data":"baae500655bcffbb1b7f50c128cfc1883868f6fdb784840c23f0e6f6be23acd1"}
Oct 08 20:57:27 crc kubenswrapper[4669]: I1008 20:57:27.503116 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzvc5" event={"ID":"9d035d7e-48d7-4662-b62e-2ae23d54c8db","Type":"ContainerStarted","Data":"499c0f731e0c44f45c531f061b46c809ccb49c684b46d9173bce07872864b3cc"}
Oct 08 20:57:27 crc kubenswrapper[4669]: I1008 20:57:27.544862 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pzvc5" podStartSLOduration=3.08739274 podStartE2EDuration="6.544844135s" podCreationTimestamp="2025-10-08 20:57:21 +0000 UTC" firstStartedPulling="2025-10-08 20:57:23.47386983 +0000 UTC m=+763.166680543" lastFinishedPulling="2025-10-08 20:57:26.931321275 +0000 UTC m=+766.624131938" observedRunningTime="2025-10-08 20:57:27.541125714 +0000 UTC m=+767.233936397" watchObservedRunningTime="2025-10-08 20:57:27.544844135 +0000 UTC m=+767.237654818"
Oct 08 20:57:28 crc kubenswrapper[4669]: I1008 20:57:28.514567 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jjfnk" event={"ID":"fa6b1b66-95ad-45a2-a0ea-ab15dde11143","Type":"ContainerStarted","Data":"6b1461e3ff5dce4fd6e6103c9b08e8db53279560f8bbd38409ab752f5e4ed236"}
Oct 08 20:57:29 crc kubenswrapper[4669]: I1008 20:57:29.522207 4669 generic.go:334] "Generic (PLEG): container finished" podID="fa6b1b66-95ad-45a2-a0ea-ab15dde11143" containerID="6b1461e3ff5dce4fd6e6103c9b08e8db53279560f8bbd38409ab752f5e4ed236" exitCode=0
Oct 08 20:57:29 crc kubenswrapper[4669]: I1008 20:57:29.522255 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jjfnk" event={"ID":"fa6b1b66-95ad-45a2-a0ea-ab15dde11143","Type":"ContainerDied","Data":"6b1461e3ff5dce4fd6e6103c9b08e8db53279560f8bbd38409ab752f5e4ed236"}
Oct 08 20:57:30 crc kubenswrapper[4669]: I1008 20:57:30.531446 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jjfnk" event={"ID":"fa6b1b66-95ad-45a2-a0ea-ab15dde11143","Type":"ContainerStarted","Data":"4fc88075add89b7b1f9cec84aa48f32baccd75cb8c88a533c340b69bdd195a24"}
Oct 08 20:57:30 crc kubenswrapper[4669]: I1008 20:57:30.556868 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jjfnk" podStartSLOduration=3.123706943 podStartE2EDuration="5.556839669s" podCreationTimestamp="2025-10-08 20:57:25 +0000 UTC" firstStartedPulling="2025-10-08 20:57:27.501293048 +0000 UTC m=+767.194103721" lastFinishedPulling="2025-10-08 20:57:29.934425774 +0000 UTC m=+769.627236447" observedRunningTime="2025-10-08 20:57:30.553495997 +0000 UTC m=+770.246306680" watchObservedRunningTime="2025-10-08 20:57:30.556839669 +0000 UTC m=+770.249650342"
Oct 08 20:57:32 crc kubenswrapper[4669]: I1008 20:57:32.186606 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pzvc5"
Oct 08 20:57:32 crc kubenswrapper[4669]: I1008 20:57:32.186679 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pzvc5"
Oct 08 20:57:32 crc kubenswrapper[4669]: I1008 20:57:32.250422 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pzvc5"
Oct 08 20:57:32 crc kubenswrapper[4669]: I1008 20:57:32.342569 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-hd2hf"
Oct 08 20:57:32 crc kubenswrapper[4669]: I1008 20:57:32.342895 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-hd2hf"
Oct 08 20:57:32 crc kubenswrapper[4669]: I1008 20:57:32.389441 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-hd2hf"
Oct 08 20:57:32 crc kubenswrapper[4669]: I1008 20:57:32.589272 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-hd2hf"
Oct 08 20:57:32 crc kubenswrapper[4669]: I1008 20:57:32.610787 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pzvc5"
Oct 08 20:57:32 crc kubenswrapper[4669]: I1008 20:57:32.956829 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-jq9jv"
Oct 08 20:57:36 crc kubenswrapper[4669]: I1008 20:57:36.153220 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jjfnk"
Oct 08 20:57:36 crc kubenswrapper[4669]: I1008 20:57:36.153821 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jjfnk"
Oct 08 20:57:36 crc kubenswrapper[4669]: I1008 20:57:36.205434 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jjfnk"
Oct 08 20:57:36 crc kubenswrapper[4669]: I1008 20:57:36.640696 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jjfnk"
Oct 08 20:57:37 crc kubenswrapper[4669]: I1008 20:57:37.085148 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/7a759e5b31ccacf77cec3bae7360d28b9da09ce26b4a078cf08f610eefvtpmh"]
Oct 08 20:57:37 crc kubenswrapper[4669]: I1008 20:57:37.086972 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7a759e5b31ccacf77cec3bae7360d28b9da09ce26b4a078cf08f610eefvtpmh"
Oct 08 20:57:37 crc kubenswrapper[4669]: I1008 20:57:37.096072 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-tp72r"
Oct 08 20:57:37 crc kubenswrapper[4669]: I1008 20:57:37.102432 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7a759e5b31ccacf77cec3bae7360d28b9da09ce26b4a078cf08f610eefvtpmh"]
Oct 08 20:57:37 crc kubenswrapper[4669]: I1008 20:57:37.215610 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pzvc5"]
Oct 08 20:57:37 crc kubenswrapper[4669]: I1008 20:57:37.215975 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pzvc5" podUID="9d035d7e-48d7-4662-b62e-2ae23d54c8db" containerName="registry-server" containerID="cri-o://499c0f731e0c44f45c531f061b46c809ccb49c684b46d9173bce07872864b3cc" gracePeriod=2
Oct 08 20:57:37 crc kubenswrapper[4669]: I1008 20:57:37.237359 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4pg5\" (UniqueName: \"kubernetes.io/projected/a6266c66-d403-46e9-ad99-54beedd6adf4-kube-api-access-h4pg5\") pod \"7a759e5b31ccacf77cec3bae7360d28b9da09ce26b4a078cf08f610eefvtpmh\" (UID: \"a6266c66-d403-46e9-ad99-54beedd6adf4\") " pod="openstack-operators/7a759e5b31ccacf77cec3bae7360d28b9da09ce26b4a078cf08f610eefvtpmh"
Oct 08 20:57:37 crc kubenswrapper[4669]: I1008 20:57:37.237425 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a6266c66-d403-46e9-ad99-54beedd6adf4-util\") pod \"7a759e5b31ccacf77cec3bae7360d28b9da09ce26b4a078cf08f610eefvtpmh\" (UID: \"a6266c66-d403-46e9-ad99-54beedd6adf4\") " pod="openstack-operators/7a759e5b31ccacf77cec3bae7360d28b9da09ce26b4a078cf08f610eefvtpmh"
Oct 08 20:57:37 crc kubenswrapper[4669]: I1008 20:57:37.237474 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a6266c66-d403-46e9-ad99-54beedd6adf4-bundle\") pod \"7a759e5b31ccacf77cec3bae7360d28b9da09ce26b4a078cf08f610eefvtpmh\" (UID: \"a6266c66-d403-46e9-ad99-54beedd6adf4\") " pod="openstack-operators/7a759e5b31ccacf77cec3bae7360d28b9da09ce26b4a078cf08f610eefvtpmh"
Oct 08 20:57:37 crc kubenswrapper[4669]: I1008 20:57:37.338476 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a6266c66-d403-46e9-ad99-54beedd6adf4-bundle\") pod \"7a759e5b31ccacf77cec3bae7360d28b9da09ce26b4a078cf08f610eefvtpmh\" (UID: \"a6266c66-d403-46e9-ad99-54beedd6adf4\") " pod="openstack-operators/7a759e5b31ccacf77cec3bae7360d28b9da09ce26b4a078cf08f610eefvtpmh"
Oct 08 20:57:37 crc kubenswrapper[4669]: I1008 20:57:37.338663 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4pg5\" (UniqueName: \"kubernetes.io/projected/a6266c66-d403-46e9-ad99-54beedd6adf4-kube-api-access-h4pg5\") pod \"7a759e5b31ccacf77cec3bae7360d28b9da09ce26b4a078cf08f610eefvtpmh\" (UID: \"a6266c66-d403-46e9-ad99-54beedd6adf4\") " pod="openstack-operators/7a759e5b31ccacf77cec3bae7360d28b9da09ce26b4a078cf08f610eefvtpmh"
Oct 08 20:57:37 crc kubenswrapper[4669]: I1008 20:57:37.338727 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a6266c66-d403-46e9-ad99-54beedd6adf4-util\") pod \"7a759e5b31ccacf77cec3bae7360d28b9da09ce26b4a078cf08f610eefvtpmh\" (UID: \"a6266c66-d403-46e9-ad99-54beedd6adf4\") " pod="openstack-operators/7a759e5b31ccacf77cec3bae7360d28b9da09ce26b4a078cf08f610eefvtpmh"
Oct 08 20:57:37 crc kubenswrapper[4669]: I1008 20:57:37.338987 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a6266c66-d403-46e9-ad99-54beedd6adf4-bundle\") pod \"7a759e5b31ccacf77cec3bae7360d28b9da09ce26b4a078cf08f610eefvtpmh\" (UID: \"a6266c66-d403-46e9-ad99-54beedd6adf4\") " pod="openstack-operators/7a759e5b31ccacf77cec3bae7360d28b9da09ce26b4a078cf08f610eefvtpmh"
Oct 08 20:57:37 crc kubenswrapper[4669]: I1008 20:57:37.339393 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a6266c66-d403-46e9-ad99-54beedd6adf4-util\") pod \"7a759e5b31ccacf77cec3bae7360d28b9da09ce26b4a078cf08f610eefvtpmh\" (UID: \"a6266c66-d403-46e9-ad99-54beedd6adf4\") " pod="openstack-operators/7a759e5b31ccacf77cec3bae7360d28b9da09ce26b4a078cf08f610eefvtpmh"
Oct 08 20:57:37 crc kubenswrapper[4669]: I1008 20:57:37.359325 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4pg5\" (UniqueName: \"kubernetes.io/projected/a6266c66-d403-46e9-ad99-54beedd6adf4-kube-api-access-h4pg5\") pod \"7a759e5b31ccacf77cec3bae7360d28b9da09ce26b4a078cf08f610eefvtpmh\" (UID: \"a6266c66-d403-46e9-ad99-54beedd6adf4\") " pod="openstack-operators/7a759e5b31ccacf77cec3bae7360d28b9da09ce26b4a078cf08f610eefvtpmh"
Oct 08 20:57:37 crc kubenswrapper[4669]: I1008 20:57:37.411541 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7a759e5b31ccacf77cec3bae7360d28b9da09ce26b4a078cf08f610eefvtpmh"
Oct 08 20:57:37 crc kubenswrapper[4669]: I1008 20:57:37.606158 4669 generic.go:334] "Generic (PLEG): container finished" podID="9d035d7e-48d7-4662-b62e-2ae23d54c8db" containerID="499c0f731e0c44f45c531f061b46c809ccb49c684b46d9173bce07872864b3cc" exitCode=0
Oct 08 20:57:37 crc kubenswrapper[4669]: I1008 20:57:37.606613 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzvc5" event={"ID":"9d035d7e-48d7-4662-b62e-2ae23d54c8db","Type":"ContainerDied","Data":"499c0f731e0c44f45c531f061b46c809ccb49c684b46d9173bce07872864b3cc"}
Oct 08 20:57:37 crc kubenswrapper[4669]: I1008 20:57:37.618112 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jjfnk"]
Oct 08 20:57:37 crc kubenswrapper[4669]: I1008 20:57:37.720591 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pzvc5"
Oct 08 20:57:37 crc kubenswrapper[4669]: I1008 20:57:37.747860 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d035d7e-48d7-4662-b62e-2ae23d54c8db-utilities\") pod \"9d035d7e-48d7-4662-b62e-2ae23d54c8db\" (UID: \"9d035d7e-48d7-4662-b62e-2ae23d54c8db\") "
Oct 08 20:57:37 crc kubenswrapper[4669]: I1008 20:57:37.747928 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8j628\" (UniqueName: \"kubernetes.io/projected/9d035d7e-48d7-4662-b62e-2ae23d54c8db-kube-api-access-8j628\") pod \"9d035d7e-48d7-4662-b62e-2ae23d54c8db\" (UID: \"9d035d7e-48d7-4662-b62e-2ae23d54c8db\") "
Oct 08 20:57:37 crc kubenswrapper[4669]: I1008 20:57:37.748131 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d035d7e-48d7-4662-b62e-2ae23d54c8db-catalog-content\") pod \"9d035d7e-48d7-4662-b62e-2ae23d54c8db\" (UID: \"9d035d7e-48d7-4662-b62e-2ae23d54c8db\") "
Oct 08 20:57:37 crc kubenswrapper[4669]: I1008 20:57:37.748734 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d035d7e-48d7-4662-b62e-2ae23d54c8db-utilities" (OuterVolumeSpecName: "utilities") pod "9d035d7e-48d7-4662-b62e-2ae23d54c8db" (UID: "9d035d7e-48d7-4662-b62e-2ae23d54c8db"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 20:57:37 crc kubenswrapper[4669]: I1008 20:57:37.751961 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d035d7e-48d7-4662-b62e-2ae23d54c8db-kube-api-access-8j628" (OuterVolumeSpecName: "kube-api-access-8j628") pod "9d035d7e-48d7-4662-b62e-2ae23d54c8db" (UID: "9d035d7e-48d7-4662-b62e-2ae23d54c8db"). InnerVolumeSpecName "kube-api-access-8j628". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 20:57:37 crc kubenswrapper[4669]: I1008 20:57:37.812878 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d035d7e-48d7-4662-b62e-2ae23d54c8db-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9d035d7e-48d7-4662-b62e-2ae23d54c8db" (UID: "9d035d7e-48d7-4662-b62e-2ae23d54c8db"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 20:57:37 crc kubenswrapper[4669]: I1008 20:57:37.849461 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d035d7e-48d7-4662-b62e-2ae23d54c8db-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 08 20:57:37 crc kubenswrapper[4669]: I1008 20:57:37.849506 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d035d7e-48d7-4662-b62e-2ae23d54c8db-utilities\") on node \"crc\" DevicePath \"\""
Oct 08 20:57:37 crc kubenswrapper[4669]: I1008 20:57:37.849516 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8j628\" (UniqueName: \"kubernetes.io/projected/9d035d7e-48d7-4662-b62e-2ae23d54c8db-kube-api-access-8j628\") on node \"crc\" DevicePath \"\""
Oct 08 20:57:37 crc kubenswrapper[4669]: I1008 20:57:37.901673 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7a759e5b31ccacf77cec3bae7360d28b9da09ce26b4a078cf08f610eefvtpmh"]
Oct 08 20:57:38 crc kubenswrapper[4669]: I1008 20:57:38.614125 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzvc5" event={"ID":"9d035d7e-48d7-4662-b62e-2ae23d54c8db","Type":"ContainerDied","Data":"5aaa10db7f349205a5d78941e001d7530da8c0f7d96ed2a9607dae6aa4693bd4"}
Oct 08 20:57:38 crc kubenswrapper[4669]: I1008 20:57:38.614495 4669 scope.go:117] "RemoveContainer" containerID="499c0f731e0c44f45c531f061b46c809ccb49c684b46d9173bce07872864b3cc"
Oct 08 20:57:38 crc kubenswrapper[4669]: I1008 20:57:38.614151 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pzvc5"
Oct 08 20:57:38 crc kubenswrapper[4669]: I1008 20:57:38.616117 4669 generic.go:334] "Generic (PLEG): container finished" podID="a6266c66-d403-46e9-ad99-54beedd6adf4" containerID="ad9cf36e44ea3210d4291efa0051252b0614f3332c7c6d253fe2fc2819701f3b" exitCode=0
Oct 08 20:57:38 crc kubenswrapper[4669]: I1008 20:57:38.616358 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jjfnk" podUID="fa6b1b66-95ad-45a2-a0ea-ab15dde11143" containerName="registry-server" containerID="cri-o://4fc88075add89b7b1f9cec84aa48f32baccd75cb8c88a533c340b69bdd195a24" gracePeriod=2
Oct 08 20:57:38 crc kubenswrapper[4669]: I1008 20:57:38.616450 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7a759e5b31ccacf77cec3bae7360d28b9da09ce26b4a078cf08f610eefvtpmh" event={"ID":"a6266c66-d403-46e9-ad99-54beedd6adf4","Type":"ContainerDied","Data":"ad9cf36e44ea3210d4291efa0051252b0614f3332c7c6d253fe2fc2819701f3b"}
Oct 08 20:57:38 crc kubenswrapper[4669]: I1008 20:57:38.616483 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7a759e5b31ccacf77cec3bae7360d28b9da09ce26b4a078cf08f610eefvtpmh" event={"ID":"a6266c66-d403-46e9-ad99-54beedd6adf4","Type":"ContainerStarted","Data":"45f47d34b2d9b41509ed0af392394757bb3d11cdd6d7e1ab69d078ebfa1f2920"}
Oct 08 20:57:38 crc kubenswrapper[4669]: I1008 20:57:38.644724 4669 scope.go:117] "RemoveContainer" containerID="a80de1c62fdebe0dec773a6c6760b3b4981cd3936902e376142a69dd2c09d56d"
Oct 08 20:57:38 crc kubenswrapper[4669]: I1008 20:57:38.664110 4669 scope.go:117] "RemoveContainer" containerID="2816ae880fa482ec9e84a34be5dba068052ad53555d86eae1b0448ec624e8517"
Oct 08 20:57:38 crc kubenswrapper[4669]: I1008 20:57:38.666152 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pzvc5"]
Oct 08 20:57:38 crc kubenswrapper[4669]: I1008 20:57:38.673356 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pzvc5"]
Oct 08 20:57:39 crc kubenswrapper[4669]: I1008 20:57:39.047013 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jjfnk"
Oct 08 20:57:39 crc kubenswrapper[4669]: I1008 20:57:39.164135 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa6b1b66-95ad-45a2-a0ea-ab15dde11143-utilities\") pod \"fa6b1b66-95ad-45a2-a0ea-ab15dde11143\" (UID: \"fa6b1b66-95ad-45a2-a0ea-ab15dde11143\") "
Oct 08 20:57:39 crc kubenswrapper[4669]: I1008 20:57:39.164262 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa6b1b66-95ad-45a2-a0ea-ab15dde11143-catalog-content\") pod \"fa6b1b66-95ad-45a2-a0ea-ab15dde11143\" (UID: \"fa6b1b66-95ad-45a2-a0ea-ab15dde11143\") "
Oct 08 20:57:39 crc kubenswrapper[4669]: I1008 20:57:39.164350 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7r9nw\" (UniqueName: \"kubernetes.io/projected/fa6b1b66-95ad-45a2-a0ea-ab15dde11143-kube-api-access-7r9nw\") pod \"fa6b1b66-95ad-45a2-a0ea-ab15dde11143\" (UID: \"fa6b1b66-95ad-45a2-a0ea-ab15dde11143\") "
Oct 08 20:57:39 crc kubenswrapper[4669]: I1008 20:57:39.165919 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa6b1b66-95ad-45a2-a0ea-ab15dde11143-utilities" (OuterVolumeSpecName: "utilities") pod "fa6b1b66-95ad-45a2-a0ea-ab15dde11143" (UID: "fa6b1b66-95ad-45a2-a0ea-ab15dde11143"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 20:57:39 crc kubenswrapper[4669]: I1008 20:57:39.170876 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa6b1b66-95ad-45a2-a0ea-ab15dde11143-kube-api-access-7r9nw" (OuterVolumeSpecName: "kube-api-access-7r9nw") pod "fa6b1b66-95ad-45a2-a0ea-ab15dde11143" (UID: "fa6b1b66-95ad-45a2-a0ea-ab15dde11143"). InnerVolumeSpecName "kube-api-access-7r9nw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 20:57:39 crc kubenswrapper[4669]: I1008 20:57:39.266465 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa6b1b66-95ad-45a2-a0ea-ab15dde11143-utilities\") on node \"crc\" DevicePath \"\""
Oct 08 20:57:39 crc kubenswrapper[4669]: I1008 20:57:39.266496 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7r9nw\" (UniqueName: \"kubernetes.io/projected/fa6b1b66-95ad-45a2-a0ea-ab15dde11143-kube-api-access-7r9nw\") on node \"crc\" DevicePath \"\""
Oct 08 20:57:39 crc kubenswrapper[4669]: I1008 20:57:39.338991 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d035d7e-48d7-4662-b62e-2ae23d54c8db" path="/var/lib/kubelet/pods/9d035d7e-48d7-4662-b62e-2ae23d54c8db/volumes"
Oct 08 20:57:39 crc kubenswrapper[4669]: I1008 20:57:39.627976 4669 generic.go:334] "Generic (PLEG): container finished" podID="fa6b1b66-95ad-45a2-a0ea-ab15dde11143" containerID="4fc88075add89b7b1f9cec84aa48f32baccd75cb8c88a533c340b69bdd195a24" exitCode=0
Oct 08 20:57:39 crc kubenswrapper[4669]: I1008 20:57:39.628012 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jjfnk" event={"ID":"fa6b1b66-95ad-45a2-a0ea-ab15dde11143","Type":"ContainerDied","Data":"4fc88075add89b7b1f9cec84aa48f32baccd75cb8c88a533c340b69bdd195a24"}
Oct 08 20:57:39 crc kubenswrapper[4669]: I1008 20:57:39.628020 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jjfnk"
Oct 08 20:57:39 crc kubenswrapper[4669]: I1008 20:57:39.628803 4669 scope.go:117] "RemoveContainer" containerID="4fc88075add89b7b1f9cec84aa48f32baccd75cb8c88a533c340b69bdd195a24"
Oct 08 20:57:39 crc kubenswrapper[4669]: I1008 20:57:39.629064 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jjfnk" event={"ID":"fa6b1b66-95ad-45a2-a0ea-ab15dde11143","Type":"ContainerDied","Data":"baae500655bcffbb1b7f50c128cfc1883868f6fdb784840c23f0e6f6be23acd1"}
Oct 08 20:57:39 crc kubenswrapper[4669]: I1008 20:57:39.632388 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7a759e5b31ccacf77cec3bae7360d28b9da09ce26b4a078cf08f610eefvtpmh" event={"ID":"a6266c66-d403-46e9-ad99-54beedd6adf4","Type":"ContainerStarted","Data":"217284afd758d581b091c85d9d06af25f3d379dcffa70a641558e498203970e0"}
Oct 08 20:57:39 crc kubenswrapper[4669]: I1008 20:57:39.661811 4669 scope.go:117] "RemoveContainer" containerID="6b1461e3ff5dce4fd6e6103c9b08e8db53279560f8bbd38409ab752f5e4ed236"
Oct 08 20:57:39 crc kubenswrapper[4669]: I1008 20:57:39.684399 4669 scope.go:117] "RemoveContainer" containerID="ede1a814402bde7cb7b670f0c322f299963e022ec3dd9a446ce0713453cbdb0c"
Oct 08 20:57:39 crc kubenswrapper[4669]: I1008 20:57:39.727588 4669 scope.go:117] "RemoveContainer" containerID="4fc88075add89b7b1f9cec84aa48f32baccd75cb8c88a533c340b69bdd195a24"
Oct 08 20:57:39 crc kubenswrapper[4669]: E1008 20:57:39.728126 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fc88075add89b7b1f9cec84aa48f32baccd75cb8c88a533c340b69bdd195a24\": container with ID starting with 4fc88075add89b7b1f9cec84aa48f32baccd75cb8c88a533c340b69bdd195a24 not found: ID does not exist" containerID="4fc88075add89b7b1f9cec84aa48f32baccd75cb8c88a533c340b69bdd195a24"
Oct 08 20:57:39 crc kubenswrapper[4669]: I1008 20:57:39.728187 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fc88075add89b7b1f9cec84aa48f32baccd75cb8c88a533c340b69bdd195a24"} err="failed to get container status \"4fc88075add89b7b1f9cec84aa48f32baccd75cb8c88a533c340b69bdd195a24\": rpc error: code = NotFound desc = could not find container \"4fc88075add89b7b1f9cec84aa48f32baccd75cb8c88a533c340b69bdd195a24\": container with ID starting with 4fc88075add89b7b1f9cec84aa48f32baccd75cb8c88a533c340b69bdd195a24 not found: ID does not exist"
Oct 08 20:57:39 crc kubenswrapper[4669]: I1008 20:57:39.728227 4669 scope.go:117] "RemoveContainer" containerID="6b1461e3ff5dce4fd6e6103c9b08e8db53279560f8bbd38409ab752f5e4ed236"
Oct 08 20:57:39 crc kubenswrapper[4669]: E1008 20:57:39.728757 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b1461e3ff5dce4fd6e6103c9b08e8db53279560f8bbd38409ab752f5e4ed236\": container with ID starting with 6b1461e3ff5dce4fd6e6103c9b08e8db53279560f8bbd38409ab752f5e4ed236 not found: ID does not exist" containerID="6b1461e3ff5dce4fd6e6103c9b08e8db53279560f8bbd38409ab752f5e4ed236"
Oct 08 20:57:39 crc kubenswrapper[4669]: I1008 20:57:39.728798 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b1461e3ff5dce4fd6e6103c9b08e8db53279560f8bbd38409ab752f5e4ed236"} err="failed to get container status \"6b1461e3ff5dce4fd6e6103c9b08e8db53279560f8bbd38409ab752f5e4ed236\": rpc error: code = NotFound desc = could not find container \"6b1461e3ff5dce4fd6e6103c9b08e8db53279560f8bbd38409ab752f5e4ed236\": container with ID starting with 6b1461e3ff5dce4fd6e6103c9b08e8db53279560f8bbd38409ab752f5e4ed236 not found: ID does not exist"
Oct 08 20:57:39 crc kubenswrapper[4669]: I1008 20:57:39.728823 4669 scope.go:117] "RemoveContainer" containerID="ede1a814402bde7cb7b670f0c322f299963e022ec3dd9a446ce0713453cbdb0c"
Oct 08 20:57:39 crc kubenswrapper[4669]: E1008 20:57:39.729106 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ede1a814402bde7cb7b670f0c322f299963e022ec3dd9a446ce0713453cbdb0c\": container with ID starting with ede1a814402bde7cb7b670f0c322f299963e022ec3dd9a446ce0713453cbdb0c not found: ID does not exist" containerID="ede1a814402bde7cb7b670f0c322f299963e022ec3dd9a446ce0713453cbdb0c"
Oct 08 20:57:39 crc kubenswrapper[4669]: I1008 20:57:39.729154 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ede1a814402bde7cb7b670f0c322f299963e022ec3dd9a446ce0713453cbdb0c"} err="failed to get container status \"ede1a814402bde7cb7b670f0c322f299963e022ec3dd9a446ce0713453cbdb0c\": rpc error: code = NotFound desc = could not find container \"ede1a814402bde7cb7b670f0c322f299963e022ec3dd9a446ce0713453cbdb0c\": container with ID starting with ede1a814402bde7cb7b670f0c322f299963e022ec3dd9a446ce0713453cbdb0c not found: ID does not exist"
Oct 08 20:57:40 crc kubenswrapper[4669]: I1008 20:57:40.586041 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa6b1b66-95ad-45a2-a0ea-ab15dde11143-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fa6b1b66-95ad-45a2-a0ea-ab15dde11143" (UID: "fa6b1b66-95ad-45a2-a0ea-ab15dde11143"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 20:57:40 crc kubenswrapper[4669]: I1008 20:57:40.641578 4669 generic.go:334] "Generic (PLEG): container finished" podID="a6266c66-d403-46e9-ad99-54beedd6adf4" containerID="217284afd758d581b091c85d9d06af25f3d379dcffa70a641558e498203970e0" exitCode=0
Oct 08 20:57:40 crc kubenswrapper[4669]: I1008 20:57:40.641623 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7a759e5b31ccacf77cec3bae7360d28b9da09ce26b4a078cf08f610eefvtpmh" event={"ID":"a6266c66-d403-46e9-ad99-54beedd6adf4","Type":"ContainerDied","Data":"217284afd758d581b091c85d9d06af25f3d379dcffa70a641558e498203970e0"}
Oct 08 20:57:40 crc kubenswrapper[4669]: E1008 20:57:40.652279 4669 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6266c66_d403_46e9_ad99_54beedd6adf4.slice/crio-conmon-217284afd758d581b091c85d9d06af25f3d379dcffa70a641558e498203970e0.scope\": RecentStats: unable to find data in memory cache]"
Oct 08 20:57:40 crc kubenswrapper[4669]: I1008 20:57:40.686453 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa6b1b66-95ad-45a2-a0ea-ab15dde11143-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 08 20:57:40 crc kubenswrapper[4669]: I1008 20:57:40.859411 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jjfnk"]
Oct 08 20:57:40 crc kubenswrapper[4669]: I1008 20:57:40.865281 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jjfnk"]
Oct 08 20:57:41 crc kubenswrapper[4669]: I1008 20:57:41.347701 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa6b1b66-95ad-45a2-a0ea-ab15dde11143" path="/var/lib/kubelet/pods/fa6b1b66-95ad-45a2-a0ea-ab15dde11143/volumes"
Oct 08 20:57:41 crc kubenswrapper[4669]: I1008 20:57:41.653333 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7a759e5b31ccacf77cec3bae7360d28b9da09ce26b4a078cf08f610eefvtpmh" event={"ID":"a6266c66-d403-46e9-ad99-54beedd6adf4","Type":"ContainerDied","Data":"5a8202b874c25e7bb1ac6dc5806bccada6b4e3d83583a155b5c5d0075dcc092c"}
Oct 08 20:57:41 crc kubenswrapper[4669]: I1008 20:57:41.653172 4669 generic.go:334] "Generic (PLEG): container finished" podID="a6266c66-d403-46e9-ad99-54beedd6adf4" containerID="5a8202b874c25e7bb1ac6dc5806bccada6b4e3d83583a155b5c5d0075dcc092c" exitCode=0
Oct 08 20:57:42 crc kubenswrapper[4669]: I1008 20:57:42.964795 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7a759e5b31ccacf77cec3bae7360d28b9da09ce26b4a078cf08f610eefvtpmh"
Oct 08 20:57:43 crc kubenswrapper[4669]: I1008 20:57:43.119800 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a6266c66-d403-46e9-ad99-54beedd6adf4-bundle\") pod \"a6266c66-d403-46e9-ad99-54beedd6adf4\" (UID: \"a6266c66-d403-46e9-ad99-54beedd6adf4\") "
Oct 08 20:57:43 crc kubenswrapper[4669]: I1008 20:57:43.119882 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a6266c66-d403-46e9-ad99-54beedd6adf4-util\") pod \"a6266c66-d403-46e9-ad99-54beedd6adf4\" (UID: \"a6266c66-d403-46e9-ad99-54beedd6adf4\") "
Oct 08 20:57:43 crc kubenswrapper[4669]: I1008 20:57:43.119999 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4pg5\" (UniqueName: \"kubernetes.io/projected/a6266c66-d403-46e9-ad99-54beedd6adf4-kube-api-access-h4pg5\") pod \"a6266c66-d403-46e9-ad99-54beedd6adf4\" (UID: \"a6266c66-d403-46e9-ad99-54beedd6adf4\") "
Oct 08 20:57:43 crc kubenswrapper[4669]: I1008 20:57:43.121211 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6266c66-d403-46e9-ad99-54beedd6adf4-bundle" (OuterVolumeSpecName: "bundle") pod "a6266c66-d403-46e9-ad99-54beedd6adf4" (UID: "a6266c66-d403-46e9-ad99-54beedd6adf4"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 20:57:43 crc kubenswrapper[4669]: I1008 20:57:43.127468 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6266c66-d403-46e9-ad99-54beedd6adf4-kube-api-access-h4pg5" (OuterVolumeSpecName: "kube-api-access-h4pg5") pod "a6266c66-d403-46e9-ad99-54beedd6adf4" (UID: "a6266c66-d403-46e9-ad99-54beedd6adf4"). InnerVolumeSpecName "kube-api-access-h4pg5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 20:57:43 crc kubenswrapper[4669]: I1008 20:57:43.139411 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6266c66-d403-46e9-ad99-54beedd6adf4-util" (OuterVolumeSpecName: "util") pod "a6266c66-d403-46e9-ad99-54beedd6adf4" (UID: "a6266c66-d403-46e9-ad99-54beedd6adf4"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 20:57:43 crc kubenswrapper[4669]: I1008 20:57:43.221460 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4pg5\" (UniqueName: \"kubernetes.io/projected/a6266c66-d403-46e9-ad99-54beedd6adf4-kube-api-access-h4pg5\") on node \"crc\" DevicePath \"\""
Oct 08 20:57:43 crc kubenswrapper[4669]: I1008 20:57:43.221510 4669 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a6266c66-d403-46e9-ad99-54beedd6adf4-bundle\") on node \"crc\" DevicePath \"\""
Oct 08 20:57:43 crc kubenswrapper[4669]: I1008 20:57:43.221558 4669 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a6266c66-d403-46e9-ad99-54beedd6adf4-util\") on node \"crc\" DevicePath \"\""
Oct 08 20:57:43 crc kubenswrapper[4669]: I1008 20:57:43.669927 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7a759e5b31ccacf77cec3bae7360d28b9da09ce26b4a078cf08f610eefvtpmh" event={"ID":"a6266c66-d403-46e9-ad99-54beedd6adf4","Type":"ContainerDied","Data":"45f47d34b2d9b41509ed0af392394757bb3d11cdd6d7e1ab69d078ebfa1f2920"}
Oct 08 20:57:43 crc kubenswrapper[4669]: I1008 20:57:43.670306 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45f47d34b2d9b41509ed0af392394757bb3d11cdd6d7e1ab69d078ebfa1f2920"
Oct 08 20:57:43 crc kubenswrapper[4669]: I1008 20:57:43.670051 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7a759e5b31ccacf77cec3bae7360d28b9da09ce26b4a078cf08f610eefvtpmh"
Oct 08 20:57:44 crc kubenswrapper[4669]: I1008 20:57:44.230275 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cjdtc"]
Oct 08 20:57:44 crc kubenswrapper[4669]: E1008 20:57:44.230479 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6266c66-d403-46e9-ad99-54beedd6adf4" containerName="pull"
Oct 08 20:57:44 crc kubenswrapper[4669]: I1008 20:57:44.230490 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6266c66-d403-46e9-ad99-54beedd6adf4" containerName="pull"
Oct 08 20:57:44 crc kubenswrapper[4669]: E1008 20:57:44.230498 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa6b1b66-95ad-45a2-a0ea-ab15dde11143" containerName="extract-utilities"
Oct 08 20:57:44 crc kubenswrapper[4669]: I1008 20:57:44.230505 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa6b1b66-95ad-45a2-a0ea-ab15dde11143" containerName="extract-utilities"
Oct 08 20:57:44 crc kubenswrapper[4669]: E1008 20:57:44.230517 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6266c66-d403-46e9-ad99-54beedd6adf4" containerName="util"
Oct 08 20:57:44 crc kubenswrapper[4669]: I1008 20:57:44.230522 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6266c66-d403-46e9-ad99-54beedd6adf4" containerName="util"
Oct 08 20:57:44 crc kubenswrapper[4669]: E1008 20:57:44.230558 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa6b1b66-95ad-45a2-a0ea-ab15dde11143" containerName="registry-server"
Oct 08 20:57:44 crc kubenswrapper[4669]: I1008 20:57:44.230564 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa6b1b66-95ad-45a2-a0ea-ab15dde11143" containerName="registry-server"
Oct 08 20:57:44 crc kubenswrapper[4669]: E1008 20:57:44.230572 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d035d7e-48d7-4662-b62e-2ae23d54c8db" containerName="extract-utilities"
Oct 08 20:57:44 crc kubenswrapper[4669]: I1008 20:57:44.230577 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d035d7e-48d7-4662-b62e-2ae23d54c8db" containerName="extract-utilities"
Oct 08 20:57:44 crc kubenswrapper[4669]: E1008 20:57:44.230586 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa6b1b66-95ad-45a2-a0ea-ab15dde11143" containerName="extract-content"
Oct 08 20:57:44 crc kubenswrapper[4669]: I1008 20:57:44.230592 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa6b1b66-95ad-45a2-a0ea-ab15dde11143" containerName="extract-content"
Oct 08 20:57:44 crc kubenswrapper[4669]: E1008 20:57:44.230601 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6266c66-d403-46e9-ad99-54beedd6adf4" containerName="extract"
Oct 08 20:57:44 crc kubenswrapper[4669]: I1008 20:57:44.230607 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6266c66-d403-46e9-ad99-54beedd6adf4" containerName="extract"
Oct 08 20:57:44 crc kubenswrapper[4669]: E1008 20:57:44.230615 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d035d7e-48d7-4662-b62e-2ae23d54c8db" containerName="registry-server"
Oct 08 20:57:44 crc kubenswrapper[4669]: I1008 20:57:44.230621 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d035d7e-48d7-4662-b62e-2ae23d54c8db" containerName="registry-server"
Oct 08 20:57:44 crc kubenswrapper[4669]: E1008 20:57:44.230630 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d035d7e-48d7-4662-b62e-2ae23d54c8db" containerName="extract-content"
Oct 08 20:57:44 crc kubenswrapper[4669]: I1008 20:57:44.230635 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d035d7e-48d7-4662-b62e-2ae23d54c8db" containerName="extract-content"
Oct 08 20:57:44 crc kubenswrapper[4669]: I1008 20:57:44.230728 4669 memory_manager.go:354] "RemoveStaleState removing state"
podUID="a6266c66-d403-46e9-ad99-54beedd6adf4" containerName="extract" Oct 08 20:57:44 crc kubenswrapper[4669]: I1008 20:57:44.230745 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa6b1b66-95ad-45a2-a0ea-ab15dde11143" containerName="registry-server" Oct 08 20:57:44 crc kubenswrapper[4669]: I1008 20:57:44.230754 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d035d7e-48d7-4662-b62e-2ae23d54c8db" containerName="registry-server" Oct 08 20:57:44 crc kubenswrapper[4669]: I1008 20:57:44.232258 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cjdtc" Oct 08 20:57:44 crc kubenswrapper[4669]: I1008 20:57:44.255329 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cjdtc"] Oct 08 20:57:44 crc kubenswrapper[4669]: I1008 20:57:44.341288 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71ffae52-8084-47e5-a64d-6fafd917b52e-catalog-content\") pod \"certified-operators-cjdtc\" (UID: \"71ffae52-8084-47e5-a64d-6fafd917b52e\") " pod="openshift-marketplace/certified-operators-cjdtc" Oct 08 20:57:44 crc kubenswrapper[4669]: I1008 20:57:44.341550 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71ffae52-8084-47e5-a64d-6fafd917b52e-utilities\") pod \"certified-operators-cjdtc\" (UID: \"71ffae52-8084-47e5-a64d-6fafd917b52e\") " pod="openshift-marketplace/certified-operators-cjdtc" Oct 08 20:57:44 crc kubenswrapper[4669]: I1008 20:57:44.341716 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8ld5\" (UniqueName: \"kubernetes.io/projected/71ffae52-8084-47e5-a64d-6fafd917b52e-kube-api-access-z8ld5\") pod \"certified-operators-cjdtc\" (UID: 
\"71ffae52-8084-47e5-a64d-6fafd917b52e\") " pod="openshift-marketplace/certified-operators-cjdtc" Oct 08 20:57:44 crc kubenswrapper[4669]: I1008 20:57:44.443448 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71ffae52-8084-47e5-a64d-6fafd917b52e-utilities\") pod \"certified-operators-cjdtc\" (UID: \"71ffae52-8084-47e5-a64d-6fafd917b52e\") " pod="openshift-marketplace/certified-operators-cjdtc" Oct 08 20:57:44 crc kubenswrapper[4669]: I1008 20:57:44.443582 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8ld5\" (UniqueName: \"kubernetes.io/projected/71ffae52-8084-47e5-a64d-6fafd917b52e-kube-api-access-z8ld5\") pod \"certified-operators-cjdtc\" (UID: \"71ffae52-8084-47e5-a64d-6fafd917b52e\") " pod="openshift-marketplace/certified-operators-cjdtc" Oct 08 20:57:44 crc kubenswrapper[4669]: I1008 20:57:44.443658 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71ffae52-8084-47e5-a64d-6fafd917b52e-catalog-content\") pod \"certified-operators-cjdtc\" (UID: \"71ffae52-8084-47e5-a64d-6fafd917b52e\") " pod="openshift-marketplace/certified-operators-cjdtc" Oct 08 20:57:44 crc kubenswrapper[4669]: I1008 20:57:44.445122 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71ffae52-8084-47e5-a64d-6fafd917b52e-utilities\") pod \"certified-operators-cjdtc\" (UID: \"71ffae52-8084-47e5-a64d-6fafd917b52e\") " pod="openshift-marketplace/certified-operators-cjdtc" Oct 08 20:57:44 crc kubenswrapper[4669]: I1008 20:57:44.446274 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71ffae52-8084-47e5-a64d-6fafd917b52e-catalog-content\") pod \"certified-operators-cjdtc\" (UID: \"71ffae52-8084-47e5-a64d-6fafd917b52e\") 
" pod="openshift-marketplace/certified-operators-cjdtc" Oct 08 20:57:44 crc kubenswrapper[4669]: I1008 20:57:44.471868 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8ld5\" (UniqueName: \"kubernetes.io/projected/71ffae52-8084-47e5-a64d-6fafd917b52e-kube-api-access-z8ld5\") pod \"certified-operators-cjdtc\" (UID: \"71ffae52-8084-47e5-a64d-6fafd917b52e\") " pod="openshift-marketplace/certified-operators-cjdtc" Oct 08 20:57:44 crc kubenswrapper[4669]: I1008 20:57:44.552660 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cjdtc" Oct 08 20:57:45 crc kubenswrapper[4669]: I1008 20:57:45.011669 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cjdtc"] Oct 08 20:57:45 crc kubenswrapper[4669]: W1008 20:57:45.017756 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71ffae52_8084_47e5_a64d_6fafd917b52e.slice/crio-2e8bede3259aae9fc676bb67e745f8041c87ddbdf3a91989fcd5dda036473666 WatchSource:0}: Error finding container 2e8bede3259aae9fc676bb67e745f8041c87ddbdf3a91989fcd5dda036473666: Status 404 returned error can't find the container with id 2e8bede3259aae9fc676bb67e745f8041c87ddbdf3a91989fcd5dda036473666 Oct 08 20:57:45 crc kubenswrapper[4669]: I1008 20:57:45.685469 4669 generic.go:334] "Generic (PLEG): container finished" podID="71ffae52-8084-47e5-a64d-6fafd917b52e" containerID="08dbd662da4e04052e119282fdbd313ed380f84f31756a01cc8cb9c65fbeb8ba" exitCode=0 Oct 08 20:57:45 crc kubenswrapper[4669]: I1008 20:57:45.685572 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cjdtc" event={"ID":"71ffae52-8084-47e5-a64d-6fafd917b52e","Type":"ContainerDied","Data":"08dbd662da4e04052e119282fdbd313ed380f84f31756a01cc8cb9c65fbeb8ba"} Oct 08 20:57:45 crc kubenswrapper[4669]: I1008 
20:57:45.685936 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cjdtc" event={"ID":"71ffae52-8084-47e5-a64d-6fafd917b52e","Type":"ContainerStarted","Data":"2e8bede3259aae9fc676bb67e745f8041c87ddbdf3a91989fcd5dda036473666"} Oct 08 20:57:46 crc kubenswrapper[4669]: I1008 20:57:46.693984 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cjdtc" event={"ID":"71ffae52-8084-47e5-a64d-6fafd917b52e","Type":"ContainerStarted","Data":"767d4e5c0f2721391109ef9e4acb16dbda4e3147489b8c77bf1d62dce6457daa"} Oct 08 20:57:47 crc kubenswrapper[4669]: I1008 20:57:47.701661 4669 generic.go:334] "Generic (PLEG): container finished" podID="71ffae52-8084-47e5-a64d-6fafd917b52e" containerID="767d4e5c0f2721391109ef9e4acb16dbda4e3147489b8c77bf1d62dce6457daa" exitCode=0 Oct 08 20:57:47 crc kubenswrapper[4669]: I1008 20:57:47.701733 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cjdtc" event={"ID":"71ffae52-8084-47e5-a64d-6fafd917b52e","Type":"ContainerDied","Data":"767d4e5c0f2721391109ef9e4acb16dbda4e3147489b8c77bf1d62dce6457daa"} Oct 08 20:57:48 crc kubenswrapper[4669]: I1008 20:57:48.710476 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cjdtc" event={"ID":"71ffae52-8084-47e5-a64d-6fafd917b52e","Type":"ContainerStarted","Data":"461a4fe93db67e6e9b921af70b0ea3d407b1428ac14458fcce565d05bebba845"} Oct 08 20:57:48 crc kubenswrapper[4669]: I1008 20:57:48.729656 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cjdtc" podStartSLOduration=2.19921228 podStartE2EDuration="4.729641139s" podCreationTimestamp="2025-10-08 20:57:44 +0000 UTC" firstStartedPulling="2025-10-08 20:57:45.689844612 +0000 UTC m=+785.382655315" lastFinishedPulling="2025-10-08 20:57:48.220273471 +0000 UTC m=+787.913084174" 
observedRunningTime="2025-10-08 20:57:48.72676341 +0000 UTC m=+788.419574083" watchObservedRunningTime="2025-10-08 20:57:48.729641139 +0000 UTC m=+788.422451812" Oct 08 20:57:48 crc kubenswrapper[4669]: I1008 20:57:48.803387 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5bb56b84bf-j6scr"] Oct 08 20:57:48 crc kubenswrapper[4669]: I1008 20:57:48.804474 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-5bb56b84bf-j6scr" Oct 08 20:57:48 crc kubenswrapper[4669]: I1008 20:57:48.807490 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-4csfz" Oct 08 20:57:48 crc kubenswrapper[4669]: I1008 20:57:48.909110 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jttw5\" (UniqueName: \"kubernetes.io/projected/6f26e0f8-20e5-4f7f-8e25-c17579e57b28-kube-api-access-jttw5\") pod \"openstack-operator-controller-operator-5bb56b84bf-j6scr\" (UID: \"6f26e0f8-20e5-4f7f-8e25-c17579e57b28\") " pod="openstack-operators/openstack-operator-controller-operator-5bb56b84bf-j6scr" Oct 08 20:57:48 crc kubenswrapper[4669]: I1008 20:57:48.942339 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5bb56b84bf-j6scr"] Oct 08 20:57:49 crc kubenswrapper[4669]: I1008 20:57:49.010565 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jttw5\" (UniqueName: \"kubernetes.io/projected/6f26e0f8-20e5-4f7f-8e25-c17579e57b28-kube-api-access-jttw5\") pod \"openstack-operator-controller-operator-5bb56b84bf-j6scr\" (UID: \"6f26e0f8-20e5-4f7f-8e25-c17579e57b28\") " pod="openstack-operators/openstack-operator-controller-operator-5bb56b84bf-j6scr" Oct 08 20:57:49 crc kubenswrapper[4669]: I1008 
20:57:49.030261 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jttw5\" (UniqueName: \"kubernetes.io/projected/6f26e0f8-20e5-4f7f-8e25-c17579e57b28-kube-api-access-jttw5\") pod \"openstack-operator-controller-operator-5bb56b84bf-j6scr\" (UID: \"6f26e0f8-20e5-4f7f-8e25-c17579e57b28\") " pod="openstack-operators/openstack-operator-controller-operator-5bb56b84bf-j6scr" Oct 08 20:57:49 crc kubenswrapper[4669]: I1008 20:57:49.119288 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-5bb56b84bf-j6scr" Oct 08 20:57:49 crc kubenswrapper[4669]: I1008 20:57:49.556061 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5bb56b84bf-j6scr"] Oct 08 20:57:49 crc kubenswrapper[4669]: W1008 20:57:49.560227 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f26e0f8_20e5_4f7f_8e25_c17579e57b28.slice/crio-539f585e068c459811a693da984622ada3a42df5ab02ae41919dc3211019297e WatchSource:0}: Error finding container 539f585e068c459811a693da984622ada3a42df5ab02ae41919dc3211019297e: Status 404 returned error can't find the container with id 539f585e068c459811a693da984622ada3a42df5ab02ae41919dc3211019297e Oct 08 20:57:49 crc kubenswrapper[4669]: I1008 20:57:49.716484 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-5bb56b84bf-j6scr" event={"ID":"6f26e0f8-20e5-4f7f-8e25-c17579e57b28","Type":"ContainerStarted","Data":"539f585e068c459811a693da984622ada3a42df5ab02ae41919dc3211019297e"} Oct 08 20:57:51 crc kubenswrapper[4669]: I1008 20:57:51.031803 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-586mf"] Oct 08 20:57:51 crc kubenswrapper[4669]: I1008 20:57:51.040602 4669 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-586mf" Oct 08 20:57:51 crc kubenswrapper[4669]: I1008 20:57:51.052634 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-586mf"] Oct 08 20:57:51 crc kubenswrapper[4669]: I1008 20:57:51.142934 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c3179d3-fd24-4f57-a9c4-377948201a24-catalog-content\") pod \"redhat-marketplace-586mf\" (UID: \"3c3179d3-fd24-4f57-a9c4-377948201a24\") " pod="openshift-marketplace/redhat-marketplace-586mf" Oct 08 20:57:51 crc kubenswrapper[4669]: I1008 20:57:51.142976 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c3179d3-fd24-4f57-a9c4-377948201a24-utilities\") pod \"redhat-marketplace-586mf\" (UID: \"3c3179d3-fd24-4f57-a9c4-377948201a24\") " pod="openshift-marketplace/redhat-marketplace-586mf" Oct 08 20:57:51 crc kubenswrapper[4669]: I1008 20:57:51.142996 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgx4d\" (UniqueName: \"kubernetes.io/projected/3c3179d3-fd24-4f57-a9c4-377948201a24-kube-api-access-fgx4d\") pod \"redhat-marketplace-586mf\" (UID: \"3c3179d3-fd24-4f57-a9c4-377948201a24\") " pod="openshift-marketplace/redhat-marketplace-586mf" Oct 08 20:57:51 crc kubenswrapper[4669]: I1008 20:57:51.243913 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c3179d3-fd24-4f57-a9c4-377948201a24-catalog-content\") pod \"redhat-marketplace-586mf\" (UID: \"3c3179d3-fd24-4f57-a9c4-377948201a24\") " pod="openshift-marketplace/redhat-marketplace-586mf" Oct 08 20:57:51 crc kubenswrapper[4669]: I1008 20:57:51.243950 4669 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c3179d3-fd24-4f57-a9c4-377948201a24-utilities\") pod \"redhat-marketplace-586mf\" (UID: \"3c3179d3-fd24-4f57-a9c4-377948201a24\") " pod="openshift-marketplace/redhat-marketplace-586mf" Oct 08 20:57:51 crc kubenswrapper[4669]: I1008 20:57:51.243977 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgx4d\" (UniqueName: \"kubernetes.io/projected/3c3179d3-fd24-4f57-a9c4-377948201a24-kube-api-access-fgx4d\") pod \"redhat-marketplace-586mf\" (UID: \"3c3179d3-fd24-4f57-a9c4-377948201a24\") " pod="openshift-marketplace/redhat-marketplace-586mf" Oct 08 20:57:51 crc kubenswrapper[4669]: I1008 20:57:51.244680 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c3179d3-fd24-4f57-a9c4-377948201a24-catalog-content\") pod \"redhat-marketplace-586mf\" (UID: \"3c3179d3-fd24-4f57-a9c4-377948201a24\") " pod="openshift-marketplace/redhat-marketplace-586mf" Oct 08 20:57:51 crc kubenswrapper[4669]: I1008 20:57:51.244886 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c3179d3-fd24-4f57-a9c4-377948201a24-utilities\") pod \"redhat-marketplace-586mf\" (UID: \"3c3179d3-fd24-4f57-a9c4-377948201a24\") " pod="openshift-marketplace/redhat-marketplace-586mf" Oct 08 20:57:51 crc kubenswrapper[4669]: I1008 20:57:51.264990 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgx4d\" (UniqueName: \"kubernetes.io/projected/3c3179d3-fd24-4f57-a9c4-377948201a24-kube-api-access-fgx4d\") pod \"redhat-marketplace-586mf\" (UID: \"3c3179d3-fd24-4f57-a9c4-377948201a24\") " pod="openshift-marketplace/redhat-marketplace-586mf" Oct 08 20:57:51 crc kubenswrapper[4669]: I1008 20:57:51.371835 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-586mf" Oct 08 20:57:53 crc kubenswrapper[4669]: I1008 20:57:53.557574 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-586mf"] Oct 08 20:57:53 crc kubenswrapper[4669]: W1008 20:57:53.569041 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c3179d3_fd24_4f57_a9c4_377948201a24.slice/crio-49780c6fdb32670c699299f08f167f4c1e2675287ee7e1b1536d8d5edfddeaee WatchSource:0}: Error finding container 49780c6fdb32670c699299f08f167f4c1e2675287ee7e1b1536d8d5edfddeaee: Status 404 returned error can't find the container with id 49780c6fdb32670c699299f08f167f4c1e2675287ee7e1b1536d8d5edfddeaee Oct 08 20:57:53 crc kubenswrapper[4669]: I1008 20:57:53.745946 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-5bb56b84bf-j6scr" event={"ID":"6f26e0f8-20e5-4f7f-8e25-c17579e57b28","Type":"ContainerStarted","Data":"d640ef8ebbad323556a9bc677d6488014287dc90e2c3bc7ceda623da0eb563da"} Oct 08 20:57:53 crc kubenswrapper[4669]: I1008 20:57:53.748822 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-586mf" event={"ID":"3c3179d3-fd24-4f57-a9c4-377948201a24","Type":"ContainerStarted","Data":"0a69867de897a1c0026e414dd9f4af6fcffd8447847683129fc133a267b7c238"} Oct 08 20:57:53 crc kubenswrapper[4669]: I1008 20:57:53.748849 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-586mf" event={"ID":"3c3179d3-fd24-4f57-a9c4-377948201a24","Type":"ContainerStarted","Data":"49780c6fdb32670c699299f08f167f4c1e2675287ee7e1b1536d8d5edfddeaee"} Oct 08 20:57:54 crc kubenswrapper[4669]: I1008 20:57:54.553139 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cjdtc" Oct 08 20:57:54 crc 
kubenswrapper[4669]: I1008 20:57:54.553233 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cjdtc" Oct 08 20:57:54 crc kubenswrapper[4669]: I1008 20:57:54.597049 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cjdtc" Oct 08 20:57:54 crc kubenswrapper[4669]: I1008 20:57:54.756787 4669 generic.go:334] "Generic (PLEG): container finished" podID="3c3179d3-fd24-4f57-a9c4-377948201a24" containerID="0a69867de897a1c0026e414dd9f4af6fcffd8447847683129fc133a267b7c238" exitCode=0 Oct 08 20:57:54 crc kubenswrapper[4669]: I1008 20:57:54.756857 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-586mf" event={"ID":"3c3179d3-fd24-4f57-a9c4-377948201a24","Type":"ContainerDied","Data":"0a69867de897a1c0026e414dd9f4af6fcffd8447847683129fc133a267b7c238"} Oct 08 20:57:54 crc kubenswrapper[4669]: I1008 20:57:54.821862 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cjdtc" Oct 08 20:57:55 crc kubenswrapper[4669]: I1008 20:57:55.765462 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-5bb56b84bf-j6scr" event={"ID":"6f26e0f8-20e5-4f7f-8e25-c17579e57b28","Type":"ContainerStarted","Data":"fe3937ff970ca561cafc045156c1a50a8a52a8e6dab3d957237307ca01460440"} Oct 08 20:57:55 crc kubenswrapper[4669]: I1008 20:57:55.765571 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-5bb56b84bf-j6scr" Oct 08 20:57:55 crc kubenswrapper[4669]: I1008 20:57:55.809984 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-5bb56b84bf-j6scr" podStartSLOduration=1.997574623 podStartE2EDuration="7.809967975s" 
podCreationTimestamp="2025-10-08 20:57:48 +0000 UTC" firstStartedPulling="2025-10-08 20:57:49.563099464 +0000 UTC m=+789.255910137" lastFinishedPulling="2025-10-08 20:57:55.375492806 +0000 UTC m=+795.068303489" observedRunningTime="2025-10-08 20:57:55.806860839 +0000 UTC m=+795.499671512" watchObservedRunningTime="2025-10-08 20:57:55.809967975 +0000 UTC m=+795.502778638" Oct 08 20:57:56 crc kubenswrapper[4669]: I1008 20:57:56.773642 4669 generic.go:334] "Generic (PLEG): container finished" podID="3c3179d3-fd24-4f57-a9c4-377948201a24" containerID="e1a6c981677a7c82bfd547a803c934819712da540e7d89219d7c14c15f6a52ad" exitCode=0 Oct 08 20:57:56 crc kubenswrapper[4669]: I1008 20:57:56.773724 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-586mf" event={"ID":"3c3179d3-fd24-4f57-a9c4-377948201a24","Type":"ContainerDied","Data":"e1a6c981677a7c82bfd547a803c934819712da540e7d89219d7c14c15f6a52ad"} Oct 08 20:57:57 crc kubenswrapper[4669]: I1008 20:57:57.784966 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-586mf" event={"ID":"3c3179d3-fd24-4f57-a9c4-377948201a24","Type":"ContainerStarted","Data":"16777fb614c7a0a2259c0979444cc593531691b11c70abe8e9db2482fdaccf9c"} Oct 08 20:57:59 crc kubenswrapper[4669]: I1008 20:57:59.123445 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-5bb56b84bf-j6scr" Oct 08 20:57:59 crc kubenswrapper[4669]: I1008 20:57:59.154766 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-586mf" podStartSLOduration=5.738934153 podStartE2EDuration="8.154749883s" podCreationTimestamp="2025-10-08 20:57:51 +0000 UTC" firstStartedPulling="2025-10-08 20:57:54.775739513 +0000 UTC m=+794.468550186" lastFinishedPulling="2025-10-08 20:57:57.191555243 +0000 UTC m=+796.884365916" observedRunningTime="2025-10-08 
20:57:57.806888432 +0000 UTC m=+797.499699115" watchObservedRunningTime="2025-10-08 20:57:59.154749883 +0000 UTC m=+798.847560556" Oct 08 20:58:00 crc kubenswrapper[4669]: I1008 20:58:00.624467 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cjdtc"] Oct 08 20:58:00 crc kubenswrapper[4669]: I1008 20:58:00.624900 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cjdtc" podUID="71ffae52-8084-47e5-a64d-6fafd917b52e" containerName="registry-server" containerID="cri-o://461a4fe93db67e6e9b921af70b0ea3d407b1428ac14458fcce565d05bebba845" gracePeriod=2 Oct 08 20:58:00 crc kubenswrapper[4669]: I1008 20:58:00.818851 4669 generic.go:334] "Generic (PLEG): container finished" podID="71ffae52-8084-47e5-a64d-6fafd917b52e" containerID="461a4fe93db67e6e9b921af70b0ea3d407b1428ac14458fcce565d05bebba845" exitCode=0 Oct 08 20:58:00 crc kubenswrapper[4669]: I1008 20:58:00.818929 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cjdtc" event={"ID":"71ffae52-8084-47e5-a64d-6fafd917b52e","Type":"ContainerDied","Data":"461a4fe93db67e6e9b921af70b0ea3d407b1428ac14458fcce565d05bebba845"} Oct 08 20:58:01 crc kubenswrapper[4669]: I1008 20:58:01.105226 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cjdtc" Oct 08 20:58:01 crc kubenswrapper[4669]: I1008 20:58:01.184803 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8ld5\" (UniqueName: \"kubernetes.io/projected/71ffae52-8084-47e5-a64d-6fafd917b52e-kube-api-access-z8ld5\") pod \"71ffae52-8084-47e5-a64d-6fafd917b52e\" (UID: \"71ffae52-8084-47e5-a64d-6fafd917b52e\") " Oct 08 20:58:01 crc kubenswrapper[4669]: I1008 20:58:01.185097 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71ffae52-8084-47e5-a64d-6fafd917b52e-catalog-content\") pod \"71ffae52-8084-47e5-a64d-6fafd917b52e\" (UID: \"71ffae52-8084-47e5-a64d-6fafd917b52e\") " Oct 08 20:58:01 crc kubenswrapper[4669]: I1008 20:58:01.185211 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71ffae52-8084-47e5-a64d-6fafd917b52e-utilities\") pod \"71ffae52-8084-47e5-a64d-6fafd917b52e\" (UID: \"71ffae52-8084-47e5-a64d-6fafd917b52e\") " Oct 08 20:58:01 crc kubenswrapper[4669]: I1008 20:58:01.185969 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71ffae52-8084-47e5-a64d-6fafd917b52e-utilities" (OuterVolumeSpecName: "utilities") pod "71ffae52-8084-47e5-a64d-6fafd917b52e" (UID: "71ffae52-8084-47e5-a64d-6fafd917b52e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 20:58:01 crc kubenswrapper[4669]: I1008 20:58:01.190278 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71ffae52-8084-47e5-a64d-6fafd917b52e-kube-api-access-z8ld5" (OuterVolumeSpecName: "kube-api-access-z8ld5") pod "71ffae52-8084-47e5-a64d-6fafd917b52e" (UID: "71ffae52-8084-47e5-a64d-6fafd917b52e"). InnerVolumeSpecName "kube-api-access-z8ld5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:58:01 crc kubenswrapper[4669]: I1008 20:58:01.229517 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71ffae52-8084-47e5-a64d-6fafd917b52e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "71ffae52-8084-47e5-a64d-6fafd917b52e" (UID: "71ffae52-8084-47e5-a64d-6fafd917b52e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 20:58:01 crc kubenswrapper[4669]: I1008 20:58:01.287111 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71ffae52-8084-47e5-a64d-6fafd917b52e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 20:58:01 crc kubenswrapper[4669]: I1008 20:58:01.287196 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71ffae52-8084-47e5-a64d-6fafd917b52e-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 20:58:01 crc kubenswrapper[4669]: I1008 20:58:01.287222 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8ld5\" (UniqueName: \"kubernetes.io/projected/71ffae52-8084-47e5-a64d-6fafd917b52e-kube-api-access-z8ld5\") on node \"crc\" DevicePath \"\"" Oct 08 20:58:01 crc kubenswrapper[4669]: I1008 20:58:01.373407 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-586mf" Oct 08 20:58:01 crc kubenswrapper[4669]: I1008 20:58:01.373509 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-586mf" Oct 08 20:58:01 crc kubenswrapper[4669]: I1008 20:58:01.426676 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-586mf" Oct 08 20:58:01 crc kubenswrapper[4669]: I1008 20:58:01.827633 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cjdtc" Oct 08 20:58:01 crc kubenswrapper[4669]: I1008 20:58:01.827645 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cjdtc" event={"ID":"71ffae52-8084-47e5-a64d-6fafd917b52e","Type":"ContainerDied","Data":"2e8bede3259aae9fc676bb67e745f8041c87ddbdf3a91989fcd5dda036473666"} Oct 08 20:58:01 crc kubenswrapper[4669]: I1008 20:58:01.827728 4669 scope.go:117] "RemoveContainer" containerID="461a4fe93db67e6e9b921af70b0ea3d407b1428ac14458fcce565d05bebba845" Oct 08 20:58:01 crc kubenswrapper[4669]: I1008 20:58:01.848909 4669 scope.go:117] "RemoveContainer" containerID="767d4e5c0f2721391109ef9e4acb16dbda4e3147489b8c77bf1d62dce6457daa" Oct 08 20:58:01 crc kubenswrapper[4669]: I1008 20:58:01.851834 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cjdtc"] Oct 08 20:58:01 crc kubenswrapper[4669]: I1008 20:58:01.856838 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cjdtc"] Oct 08 20:58:01 crc kubenswrapper[4669]: I1008 20:58:01.867719 4669 scope.go:117] "RemoveContainer" containerID="08dbd662da4e04052e119282fdbd313ed380f84f31756a01cc8cb9c65fbeb8ba" Oct 08 20:58:03 crc kubenswrapper[4669]: I1008 20:58:03.341825 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71ffae52-8084-47e5-a64d-6fafd917b52e" path="/var/lib/kubelet/pods/71ffae52-8084-47e5-a64d-6fafd917b52e/volumes" Oct 08 20:58:11 crc kubenswrapper[4669]: I1008 20:58:11.463879 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-586mf" Oct 08 20:58:11 crc kubenswrapper[4669]: I1008 20:58:11.539290 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-586mf"] Oct 08 20:58:11 crc kubenswrapper[4669]: I1008 20:58:11.888012 4669 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-586mf" podUID="3c3179d3-fd24-4f57-a9c4-377948201a24" containerName="registry-server" containerID="cri-o://16777fb614c7a0a2259c0979444cc593531691b11c70abe8e9db2482fdaccf9c" gracePeriod=2 Oct 08 20:58:12 crc kubenswrapper[4669]: I1008 20:58:12.799378 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-586mf" Oct 08 20:58:12 crc kubenswrapper[4669]: I1008 20:58:12.895810 4669 generic.go:334] "Generic (PLEG): container finished" podID="3c3179d3-fd24-4f57-a9c4-377948201a24" containerID="16777fb614c7a0a2259c0979444cc593531691b11c70abe8e9db2482fdaccf9c" exitCode=0 Oct 08 20:58:12 crc kubenswrapper[4669]: I1008 20:58:12.895874 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-586mf" Oct 08 20:58:12 crc kubenswrapper[4669]: I1008 20:58:12.895901 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-586mf" event={"ID":"3c3179d3-fd24-4f57-a9c4-377948201a24","Type":"ContainerDied","Data":"16777fb614c7a0a2259c0979444cc593531691b11c70abe8e9db2482fdaccf9c"} Oct 08 20:58:12 crc kubenswrapper[4669]: I1008 20:58:12.896178 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-586mf" event={"ID":"3c3179d3-fd24-4f57-a9c4-377948201a24","Type":"ContainerDied","Data":"49780c6fdb32670c699299f08f167f4c1e2675287ee7e1b1536d8d5edfddeaee"} Oct 08 20:58:12 crc kubenswrapper[4669]: I1008 20:58:12.896202 4669 scope.go:117] "RemoveContainer" containerID="16777fb614c7a0a2259c0979444cc593531691b11c70abe8e9db2482fdaccf9c" Oct 08 20:58:12 crc kubenswrapper[4669]: I1008 20:58:12.912754 4669 scope.go:117] "RemoveContainer" containerID="e1a6c981677a7c82bfd547a803c934819712da540e7d89219d7c14c15f6a52ad" Oct 08 20:58:12 crc kubenswrapper[4669]: I1008 20:58:12.927357 4669 
scope.go:117] "RemoveContainer" containerID="0a69867de897a1c0026e414dd9f4af6fcffd8447847683129fc133a267b7c238" Oct 08 20:58:12 crc kubenswrapper[4669]: I1008 20:58:12.947361 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c3179d3-fd24-4f57-a9c4-377948201a24-utilities\") pod \"3c3179d3-fd24-4f57-a9c4-377948201a24\" (UID: \"3c3179d3-fd24-4f57-a9c4-377948201a24\") " Oct 08 20:58:12 crc kubenswrapper[4669]: I1008 20:58:12.949158 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgx4d\" (UniqueName: \"kubernetes.io/projected/3c3179d3-fd24-4f57-a9c4-377948201a24-kube-api-access-fgx4d\") pod \"3c3179d3-fd24-4f57-a9c4-377948201a24\" (UID: \"3c3179d3-fd24-4f57-a9c4-377948201a24\") " Oct 08 20:58:12 crc kubenswrapper[4669]: I1008 20:58:12.949230 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c3179d3-fd24-4f57-a9c4-377948201a24-catalog-content\") pod \"3c3179d3-fd24-4f57-a9c4-377948201a24\" (UID: \"3c3179d3-fd24-4f57-a9c4-377948201a24\") " Oct 08 20:58:12 crc kubenswrapper[4669]: I1008 20:58:12.948643 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c3179d3-fd24-4f57-a9c4-377948201a24-utilities" (OuterVolumeSpecName: "utilities") pod "3c3179d3-fd24-4f57-a9c4-377948201a24" (UID: "3c3179d3-fd24-4f57-a9c4-377948201a24"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 20:58:12 crc kubenswrapper[4669]: I1008 20:58:12.957040 4669 scope.go:117] "RemoveContainer" containerID="16777fb614c7a0a2259c0979444cc593531691b11c70abe8e9db2482fdaccf9c" Oct 08 20:58:12 crc kubenswrapper[4669]: I1008 20:58:12.962345 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c3179d3-fd24-4f57-a9c4-377948201a24-kube-api-access-fgx4d" (OuterVolumeSpecName: "kube-api-access-fgx4d") pod "3c3179d3-fd24-4f57-a9c4-377948201a24" (UID: "3c3179d3-fd24-4f57-a9c4-377948201a24"). InnerVolumeSpecName "kube-api-access-fgx4d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:58:12 crc kubenswrapper[4669]: I1008 20:58:12.968926 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c3179d3-fd24-4f57-a9c4-377948201a24-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3c3179d3-fd24-4f57-a9c4-377948201a24" (UID: "3c3179d3-fd24-4f57-a9c4-377948201a24"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 20:58:12 crc kubenswrapper[4669]: E1008 20:58:12.975837 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16777fb614c7a0a2259c0979444cc593531691b11c70abe8e9db2482fdaccf9c\": container with ID starting with 16777fb614c7a0a2259c0979444cc593531691b11c70abe8e9db2482fdaccf9c not found: ID does not exist" containerID="16777fb614c7a0a2259c0979444cc593531691b11c70abe8e9db2482fdaccf9c" Oct 08 20:58:12 crc kubenswrapper[4669]: I1008 20:58:12.975944 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16777fb614c7a0a2259c0979444cc593531691b11c70abe8e9db2482fdaccf9c"} err="failed to get container status \"16777fb614c7a0a2259c0979444cc593531691b11c70abe8e9db2482fdaccf9c\": rpc error: code = NotFound desc = could not find container \"16777fb614c7a0a2259c0979444cc593531691b11c70abe8e9db2482fdaccf9c\": container with ID starting with 16777fb614c7a0a2259c0979444cc593531691b11c70abe8e9db2482fdaccf9c not found: ID does not exist" Oct 08 20:58:12 crc kubenswrapper[4669]: I1008 20:58:12.975976 4669 scope.go:117] "RemoveContainer" containerID="e1a6c981677a7c82bfd547a803c934819712da540e7d89219d7c14c15f6a52ad" Oct 08 20:58:12 crc kubenswrapper[4669]: E1008 20:58:12.976705 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1a6c981677a7c82bfd547a803c934819712da540e7d89219d7c14c15f6a52ad\": container with ID starting with e1a6c981677a7c82bfd547a803c934819712da540e7d89219d7c14c15f6a52ad not found: ID does not exist" containerID="e1a6c981677a7c82bfd547a803c934819712da540e7d89219d7c14c15f6a52ad" Oct 08 20:58:12 crc kubenswrapper[4669]: I1008 20:58:12.976743 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1a6c981677a7c82bfd547a803c934819712da540e7d89219d7c14c15f6a52ad"} 
err="failed to get container status \"e1a6c981677a7c82bfd547a803c934819712da540e7d89219d7c14c15f6a52ad\": rpc error: code = NotFound desc = could not find container \"e1a6c981677a7c82bfd547a803c934819712da540e7d89219d7c14c15f6a52ad\": container with ID starting with e1a6c981677a7c82bfd547a803c934819712da540e7d89219d7c14c15f6a52ad not found: ID does not exist" Oct 08 20:58:12 crc kubenswrapper[4669]: I1008 20:58:12.976759 4669 scope.go:117] "RemoveContainer" containerID="0a69867de897a1c0026e414dd9f4af6fcffd8447847683129fc133a267b7c238" Oct 08 20:58:12 crc kubenswrapper[4669]: E1008 20:58:12.977218 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a69867de897a1c0026e414dd9f4af6fcffd8447847683129fc133a267b7c238\": container with ID starting with 0a69867de897a1c0026e414dd9f4af6fcffd8447847683129fc133a267b7c238 not found: ID does not exist" containerID="0a69867de897a1c0026e414dd9f4af6fcffd8447847683129fc133a267b7c238" Oct 08 20:58:12 crc kubenswrapper[4669]: I1008 20:58:12.977260 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a69867de897a1c0026e414dd9f4af6fcffd8447847683129fc133a267b7c238"} err="failed to get container status \"0a69867de897a1c0026e414dd9f4af6fcffd8447847683129fc133a267b7c238\": rpc error: code = NotFound desc = could not find container \"0a69867de897a1c0026e414dd9f4af6fcffd8447847683129fc133a267b7c238\": container with ID starting with 0a69867de897a1c0026e414dd9f4af6fcffd8447847683129fc133a267b7c238 not found: ID does not exist" Oct 08 20:58:13 crc kubenswrapper[4669]: I1008 20:58:13.051512 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c3179d3-fd24-4f57-a9c4-377948201a24-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 20:58:13 crc kubenswrapper[4669]: I1008 20:58:13.051563 4669 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-fgx4d\" (UniqueName: \"kubernetes.io/projected/3c3179d3-fd24-4f57-a9c4-377948201a24-kube-api-access-fgx4d\") on node \"crc\" DevicePath \"\"" Oct 08 20:58:13 crc kubenswrapper[4669]: I1008 20:58:13.051574 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c3179d3-fd24-4f57-a9c4-377948201a24-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 20:58:13 crc kubenswrapper[4669]: I1008 20:58:13.221894 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-586mf"] Oct 08 20:58:13 crc kubenswrapper[4669]: I1008 20:58:13.225144 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-586mf"] Oct 08 20:58:13 crc kubenswrapper[4669]: I1008 20:58:13.337417 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c3179d3-fd24-4f57-a9c4-377948201a24" path="/var/lib/kubelet/pods/3c3179d3-fd24-4f57-a9c4-377948201a24/volumes" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.360347 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-64f84fcdbb-rtmfb"] Oct 08 20:58:15 crc kubenswrapper[4669]: E1008 20:58:15.362126 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71ffae52-8084-47e5-a64d-6fafd917b52e" containerName="extract-content" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.362230 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="71ffae52-8084-47e5-a64d-6fafd917b52e" containerName="extract-content" Oct 08 20:58:15 crc kubenswrapper[4669]: E1008 20:58:15.362313 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71ffae52-8084-47e5-a64d-6fafd917b52e" containerName="extract-utilities" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.362385 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="71ffae52-8084-47e5-a64d-6fafd917b52e" 
containerName="extract-utilities" Oct 08 20:58:15 crc kubenswrapper[4669]: E1008 20:58:15.362461 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c3179d3-fd24-4f57-a9c4-377948201a24" containerName="extract-utilities" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.362561 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c3179d3-fd24-4f57-a9c4-377948201a24" containerName="extract-utilities" Oct 08 20:58:15 crc kubenswrapper[4669]: E1008 20:58:15.362659 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c3179d3-fd24-4f57-a9c4-377948201a24" containerName="registry-server" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.362739 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c3179d3-fd24-4f57-a9c4-377948201a24" containerName="registry-server" Oct 08 20:58:15 crc kubenswrapper[4669]: E1008 20:58:15.362813 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71ffae52-8084-47e5-a64d-6fafd917b52e" containerName="registry-server" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.362882 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="71ffae52-8084-47e5-a64d-6fafd917b52e" containerName="registry-server" Oct 08 20:58:15 crc kubenswrapper[4669]: E1008 20:58:15.362971 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c3179d3-fd24-4f57-a9c4-377948201a24" containerName="extract-content" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.363052 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c3179d3-fd24-4f57-a9c4-377948201a24" containerName="extract-content" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.363357 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="71ffae52-8084-47e5-a64d-6fafd917b52e" containerName="registry-server" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.363460 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c3179d3-fd24-4f57-a9c4-377948201a24" 
containerName="registry-server" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.364271 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-rtmfb" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.370058 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-5q9vs" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.373695 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-64f84fcdbb-rtmfb"] Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.379846 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-59cdc64769-6sdlr"] Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.381012 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-6sdlr" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.389237 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-2hcvp" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.400159 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-687df44cdb-m9v76"] Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.401109 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-m9v76" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.403680 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-f5rjw" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.411645 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-7bb46cd7d-pjtc2"] Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.412423 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-pjtc2" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.415719 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-hpdbp" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.423420 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-59cdc64769-6sdlr"] Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.436238 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7bb46cd7d-pjtc2"] Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.453086 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-6d9967f8dd-w2mfj"] Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.459385 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d74794d9b-mpbbj"] Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.459629 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-w2mfj" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.460699 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-mpbbj" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.465304 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-rgt5l" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.465762 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-vf6t8" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.488734 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfb48\" (UniqueName: \"kubernetes.io/projected/b7c50f2e-fa5d-4b03-be3a-cfd5ecc63b45-kube-api-access-hfb48\") pod \"cinder-operator-controller-manager-59cdc64769-6sdlr\" (UID: \"b7c50f2e-fa5d-4b03-be3a-cfd5ecc63b45\") " pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-6sdlr" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.488890 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6msh\" (UniqueName: \"kubernetes.io/projected/5317b6da-a1f3-4a2c-85e4-3ef8fcb018d5-kube-api-access-x6msh\") pod \"barbican-operator-controller-manager-64f84fcdbb-rtmfb\" (UID: \"5317b6da-a1f3-4a2c-85e4-3ef8fcb018d5\") " pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-rtmfb" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.489052 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n96p5\" (UniqueName: \"kubernetes.io/projected/5f0a0ada-acd3-452b-bd50-b5d634b906c4-kube-api-access-n96p5\") pod 
\"designate-operator-controller-manager-687df44cdb-m9v76\" (UID: \"5f0a0ada-acd3-452b-bd50-b5d634b906c4\") " pod="openstack-operators/designate-operator-controller-manager-687df44cdb-m9v76" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.489377 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-687df44cdb-m9v76"] Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.489313 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svjhn\" (UniqueName: \"kubernetes.io/projected/dc0d4c88-6c32-4498-8025-de3c8b59eaea-kube-api-access-svjhn\") pod \"glance-operator-controller-manager-7bb46cd7d-pjtc2\" (UID: \"dc0d4c88-6c32-4498-8025-de3c8b59eaea\") " pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-pjtc2" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.516660 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d74794d9b-mpbbj"] Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.562140 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-585fc5b659-c2w55"] Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.564288 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-c2w55" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.574408 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-tpnzd" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.574564 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.592647 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfb48\" (UniqueName: \"kubernetes.io/projected/b7c50f2e-fa5d-4b03-be3a-cfd5ecc63b45-kube-api-access-hfb48\") pod \"cinder-operator-controller-manager-59cdc64769-6sdlr\" (UID: \"b7c50f2e-fa5d-4b03-be3a-cfd5ecc63b45\") " pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-6sdlr" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.592690 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rj4hd\" (UniqueName: \"kubernetes.io/projected/0a7f72b2-7ef9-492e-ada0-a7e64d8fcb7a-kube-api-access-rj4hd\") pod \"horizon-operator-controller-manager-6d74794d9b-mpbbj\" (UID: \"0a7f72b2-7ef9-492e-ada0-a7e64d8fcb7a\") " pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-mpbbj" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.592712 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6msh\" (UniqueName: \"kubernetes.io/projected/5317b6da-a1f3-4a2c-85e4-3ef8fcb018d5-kube-api-access-x6msh\") pod \"barbican-operator-controller-manager-64f84fcdbb-rtmfb\" (UID: \"5317b6da-a1f3-4a2c-85e4-3ef8fcb018d5\") " pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-rtmfb" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.592743 4669 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gddzp\" (UniqueName: \"kubernetes.io/projected/7185d03c-648b-488d-b1d0-842f8b72e0ff-kube-api-access-gddzp\") pod \"heat-operator-controller-manager-6d9967f8dd-w2mfj\" (UID: \"7185d03c-648b-488d-b1d0-842f8b72e0ff\") " pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-w2mfj" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.592765 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n96p5\" (UniqueName: \"kubernetes.io/projected/5f0a0ada-acd3-452b-bd50-b5d634b906c4-kube-api-access-n96p5\") pod \"designate-operator-controller-manager-687df44cdb-m9v76\" (UID: \"5f0a0ada-acd3-452b-bd50-b5d634b906c4\") " pod="openstack-operators/designate-operator-controller-manager-687df44cdb-m9v76" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.592796 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svjhn\" (UniqueName: \"kubernetes.io/projected/dc0d4c88-6c32-4498-8025-de3c8b59eaea-kube-api-access-svjhn\") pod \"glance-operator-controller-manager-7bb46cd7d-pjtc2\" (UID: \"dc0d4c88-6c32-4498-8025-de3c8b59eaea\") " pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-pjtc2" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.611736 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-6d9967f8dd-w2mfj"] Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.628879 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6msh\" (UniqueName: \"kubernetes.io/projected/5317b6da-a1f3-4a2c-85e4-3ef8fcb018d5-kube-api-access-x6msh\") pod \"barbican-operator-controller-manager-64f84fcdbb-rtmfb\" (UID: \"5317b6da-a1f3-4a2c-85e4-3ef8fcb018d5\") " pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-rtmfb" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 
20:58:15.629420 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-585fc5b659-c2w55"] Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.630145 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svjhn\" (UniqueName: \"kubernetes.io/projected/dc0d4c88-6c32-4498-8025-de3c8b59eaea-kube-api-access-svjhn\") pod \"glance-operator-controller-manager-7bb46cd7d-pjtc2\" (UID: \"dc0d4c88-6c32-4498-8025-de3c8b59eaea\") " pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-pjtc2" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.633145 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfb48\" (UniqueName: \"kubernetes.io/projected/b7c50f2e-fa5d-4b03-be3a-cfd5ecc63b45-kube-api-access-hfb48\") pod \"cinder-operator-controller-manager-59cdc64769-6sdlr\" (UID: \"b7c50f2e-fa5d-4b03-be3a-cfd5ecc63b45\") " pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-6sdlr" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.649613 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-74cb5cbc49-hwmh2"] Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.650647 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-hwmh2" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.655618 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-ddb98f99b-6mn5l"] Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.656756 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-6mn5l" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.661039 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-qbg2d" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.671037 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-ddb98f99b-6mn5l"] Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.684794 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-rtmfb" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.687720 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-879fk" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.694067 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gddzp\" (UniqueName: \"kubernetes.io/projected/7185d03c-648b-488d-b1d0-842f8b72e0ff-kube-api-access-gddzp\") pod \"heat-operator-controller-manager-6d9967f8dd-w2mfj\" (UID: \"7185d03c-648b-488d-b1d0-842f8b72e0ff\") " pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-w2mfj" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.694152 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/01d6ae8d-9f65-4a30-80fb-135e4eba5a10-cert\") pod \"infra-operator-controller-manager-585fc5b659-c2w55\" (UID: \"01d6ae8d-9f65-4a30-80fb-135e4eba5a10\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-c2w55" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.694187 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-whqdf\" (UniqueName: \"kubernetes.io/projected/01d6ae8d-9f65-4a30-80fb-135e4eba5a10-kube-api-access-whqdf\") pod \"infra-operator-controller-manager-585fc5b659-c2w55\" (UID: \"01d6ae8d-9f65-4a30-80fb-135e4eba5a10\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-c2w55" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.694211 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rj4hd\" (UniqueName: \"kubernetes.io/projected/0a7f72b2-7ef9-492e-ada0-a7e64d8fcb7a-kube-api-access-rj4hd\") pod \"horizon-operator-controller-manager-6d74794d9b-mpbbj\" (UID: \"0a7f72b2-7ef9-492e-ada0-a7e64d8fcb7a\") " pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-mpbbj" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.697596 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-74cb5cbc49-hwmh2"] Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.703188 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n96p5\" (UniqueName: \"kubernetes.io/projected/5f0a0ada-acd3-452b-bd50-b5d634b906c4-kube-api-access-n96p5\") pod \"designate-operator-controller-manager-687df44cdb-m9v76\" (UID: \"5f0a0ada-acd3-452b-bd50-b5d634b906c4\") " pod="openstack-operators/designate-operator-controller-manager-687df44cdb-m9v76" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.703846 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-6sdlr" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.709053 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-59578bc799-frtrx"] Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.718956 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-59578bc799-frtrx" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.722122 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5777b4f897-5rs85"] Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.723112 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-5rs85" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.723659 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-m9v76" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.729892 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-nrk2w" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.730267 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-8p8fj" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.748585 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-pjtc2" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.749106 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gddzp\" (UniqueName: \"kubernetes.io/projected/7185d03c-648b-488d-b1d0-842f8b72e0ff-kube-api-access-gddzp\") pod \"heat-operator-controller-manager-6d9967f8dd-w2mfj\" (UID: \"7185d03c-648b-488d-b1d0-842f8b72e0ff\") " pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-w2mfj" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.757203 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rj4hd\" (UniqueName: \"kubernetes.io/projected/0a7f72b2-7ef9-492e-ada0-a7e64d8fcb7a-kube-api-access-rj4hd\") pod \"horizon-operator-controller-manager-6d74794d9b-mpbbj\" (UID: \"0a7f72b2-7ef9-492e-ada0-a7e64d8fcb7a\") " pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-mpbbj" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.787785 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-59578bc799-frtrx"] Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.794830 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxzzp\" (UniqueName: \"kubernetes.io/projected/f92c3530-f73f-45fe-84f5-bea451e1aaba-kube-api-access-hxzzp\") pod \"ironic-operator-controller-manager-74cb5cbc49-hwmh2\" (UID: \"f92c3530-f73f-45fe-84f5-bea451e1aaba\") " pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-hwmh2" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.794873 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whqdf\" (UniqueName: \"kubernetes.io/projected/01d6ae8d-9f65-4a30-80fb-135e4eba5a10-kube-api-access-whqdf\") pod 
\"infra-operator-controller-manager-585fc5b659-c2w55\" (UID: \"01d6ae8d-9f65-4a30-80fb-135e4eba5a10\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-c2w55" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.794924 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmbcz\" (UniqueName: \"kubernetes.io/projected/1af3f5fc-da55-4c9e-ac87-30363f6cf741-kube-api-access-xmbcz\") pod \"mariadb-operator-controller-manager-5777b4f897-5rs85\" (UID: \"1af3f5fc-da55-4c9e-ac87-30363f6cf741\") " pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-5rs85" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.795014 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/01d6ae8d-9f65-4a30-80fb-135e4eba5a10-cert\") pod \"infra-operator-controller-manager-585fc5b659-c2w55\" (UID: \"01d6ae8d-9f65-4a30-80fb-135e4eba5a10\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-c2w55" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.795046 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfqmx\" (UniqueName: \"kubernetes.io/projected/4ee291fa-b998-4bfc-a689-fb66e345bcaa-kube-api-access-rfqmx\") pod \"manila-operator-controller-manager-59578bc799-frtrx\" (UID: \"4ee291fa-b998-4bfc-a689-fb66e345bcaa\") " pod="openstack-operators/manila-operator-controller-manager-59578bc799-frtrx" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.795071 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7w98f\" (UniqueName: \"kubernetes.io/projected/b43bc083-637e-4c93-a024-a47cecaade29-kube-api-access-7w98f\") pod \"keystone-operator-controller-manager-ddb98f99b-6mn5l\" (UID: \"b43bc083-637e-4c93-a024-a47cecaade29\") " 
pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-6mn5l" Oct 08 20:58:15 crc kubenswrapper[4669]: E1008 20:58:15.795273 4669 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 08 20:58:15 crc kubenswrapper[4669]: E1008 20:58:15.795322 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01d6ae8d-9f65-4a30-80fb-135e4eba5a10-cert podName:01d6ae8d-9f65-4a30-80fb-135e4eba5a10 nodeName:}" failed. No retries permitted until 2025-10-08 20:58:16.295303767 +0000 UTC m=+815.988114440 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/01d6ae8d-9f65-4a30-80fb-135e4eba5a10-cert") pod "infra-operator-controller-manager-585fc5b659-c2w55" (UID: "01d6ae8d-9f65-4a30-80fb-135e4eba5a10") : secret "infra-operator-webhook-server-cert" not found Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.802576 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5777b4f897-5rs85"] Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.810271 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-w2mfj" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.824689 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-57bb74c7bf-pb4g2"] Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.825544 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-797d478b46-vl7tc"] Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.826186 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-vl7tc" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.826519 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-pb4g2" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.828161 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-rgkcv" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.828840 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-8xjp8" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.832359 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whqdf\" (UniqueName: \"kubernetes.io/projected/01d6ae8d-9f65-4a30-80fb-135e4eba5a10-kube-api-access-whqdf\") pod \"infra-operator-controller-manager-585fc5b659-c2w55\" (UID: \"01d6ae8d-9f65-4a30-80fb-135e4eba5a10\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-c2w55" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.840732 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-mpbbj" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.858992 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-57bb74c7bf-pb4g2"] Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.868769 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-797d478b46-vl7tc"] Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.897391 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfqmx\" (UniqueName: \"kubernetes.io/projected/4ee291fa-b998-4bfc-a689-fb66e345bcaa-kube-api-access-rfqmx\") pod \"manila-operator-controller-manager-59578bc799-frtrx\" (UID: \"4ee291fa-b998-4bfc-a689-fb66e345bcaa\") " pod="openstack-operators/manila-operator-controller-manager-59578bc799-frtrx" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.897430 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7w98f\" (UniqueName: \"kubernetes.io/projected/b43bc083-637e-4c93-a024-a47cecaade29-kube-api-access-7w98f\") pod \"keystone-operator-controller-manager-ddb98f99b-6mn5l\" (UID: \"b43bc083-637e-4c93-a024-a47cecaade29\") " pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-6mn5l" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.897451 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxzzp\" (UniqueName: \"kubernetes.io/projected/f92c3530-f73f-45fe-84f5-bea451e1aaba-kube-api-access-hxzzp\") pod \"ironic-operator-controller-manager-74cb5cbc49-hwmh2\" (UID: \"f92c3530-f73f-45fe-84f5-bea451e1aaba\") " pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-hwmh2" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.897485 4669 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g48zr\" (UniqueName: \"kubernetes.io/projected/5fceefe8-bf0f-4f2d-9e14-2208f38b73d7-kube-api-access-g48zr\") pod \"nova-operator-controller-manager-57bb74c7bf-pb4g2\" (UID: \"5fceefe8-bf0f-4f2d-9e14-2208f38b73d7\") " pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-pb4g2" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.897507 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnxvj\" (UniqueName: \"kubernetes.io/projected/a2b9cab5-b86d-4d9b-aa7c-d035d04d40a4-kube-api-access-mnxvj\") pod \"neutron-operator-controller-manager-797d478b46-vl7tc\" (UID: \"a2b9cab5-b86d-4d9b-aa7c-d035d04d40a4\") " pod="openstack-operators/neutron-operator-controller-manager-797d478b46-vl7tc" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.897545 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmbcz\" (UniqueName: \"kubernetes.io/projected/1af3f5fc-da55-4c9e-ac87-30363f6cf741-kube-api-access-xmbcz\") pod \"mariadb-operator-controller-manager-5777b4f897-5rs85\" (UID: \"1af3f5fc-da55-4c9e-ac87-30363f6cf741\") " pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-5rs85" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.898288 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-9vh5w"] Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.900803 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-9vh5w" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.915685 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-7z7d6" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.929307 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-9vh5w"] Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.943458 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-6f96f8c84-gc5qn"] Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.944581 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-gc5qn" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.945861 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfqmx\" (UniqueName: \"kubernetes.io/projected/4ee291fa-b998-4bfc-a689-fb66e345bcaa-kube-api-access-rfqmx\") pod \"manila-operator-controller-manager-59578bc799-frtrx\" (UID: \"4ee291fa-b998-4bfc-a689-fb66e345bcaa\") " pod="openstack-operators/manila-operator-controller-manager-59578bc799-frtrx" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.954020 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-vq4tp" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.967172 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7w98f\" (UniqueName: \"kubernetes.io/projected/b43bc083-637e-4c93-a024-a47cecaade29-kube-api-access-7w98f\") pod \"keystone-operator-controller-manager-ddb98f99b-6mn5l\" (UID: \"b43bc083-637e-4c93-a024-a47cecaade29\") " 
pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-6mn5l" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.967271 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxzzp\" (UniqueName: \"kubernetes.io/projected/f92c3530-f73f-45fe-84f5-bea451e1aaba-kube-api-access-hxzzp\") pod \"ironic-operator-controller-manager-74cb5cbc49-hwmh2\" (UID: \"f92c3530-f73f-45fe-84f5-bea451e1aaba\") " pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-hwmh2" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.971090 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757ddwpl8"] Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.972415 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757ddwpl8" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.983009 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-ls8sj" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.985801 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Oct 08 20:58:15 crc kubenswrapper[4669]: I1008 20:58:15.992435 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-hwmh2" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.005763 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7g5f\" (UniqueName: \"kubernetes.io/projected/d4ee300d-b78b-4052-83a9-4ab8ca569886-kube-api-access-g7g5f\") pod \"octavia-operator-controller-manager-6d7c7ddf95-9vh5w\" (UID: \"d4ee300d-b78b-4052-83a9-4ab8ca569886\") " pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-9vh5w" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.005813 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g48zr\" (UniqueName: \"kubernetes.io/projected/5fceefe8-bf0f-4f2d-9e14-2208f38b73d7-kube-api-access-g48zr\") pod \"nova-operator-controller-manager-57bb74c7bf-pb4g2\" (UID: \"5fceefe8-bf0f-4f2d-9e14-2208f38b73d7\") " pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-pb4g2" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.005845 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnxvj\" (UniqueName: \"kubernetes.io/projected/a2b9cab5-b86d-4d9b-aa7c-d035d04d40a4-kube-api-access-mnxvj\") pod \"neutron-operator-controller-manager-797d478b46-vl7tc\" (UID: \"a2b9cab5-b86d-4d9b-aa7c-d035d04d40a4\") " pod="openstack-operators/neutron-operator-controller-manager-797d478b46-vl7tc" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.006057 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctpwx\" (UniqueName: \"kubernetes.io/projected/92d7bcb6-b09e-4605-87a9-9cdaedb40c74-kube-api-access-ctpwx\") pod \"ovn-operator-controller-manager-6f96f8c84-gc5qn\" (UID: \"92d7bcb6-b09e-4605-87a9-9cdaedb40c74\") " pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-gc5qn" Oct 08 20:58:16 crc 
kubenswrapper[4669]: I1008 20:58:16.026145 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-6f96f8c84-gc5qn"] Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.028255 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmbcz\" (UniqueName: \"kubernetes.io/projected/1af3f5fc-da55-4c9e-ac87-30363f6cf741-kube-api-access-xmbcz\") pod \"mariadb-operator-controller-manager-5777b4f897-5rs85\" (UID: \"1af3f5fc-da55-4c9e-ac87-30363f6cf741\") " pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-5rs85" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.033683 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnxvj\" (UniqueName: \"kubernetes.io/projected/a2b9cab5-b86d-4d9b-aa7c-d035d04d40a4-kube-api-access-mnxvj\") pod \"neutron-operator-controller-manager-797d478b46-vl7tc\" (UID: \"a2b9cab5-b86d-4d9b-aa7c-d035d04d40a4\") " pod="openstack-operators/neutron-operator-controller-manager-797d478b46-vl7tc" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.034541 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-m6ldz"] Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.035543 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-m6ldz" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.038910 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-gzft6" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.053956 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g48zr\" (UniqueName: \"kubernetes.io/projected/5fceefe8-bf0f-4f2d-9e14-2208f38b73d7-kube-api-access-g48zr\") pod \"nova-operator-controller-manager-57bb74c7bf-pb4g2\" (UID: \"5fceefe8-bf0f-4f2d-9e14-2208f38b73d7\") " pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-pb4g2" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.063759 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-6mn5l" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.064140 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-664664cb68-bjfjf"] Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.065251 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-664664cb68-bjfjf" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.072488 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-4gr2j" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.107248 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lxgr\" (UniqueName: \"kubernetes.io/projected/d4193662-5c4e-40fa-ac9e-495509e75c4a-kube-api-access-4lxgr\") pod \"placement-operator-controller-manager-664664cb68-bjfjf\" (UID: \"d4193662-5c4e-40fa-ac9e-495509e75c4a\") " pod="openstack-operators/placement-operator-controller-manager-664664cb68-bjfjf" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.107366 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d44ffb7b-c761-48b2-be7c-a5af13e2a59b-cert\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757ddwpl8\" (UID: \"d44ffb7b-c761-48b2-be7c-a5af13e2a59b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757ddwpl8" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.107397 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctpwx\" (UniqueName: \"kubernetes.io/projected/92d7bcb6-b09e-4605-87a9-9cdaedb40c74-kube-api-access-ctpwx\") pod \"ovn-operator-controller-manager-6f96f8c84-gc5qn\" (UID: \"92d7bcb6-b09e-4605-87a9-9cdaedb40c74\") " pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-gc5qn" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.107431 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8m6bc\" (UniqueName: 
\"kubernetes.io/projected/d44ffb7b-c761-48b2-be7c-a5af13e2a59b-kube-api-access-8m6bc\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757ddwpl8\" (UID: \"d44ffb7b-c761-48b2-be7c-a5af13e2a59b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757ddwpl8" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.107461 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7g5f\" (UniqueName: \"kubernetes.io/projected/d4ee300d-b78b-4052-83a9-4ab8ca569886-kube-api-access-g7g5f\") pod \"octavia-operator-controller-manager-6d7c7ddf95-9vh5w\" (UID: \"d4ee300d-b78b-4052-83a9-4ab8ca569886\") " pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-9vh5w" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.107495 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzbcd\" (UniqueName: \"kubernetes.io/projected/46a0519d-251f-48e0-9d65-4ca08c627195-kube-api-access-kzbcd\") pod \"swift-operator-controller-manager-5f4d5dfdc6-m6ldz\" (UID: \"46a0519d-251f-48e0-9d65-4ca08c627195\") " pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-m6ldz" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.113457 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-59578bc799-frtrx" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.119562 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-775776c574-h2f5k"] Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.131144 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-5rs85" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.151672 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7g5f\" (UniqueName: \"kubernetes.io/projected/d4ee300d-b78b-4052-83a9-4ab8ca569886-kube-api-access-g7g5f\") pod \"octavia-operator-controller-manager-6d7c7ddf95-9vh5w\" (UID: \"d4ee300d-b78b-4052-83a9-4ab8ca569886\") " pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-9vh5w" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.160388 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-775776c574-h2f5k" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.184891 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-krjxd" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.186809 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctpwx\" (UniqueName: \"kubernetes.io/projected/92d7bcb6-b09e-4605-87a9-9cdaedb40c74-kube-api-access-ctpwx\") pod \"ovn-operator-controller-manager-6f96f8c84-gc5qn\" (UID: \"92d7bcb6-b09e-4605-87a9-9cdaedb40c74\") " pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-gc5qn" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.196779 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-vl7tc" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.201888 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-664664cb68-bjfjf"] Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.202211 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-pb4g2" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.205826 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-775776c574-h2f5k"] Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.213423 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9f5m\" (UniqueName: \"kubernetes.io/projected/58991c0e-b29b-4851-b5b0-8327380e1320-kube-api-access-p9f5m\") pod \"telemetry-operator-controller-manager-775776c574-h2f5k\" (UID: \"58991c0e-b29b-4851-b5b0-8327380e1320\") " pod="openstack-operators/telemetry-operator-controller-manager-775776c574-h2f5k" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.213490 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d44ffb7b-c761-48b2-be7c-a5af13e2a59b-cert\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757ddwpl8\" (UID: \"d44ffb7b-c761-48b2-be7c-a5af13e2a59b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757ddwpl8" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.213522 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8m6bc\" (UniqueName: \"kubernetes.io/projected/d44ffb7b-c761-48b2-be7c-a5af13e2a59b-kube-api-access-8m6bc\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757ddwpl8\" (UID: \"d44ffb7b-c761-48b2-be7c-a5af13e2a59b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757ddwpl8" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.213570 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzbcd\" (UniqueName: \"kubernetes.io/projected/46a0519d-251f-48e0-9d65-4ca08c627195-kube-api-access-kzbcd\") pod 
\"swift-operator-controller-manager-5f4d5dfdc6-m6ldz\" (UID: \"46a0519d-251f-48e0-9d65-4ca08c627195\") " pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-m6ldz" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.213598 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lxgr\" (UniqueName: \"kubernetes.io/projected/d4193662-5c4e-40fa-ac9e-495509e75c4a-kube-api-access-4lxgr\") pod \"placement-operator-controller-manager-664664cb68-bjfjf\" (UID: \"d4193662-5c4e-40fa-ac9e-495509e75c4a\") " pod="openstack-operators/placement-operator-controller-manager-664664cb68-bjfjf" Oct 08 20:58:16 crc kubenswrapper[4669]: E1008 20:58:16.213962 4669 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 08 20:58:16 crc kubenswrapper[4669]: E1008 20:58:16.214005 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d44ffb7b-c761-48b2-be7c-a5af13e2a59b-cert podName:d44ffb7b-c761-48b2-be7c-a5af13e2a59b nodeName:}" failed. No retries permitted until 2025-10-08 20:58:16.713992322 +0000 UTC m=+816.406802995 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d44ffb7b-c761-48b2-be7c-a5af13e2a59b-cert") pod "openstack-baremetal-operator-controller-manager-6cc7fb757ddwpl8" (UID: "d44ffb7b-c761-48b2-be7c-a5af13e2a59b") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.223374 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-m6ldz"] Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.232237 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-74665f6cdc-tw98s"] Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.232998 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lxgr\" (UniqueName: \"kubernetes.io/projected/d4193662-5c4e-40fa-ac9e-495509e75c4a-kube-api-access-4lxgr\") pod \"placement-operator-controller-manager-664664cb68-bjfjf\" (UID: \"d4193662-5c4e-40fa-ac9e-495509e75c4a\") " pod="openstack-operators/placement-operator-controller-manager-664664cb68-bjfjf" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.234413 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-74665f6cdc-tw98s" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.235027 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzbcd\" (UniqueName: \"kubernetes.io/projected/46a0519d-251f-48e0-9d65-4ca08c627195-kube-api-access-kzbcd\") pod \"swift-operator-controller-manager-5f4d5dfdc6-m6ldz\" (UID: \"46a0519d-251f-48e0-9d65-4ca08c627195\") " pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-m6ldz" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.237680 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-l4pck" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.242235 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8m6bc\" (UniqueName: \"kubernetes.io/projected/d44ffb7b-c761-48b2-be7c-a5af13e2a59b-kube-api-access-8m6bc\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757ddwpl8\" (UID: \"d44ffb7b-c761-48b2-be7c-a5af13e2a59b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757ddwpl8" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.247397 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-9vh5w" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.254204 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757ddwpl8"] Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.265360 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-74665f6cdc-tw98s"] Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.284098 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-gc5qn" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.287667 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5dd4499c96-qfbh5"] Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.288885 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5dd4499c96-qfbh5" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.292263 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-4q68r" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.307251 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5dd4499c96-qfbh5"] Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.314079 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w95kx\" (UniqueName: \"kubernetes.io/projected/19754c8c-8fbf-4b8e-b673-462e22ec11d1-kube-api-access-w95kx\") pod \"test-operator-controller-manager-74665f6cdc-tw98s\" (UID: \"19754c8c-8fbf-4b8e-b673-462e22ec11d1\") " pod="openstack-operators/test-operator-controller-manager-74665f6cdc-tw98s" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.314200 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjg9m\" (UniqueName: \"kubernetes.io/projected/76773a34-db21-4354-a16e-e70ea0d6d63d-kube-api-access-wjg9m\") pod \"watcher-operator-controller-manager-5dd4499c96-qfbh5\" (UID: \"76773a34-db21-4354-a16e-e70ea0d6d63d\") " pod="openstack-operators/watcher-operator-controller-manager-5dd4499c96-qfbh5" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.314285 4669 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-p9f5m\" (UniqueName: \"kubernetes.io/projected/58991c0e-b29b-4851-b5b0-8327380e1320-kube-api-access-p9f5m\") pod \"telemetry-operator-controller-manager-775776c574-h2f5k\" (UID: \"58991c0e-b29b-4851-b5b0-8327380e1320\") " pod="openstack-operators/telemetry-operator-controller-manager-775776c574-h2f5k" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.314376 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/01d6ae8d-9f65-4a30-80fb-135e4eba5a10-cert\") pod \"infra-operator-controller-manager-585fc5b659-c2w55\" (UID: \"01d6ae8d-9f65-4a30-80fb-135e4eba5a10\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-c2w55" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.321244 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/01d6ae8d-9f65-4a30-80fb-135e4eba5a10-cert\") pod \"infra-operator-controller-manager-585fc5b659-c2w55\" (UID: \"01d6ae8d-9f65-4a30-80fb-135e4eba5a10\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-c2w55" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.353487 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6dd9d44468-66k2c"] Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.354682 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6dd9d44468-66k2c" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.357225 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.358063 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9f5m\" (UniqueName: \"kubernetes.io/projected/58991c0e-b29b-4851-b5b0-8327380e1320-kube-api-access-p9f5m\") pod \"telemetry-operator-controller-manager-775776c574-h2f5k\" (UID: \"58991c0e-b29b-4851-b5b0-8327380e1320\") " pod="openstack-operators/telemetry-operator-controller-manager-775776c574-h2f5k" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.361610 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6dd9d44468-66k2c"] Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.363454 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-lgbwp" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.373832 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-h7k72"] Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.378287 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-h7k72" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.380361 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-lx75l" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.384574 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-h7k72"] Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.386070 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-m6ldz" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.401149 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-664664cb68-bjfjf" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.415381 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htsvs\" (UniqueName: \"kubernetes.io/projected/3dce5b40-fa36-4a03-bea2-a19e6267ecec-kube-api-access-htsvs\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-h7k72\" (UID: \"3dce5b40-fa36-4a03-bea2-a19e6267ecec\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-h7k72" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.415423 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eb2409f1-4af4-49a3-a453-29e8f447360e-cert\") pod \"openstack-operator-controller-manager-6dd9d44468-66k2c\" (UID: \"eb2409f1-4af4-49a3-a453-29e8f447360e\") " pod="openstack-operators/openstack-operator-controller-manager-6dd9d44468-66k2c" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.415459 4669 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m898j\" (UniqueName: \"kubernetes.io/projected/eb2409f1-4af4-49a3-a453-29e8f447360e-kube-api-access-m898j\") pod \"openstack-operator-controller-manager-6dd9d44468-66k2c\" (UID: \"eb2409f1-4af4-49a3-a453-29e8f447360e\") " pod="openstack-operators/openstack-operator-controller-manager-6dd9d44468-66k2c" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.415480 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w95kx\" (UniqueName: \"kubernetes.io/projected/19754c8c-8fbf-4b8e-b673-462e22ec11d1-kube-api-access-w95kx\") pod \"test-operator-controller-manager-74665f6cdc-tw98s\" (UID: \"19754c8c-8fbf-4b8e-b673-462e22ec11d1\") " pod="openstack-operators/test-operator-controller-manager-74665f6cdc-tw98s" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.415512 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjg9m\" (UniqueName: \"kubernetes.io/projected/76773a34-db21-4354-a16e-e70ea0d6d63d-kube-api-access-wjg9m\") pod \"watcher-operator-controller-manager-5dd4499c96-qfbh5\" (UID: \"76773a34-db21-4354-a16e-e70ea0d6d63d\") " pod="openstack-operators/watcher-operator-controller-manager-5dd4499c96-qfbh5" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.465300 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w95kx\" (UniqueName: \"kubernetes.io/projected/19754c8c-8fbf-4b8e-b673-462e22ec11d1-kube-api-access-w95kx\") pod \"test-operator-controller-manager-74665f6cdc-tw98s\" (UID: \"19754c8c-8fbf-4b8e-b673-462e22ec11d1\") " pod="openstack-operators/test-operator-controller-manager-74665f6cdc-tw98s" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.477743 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjg9m\" (UniqueName: 
\"kubernetes.io/projected/76773a34-db21-4354-a16e-e70ea0d6d63d-kube-api-access-wjg9m\") pod \"watcher-operator-controller-manager-5dd4499c96-qfbh5\" (UID: \"76773a34-db21-4354-a16e-e70ea0d6d63d\") " pod="openstack-operators/watcher-operator-controller-manager-5dd4499c96-qfbh5" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.479336 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-64f84fcdbb-rtmfb"] Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.496189 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-c2w55" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.517128 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htsvs\" (UniqueName: \"kubernetes.io/projected/3dce5b40-fa36-4a03-bea2-a19e6267ecec-kube-api-access-htsvs\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-h7k72\" (UID: \"3dce5b40-fa36-4a03-bea2-a19e6267ecec\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-h7k72" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.517173 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eb2409f1-4af4-49a3-a453-29e8f447360e-cert\") pod \"openstack-operator-controller-manager-6dd9d44468-66k2c\" (UID: \"eb2409f1-4af4-49a3-a453-29e8f447360e\") " pod="openstack-operators/openstack-operator-controller-manager-6dd9d44468-66k2c" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.517206 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m898j\" (UniqueName: \"kubernetes.io/projected/eb2409f1-4af4-49a3-a453-29e8f447360e-kube-api-access-m898j\") pod \"openstack-operator-controller-manager-6dd9d44468-66k2c\" (UID: \"eb2409f1-4af4-49a3-a453-29e8f447360e\") " 
pod="openstack-operators/openstack-operator-controller-manager-6dd9d44468-66k2c" Oct 08 20:58:16 crc kubenswrapper[4669]: E1008 20:58:16.517371 4669 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 08 20:58:16 crc kubenswrapper[4669]: E1008 20:58:16.517442 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb2409f1-4af4-49a3-a453-29e8f447360e-cert podName:eb2409f1-4af4-49a3-a453-29e8f447360e nodeName:}" failed. No retries permitted until 2025-10-08 20:58:17.01742378 +0000 UTC m=+816.710234453 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/eb2409f1-4af4-49a3-a453-29e8f447360e-cert") pod "openstack-operator-controller-manager-6dd9d44468-66k2c" (UID: "eb2409f1-4af4-49a3-a453-29e8f447360e") : secret "webhook-server-cert" not found Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.536403 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htsvs\" (UniqueName: \"kubernetes.io/projected/3dce5b40-fa36-4a03-bea2-a19e6267ecec-kube-api-access-htsvs\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-h7k72\" (UID: \"3dce5b40-fa36-4a03-bea2-a19e6267ecec\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-h7k72" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.541326 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m898j\" (UniqueName: \"kubernetes.io/projected/eb2409f1-4af4-49a3-a453-29e8f447360e-kube-api-access-m898j\") pod \"openstack-operator-controller-manager-6dd9d44468-66k2c\" (UID: \"eb2409f1-4af4-49a3-a453-29e8f447360e\") " pod="openstack-operators/openstack-operator-controller-manager-6dd9d44468-66k2c" Oct 08 20:58:16 crc kubenswrapper[4669]: W1008 20:58:16.549843 4669 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5317b6da_a1f3_4a2c_85e4_3ef8fcb018d5.slice/crio-7bc4c0bfe8a7b3ff68aec9975b45cbcb10122dd8cf475c1937e1ef06c3797dee WatchSource:0}: Error finding container 7bc4c0bfe8a7b3ff68aec9975b45cbcb10122dd8cf475c1937e1ef06c3797dee: Status 404 returned error can't find the container with id 7bc4c0bfe8a7b3ff68aec9975b45cbcb10122dd8cf475c1937e1ef06c3797dee Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.563294 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-775776c574-h2f5k" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.582333 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-74665f6cdc-tw98s" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.614171 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5dd4499c96-qfbh5" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.711189 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-h7k72" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.724806 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d44ffb7b-c761-48b2-be7c-a5af13e2a59b-cert\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757ddwpl8\" (UID: \"d44ffb7b-c761-48b2-be7c-a5af13e2a59b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757ddwpl8" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.731686 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d44ffb7b-c761-48b2-be7c-a5af13e2a59b-cert\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757ddwpl8\" (UID: \"d44ffb7b-c761-48b2-be7c-a5af13e2a59b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757ddwpl8" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.913861 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757ddwpl8" Oct 08 20:58:16 crc kubenswrapper[4669]: I1008 20:58:16.947501 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-rtmfb" event={"ID":"5317b6da-a1f3-4a2c-85e4-3ef8fcb018d5","Type":"ContainerStarted","Data":"7bc4c0bfe8a7b3ff68aec9975b45cbcb10122dd8cf475c1937e1ef06c3797dee"} Oct 08 20:58:17 crc kubenswrapper[4669]: I1008 20:58:17.033275 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eb2409f1-4af4-49a3-a453-29e8f447360e-cert\") pod \"openstack-operator-controller-manager-6dd9d44468-66k2c\" (UID: \"eb2409f1-4af4-49a3-a453-29e8f447360e\") " pod="openstack-operators/openstack-operator-controller-manager-6dd9d44468-66k2c" Oct 08 20:58:17 crc kubenswrapper[4669]: I1008 20:58:17.037411 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eb2409f1-4af4-49a3-a453-29e8f447360e-cert\") pod \"openstack-operator-controller-manager-6dd9d44468-66k2c\" (UID: \"eb2409f1-4af4-49a3-a453-29e8f447360e\") " pod="openstack-operators/openstack-operator-controller-manager-6dd9d44468-66k2c" Oct 08 20:58:17 crc kubenswrapper[4669]: I1008 20:58:17.100873 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-74cb5cbc49-hwmh2"] Oct 08 20:58:17 crc kubenswrapper[4669]: W1008 20:58:17.103662 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf92c3530_f73f_45fe_84f5_bea451e1aaba.slice/crio-32d3ea39d57a46d0396a46e3345691fece9bd0f429daf9664cca6cadc1c92840 WatchSource:0}: Error finding container 32d3ea39d57a46d0396a46e3345691fece9bd0f429daf9664cca6cadc1c92840: Status 404 returned error can't find the container with id 
32d3ea39d57a46d0396a46e3345691fece9bd0f429daf9664cca6cadc1c92840 Oct 08 20:58:17 crc kubenswrapper[4669]: I1008 20:58:17.106817 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-687df44cdb-m9v76"] Oct 08 20:58:17 crc kubenswrapper[4669]: I1008 20:58:17.111958 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-59cdc64769-6sdlr"] Oct 08 20:58:17 crc kubenswrapper[4669]: W1008 20:58:17.112110 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f0a0ada_acd3_452b_bd50_b5d634b906c4.slice/crio-69ba16899db332a83266bba5a8e2aa6e5b0caaf28ee0f8e9f3939b28bb3f1690 WatchSource:0}: Error finding container 69ba16899db332a83266bba5a8e2aa6e5b0caaf28ee0f8e9f3939b28bb3f1690: Status 404 returned error can't find the container with id 69ba16899db332a83266bba5a8e2aa6e5b0caaf28ee0f8e9f3939b28bb3f1690 Oct 08 20:58:17 crc kubenswrapper[4669]: I1008 20:58:17.291037 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6dd9d44468-66k2c" Oct 08 20:58:17 crc kubenswrapper[4669]: I1008 20:58:17.415958 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5777b4f897-5rs85"] Oct 08 20:58:17 crc kubenswrapper[4669]: I1008 20:58:17.420187 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-57bb74c7bf-pb4g2"] Oct 08 20:58:17 crc kubenswrapper[4669]: I1008 20:58:17.427518 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7bb46cd7d-pjtc2"] Oct 08 20:58:17 crc kubenswrapper[4669]: I1008 20:58:17.434013 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d74794d9b-mpbbj"] Oct 08 20:58:17 crc kubenswrapper[4669]: W1008 20:58:17.439865 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc0d4c88_6c32_4498_8025_de3c8b59eaea.slice/crio-102dbf76241a91f1be178073fe82d4a5ad72c8061749ee8419d218300823c8f3 WatchSource:0}: Error finding container 102dbf76241a91f1be178073fe82d4a5ad72c8061749ee8419d218300823c8f3: Status 404 returned error can't find the container with id 102dbf76241a91f1be178073fe82d4a5ad72c8061749ee8419d218300823c8f3 Oct 08 20:58:17 crc kubenswrapper[4669]: I1008 20:58:17.469085 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-9vh5w"] Oct 08 20:58:17 crc kubenswrapper[4669]: I1008 20:58:17.479676 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-59578bc799-frtrx"] Oct 08 20:58:17 crc kubenswrapper[4669]: W1008 20:58:17.504970 4669 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb43bc083_637e_4c93_a024_a47cecaade29.slice/crio-276236d120e1fe1e91d786c1939bbb0760ec77a11154172d4095a256caa5903c WatchSource:0}: Error finding container 276236d120e1fe1e91d786c1939bbb0760ec77a11154172d4095a256caa5903c: Status 404 returned error can't find the container with id 276236d120e1fe1e91d786c1939bbb0760ec77a11154172d4095a256caa5903c Oct 08 20:58:17 crc kubenswrapper[4669]: I1008 20:58:17.509821 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-ddb98f99b-6mn5l"] Oct 08 20:58:17 crc kubenswrapper[4669]: I1008 20:58:17.525515 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-6d9967f8dd-w2mfj"] Oct 08 20:58:17 crc kubenswrapper[4669]: I1008 20:58:17.765547 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-m6ldz"] Oct 08 20:58:17 crc kubenswrapper[4669]: W1008 20:58:17.789904 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dce5b40_fa36_4a03_bea2_a19e6267ecec.slice/crio-34915054c27dd8935b26858defcee0c595ff340a9cd1f220f2418bb00f869b6a WatchSource:0}: Error finding container 34915054c27dd8935b26858defcee0c595ff340a9cd1f220f2418bb00f869b6a: Status 404 returned error can't find the container with id 34915054c27dd8935b26858defcee0c595ff340a9cd1f220f2418bb00f869b6a Oct 08 20:58:17 crc kubenswrapper[4669]: I1008 20:58:17.790435 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-h7k72"] Oct 08 20:58:17 crc kubenswrapper[4669]: I1008 20:58:17.795649 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5dd4499c96-qfbh5"] Oct 08 20:58:17 crc kubenswrapper[4669]: I1008 20:58:17.801487 
4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-664664cb68-bjfjf"] Oct 08 20:58:17 crc kubenswrapper[4669]: I1008 20:58:17.815738 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-6f96f8c84-gc5qn"] Oct 08 20:58:17 crc kubenswrapper[4669]: I1008 20:58:17.823578 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-775776c574-h2f5k"] Oct 08 20:58:17 crc kubenswrapper[4669]: E1008 20:58:17.825300 4669 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d33c1f507e1f5b9a4bf226ad98917e92101ac66b36e19d35cbe04ae7014f6bff,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4lxgr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-664664cb68-bjfjf_openstack-operators(d4193662-5c4e-40fa-ac9e-495509e75c4a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 08 20:58:17 crc kubenswrapper[4669]: W1008 20:58:17.833196 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76773a34_db21_4354_a16e_e70ea0d6d63d.slice/crio-29c5bf15854535c217aabdd01f29c2a648367ec613766b549234786aa607fdd7 WatchSource:0}: Error finding container 29c5bf15854535c217aabdd01f29c2a648367ec613766b549234786aa607fdd7: Status 404 returned error can't find the container with id 29c5bf15854535c217aabdd01f29c2a648367ec613766b549234786aa607fdd7 Oct 08 20:58:17 crc kubenswrapper[4669]: I1008 20:58:17.835234 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757ddwpl8"] Oct 08 20:58:17 crc kubenswrapper[4669]: W1008 20:58:17.836447 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92d7bcb6_b09e_4605_87a9_9cdaedb40c74.slice/crio-f620d1bbb4c7f47f8da975741c190c36c64fb11965f2048f9d232ec0cf641856 WatchSource:0}: Error finding container f620d1bbb4c7f47f8da975741c190c36c64fb11965f2048f9d232ec0cf641856: Status 404 returned error can't find the container with id f620d1bbb4c7f47f8da975741c190c36c64fb11965f2048f9d232ec0cf641856 Oct 08 20:58:17 crc kubenswrapper[4669]: E1008 20:58:17.838127 4669 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:e4ae07e859166fc5e2cb4f8e0e2c3358b9d2e2d6721a3864d2e0c651d36698ca,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wjg9m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-5dd4499c96-qfbh5_openstack-operators(76773a34-db21-4354-a16e-e70ea0d6d63d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 08 20:58:17 crc kubenswrapper[4669]: E1008 20:58:17.838173 4669 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:9d26476523320d70d6d457b91663e8c233ed320d77032a7c57a89ce1aedd3931,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: 
{{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p9f5m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-775776c574-h2f5k_openstack-operators(58991c0e-b29b-4851-b5b0-8327380e1320): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 08 20:58:17 crc kubenswrapper[4669]: I1008 20:58:17.844849 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6dd9d44468-66k2c"] Oct 08 
20:58:17 crc kubenswrapper[4669]: I1008 20:58:17.856158 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-585fc5b659-c2w55"] Oct 08 20:58:17 crc kubenswrapper[4669]: E1008 20:58:17.858054 4669 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:551b59e107c9812f7ad7aa06577376b0dcb58ff9498a41d5d5273e60e20ba7e4,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ctpwx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-6f96f8c84-gc5qn_openstack-operators(92d7bcb6-b09e-4605-87a9-9cdaedb40c74): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 08 20:58:17 crc kubenswrapper[4669]: E1008 20:58:17.858367 4669 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:a17fc270857869fd1efe5020b2a1cb8c2abbd838f08de88f3a6a59e8754ec351,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_D
EFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter:v0.15.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_I
MAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler:release-0.7.12,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter:v1.5.0,ValueFrom:nil,}
,EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter:v1.10.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMA
GE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podifi
ed-antelope-centos9/openstack-nova-novncproxy:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DB
CLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine:current-podified,ValueFrom:nil,},},Resour
ces:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8m6bc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-6cc7fb757ddwpl8_openstack-operators(d44ffb7b-c761-48b2-be7c-a5af13e2a59b): ErrImagePull: pull QPS exceeded" 
logger="UnhandledError" Oct 08 20:58:17 crc kubenswrapper[4669]: I1008 20:58:17.862353 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-74665f6cdc-tw98s"] Oct 08 20:58:17 crc kubenswrapper[4669]: W1008 20:58:17.864845 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19754c8c_8fbf_4b8e_b673_462e22ec11d1.slice/crio-0251488346e6d35fd2d017194bf8440699516d43eca714145c6c318f1845a4ad WatchSource:0}: Error finding container 0251488346e6d35fd2d017194bf8440699516d43eca714145c6c318f1845a4ad: Status 404 returned error can't find the container with id 0251488346e6d35fd2d017194bf8440699516d43eca714145c6c318f1845a4ad Oct 08 20:58:17 crc kubenswrapper[4669]: I1008 20:58:17.879723 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-797d478b46-vl7tc"] Oct 08 20:58:17 crc kubenswrapper[4669]: E1008 20:58:17.936889 4669 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:efa8fb78cffb573d299ffcc7bab1099affd2dbbab222152092b313074306e0a9,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-w95kx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-74665f6cdc-tw98s_openstack-operators(19754c8c-8fbf-4b8e-b673-462e22ec11d1): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 08 20:58:17 crc kubenswrapper[4669]: E1008 20:58:17.974876 4669 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:33652e75a03a058769019fe8d8c51585a6eeefef5e1ecb96f9965434117954f2,Command:[/manager],Args:[--health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mnxvj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
neutron-operator-controller-manager-797d478b46-vl7tc_openstack-operators(a2b9cab5-b86d-4d9b-aa7c-d035d04d40a4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 08 20:58:17 crc kubenswrapper[4669]: I1008 20:58:17.975647 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-59578bc799-frtrx" event={"ID":"4ee291fa-b998-4bfc-a689-fb66e345bcaa","Type":"ContainerStarted","Data":"e4ab85c4c25a0f435c4231246e0298c5b89090c263144f92c1879db65cf0aeab"} Oct 08 20:58:17 crc kubenswrapper[4669]: I1008 20:58:17.990114 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-gc5qn" event={"ID":"92d7bcb6-b09e-4605-87a9-9cdaedb40c74","Type":"ContainerStarted","Data":"f620d1bbb4c7f47f8da975741c190c36c64fb11965f2048f9d232ec0cf641856"} Oct 08 20:58:18 crc kubenswrapper[4669]: I1008 20:58:18.008240 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-hwmh2" event={"ID":"f92c3530-f73f-45fe-84f5-bea451e1aaba","Type":"ContainerStarted","Data":"32d3ea39d57a46d0396a46e3345691fece9bd0f429daf9664cca6cadc1c92840"} Oct 08 20:58:18 crc kubenswrapper[4669]: I1008 20:58:18.066150 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6dd9d44468-66k2c" event={"ID":"eb2409f1-4af4-49a3-a453-29e8f447360e","Type":"ContainerStarted","Data":"3c2f0d1a92c080c28d2b548fad88cbce00587133b636d4839e80b3db006b66a4"} Oct 08 20:58:18 crc kubenswrapper[4669]: I1008 20:58:18.114643 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-h7k72" event={"ID":"3dce5b40-fa36-4a03-bea2-a19e6267ecec","Type":"ContainerStarted","Data":"34915054c27dd8935b26858defcee0c595ff340a9cd1f220f2418bb00f869b6a"} Oct 08 20:58:18 crc kubenswrapper[4669]: I1008 20:58:18.117120 4669 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-w2mfj" event={"ID":"7185d03c-648b-488d-b1d0-842f8b72e0ff","Type":"ContainerStarted","Data":"cf2c68336f90f8aad90b9d61a86da822606b9a45e2a0bcc5becffbae6ce5ce8d"} Oct 08 20:58:18 crc kubenswrapper[4669]: I1008 20:58:18.122409 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-6sdlr" event={"ID":"b7c50f2e-fa5d-4b03-be3a-cfd5ecc63b45","Type":"ContainerStarted","Data":"8827e8812bec949db4718274c3a6eba7f43ca3f0c2019ccd8ddece7c0e86e417"} Oct 08 20:58:18 crc kubenswrapper[4669]: I1008 20:58:18.123975 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-pjtc2" event={"ID":"dc0d4c88-6c32-4498-8025-de3c8b59eaea","Type":"ContainerStarted","Data":"102dbf76241a91f1be178073fe82d4a5ad72c8061749ee8419d218300823c8f3"} Oct 08 20:58:18 crc kubenswrapper[4669]: I1008 20:58:18.124972 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-5rs85" event={"ID":"1af3f5fc-da55-4c9e-ac87-30363f6cf741","Type":"ContainerStarted","Data":"a22f3c042ed1429fe65f882c55e674f145aa017b28bf598266d954b4c58c9d97"} Oct 08 20:58:18 crc kubenswrapper[4669]: I1008 20:58:18.126517 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-m9v76" event={"ID":"5f0a0ada-acd3-452b-bd50-b5d634b906c4","Type":"ContainerStarted","Data":"69ba16899db332a83266bba5a8e2aa6e5b0caaf28ee0f8e9f3939b28bb3f1690"} Oct 08 20:58:18 crc kubenswrapper[4669]: I1008 20:58:18.131023 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-pb4g2" 
event={"ID":"5fceefe8-bf0f-4f2d-9e14-2208f38b73d7","Type":"ContainerStarted","Data":"a324c0e5a1d806826354dfcac36e479ffbaa3cd888464de18bb189a9d6325812"} Oct 08 20:58:18 crc kubenswrapper[4669]: I1008 20:58:18.132370 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-6mn5l" event={"ID":"b43bc083-637e-4c93-a024-a47cecaade29","Type":"ContainerStarted","Data":"276236d120e1fe1e91d786c1939bbb0760ec77a11154172d4095a256caa5903c"} Oct 08 20:58:18 crc kubenswrapper[4669]: I1008 20:58:18.133464 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-c2w55" event={"ID":"01d6ae8d-9f65-4a30-80fb-135e4eba5a10","Type":"ContainerStarted","Data":"26f33d057b186f81715590eff928415c928c227e383cdb4df3d85f4822b8dbcf"} Oct 08 20:58:18 crc kubenswrapper[4669]: I1008 20:58:18.136647 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-m6ldz" event={"ID":"46a0519d-251f-48e0-9d65-4ca08c627195","Type":"ContainerStarted","Data":"88035af137e07c910ef67a5260274b6f3378c811ed7bff573b7fb1552fcd0888"} Oct 08 20:58:18 crc kubenswrapper[4669]: E1008 20:58:18.139335 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-664664cb68-bjfjf" podUID="d4193662-5c4e-40fa-ac9e-495509e75c4a" Oct 08 20:58:18 crc kubenswrapper[4669]: I1008 20:58:18.141000 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-775776c574-h2f5k" event={"ID":"58991c0e-b29b-4851-b5b0-8327380e1320","Type":"ContainerStarted","Data":"eff2e12b308e70e71012ac208af4c75ef2ffe01179520ec23726435a7adc6cda"} Oct 08 20:58:18 crc kubenswrapper[4669]: I1008 20:58:18.142406 4669 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-vl7tc" event={"ID":"a2b9cab5-b86d-4d9b-aa7c-d035d04d40a4","Type":"ContainerStarted","Data":"0bed4cc92c4ab8b0a55978fe28c37e4f431d3829e5dc7891543b87aa5bb884e2"} Oct 08 20:58:18 crc kubenswrapper[4669]: I1008 20:58:18.146851 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757ddwpl8" event={"ID":"d44ffb7b-c761-48b2-be7c-a5af13e2a59b","Type":"ContainerStarted","Data":"aafebcb17f8db1d6fdd4ffa92f198a0e36ef1d7f0e0eeb689942c7698002ec7f"} Oct 08 20:58:18 crc kubenswrapper[4669]: I1008 20:58:18.148299 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-mpbbj" event={"ID":"0a7f72b2-7ef9-492e-ada0-a7e64d8fcb7a","Type":"ContainerStarted","Data":"9cae9752e4d02c8f7dac7b46011f059705f2a7406b5275f3acb3c0945c99ed86"} Oct 08 20:58:18 crc kubenswrapper[4669]: I1008 20:58:18.159939 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-74665f6cdc-tw98s" event={"ID":"19754c8c-8fbf-4b8e-b673-462e22ec11d1","Type":"ContainerStarted","Data":"0251488346e6d35fd2d017194bf8440699516d43eca714145c6c318f1845a4ad"} Oct 08 20:58:18 crc kubenswrapper[4669]: I1008 20:58:18.162185 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5dd4499c96-qfbh5" event={"ID":"76773a34-db21-4354-a16e-e70ea0d6d63d","Type":"ContainerStarted","Data":"29c5bf15854535c217aabdd01f29c2a648367ec613766b549234786aa607fdd7"} Oct 08 20:58:18 crc kubenswrapper[4669]: I1008 20:58:18.164881 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-664664cb68-bjfjf" event={"ID":"d4193662-5c4e-40fa-ac9e-495509e75c4a","Type":"ContainerStarted","Data":"d7a6be4ada5223191f92054b78f0bff8d9b117bcd0173402ac0874bd4689e483"} Oct 08 
20:58:18 crc kubenswrapper[4669]: I1008 20:58:18.176393 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-9vh5w" event={"ID":"d4ee300d-b78b-4052-83a9-4ab8ca569886","Type":"ContainerStarted","Data":"a30bed20b351cd44da43fc7cee3dfac1c572ab8fb10dc54ffedda30e33df1008"} Oct 08 20:58:18 crc kubenswrapper[4669]: E1008 20:58:18.194712 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d33c1f507e1f5b9a4bf226ad98917e92101ac66b36e19d35cbe04ae7014f6bff\\\"\"" pod="openstack-operators/placement-operator-controller-manager-664664cb68-bjfjf" podUID="d4193662-5c4e-40fa-ac9e-495509e75c4a" Oct 08 20:58:18 crc kubenswrapper[4669]: E1008 20:58:18.248984 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-775776c574-h2f5k" podUID="58991c0e-b29b-4851-b5b0-8327380e1320" Oct 08 20:58:18 crc kubenswrapper[4669]: E1008 20:58:18.322414 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-5dd4499c96-qfbh5" podUID="76773a34-db21-4354-a16e-e70ea0d6d63d" Oct 08 20:58:18 crc kubenswrapper[4669]: E1008 20:58:18.403144 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-gc5qn" podUID="92d7bcb6-b09e-4605-87a9-9cdaedb40c74" Oct 08 20:58:18 crc kubenswrapper[4669]: E1008 20:58:18.448421 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with 
ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-vl7tc" podUID="a2b9cab5-b86d-4d9b-aa7c-d035d04d40a4" Oct 08 20:58:18 crc kubenswrapper[4669]: E1008 20:58:18.451976 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757ddwpl8" podUID="d44ffb7b-c761-48b2-be7c-a5af13e2a59b" Oct 08 20:58:18 crc kubenswrapper[4669]: E1008 20:58:18.469221 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-74665f6cdc-tw98s" podUID="19754c8c-8fbf-4b8e-b673-462e22ec11d1" Oct 08 20:58:19 crc kubenswrapper[4669]: I1008 20:58:19.209804 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-gc5qn" event={"ID":"92d7bcb6-b09e-4605-87a9-9cdaedb40c74","Type":"ContainerStarted","Data":"afd22a6b276871bdc673fec9c489dea693dd97c7ea5f9774394bc0cdfc73c479"} Oct 08 20:58:19 crc kubenswrapper[4669]: E1008 20:58:19.213325 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:551b59e107c9812f7ad7aa06577376b0dcb58ff9498a41d5d5273e60e20ba7e4\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-gc5qn" podUID="92d7bcb6-b09e-4605-87a9-9cdaedb40c74" Oct 08 20:58:19 crc kubenswrapper[4669]: I1008 20:58:19.220739 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-74665f6cdc-tw98s" event={"ID":"19754c8c-8fbf-4b8e-b673-462e22ec11d1","Type":"ContainerStarted","Data":"359976dbc7fb0f220b35582f7a88bfc893f04b5bd594231a12d4765a7820ff40"} 
Oct 08 20:58:19 crc kubenswrapper[4669]: E1008 20:58:19.223017 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:efa8fb78cffb573d299ffcc7bab1099affd2dbbab222152092b313074306e0a9\\\"\"" pod="openstack-operators/test-operator-controller-manager-74665f6cdc-tw98s" podUID="19754c8c-8fbf-4b8e-b673-462e22ec11d1" Oct 08 20:58:19 crc kubenswrapper[4669]: I1008 20:58:19.224602 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5dd4499c96-qfbh5" event={"ID":"76773a34-db21-4354-a16e-e70ea0d6d63d","Type":"ContainerStarted","Data":"f5eac574768aa0974106241b60b877b3e7b31f08dd7d96142a362f927fcd0bf6"} Oct 08 20:58:19 crc kubenswrapper[4669]: E1008 20:58:19.237050 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:e4ae07e859166fc5e2cb4f8e0e2c3358b9d2e2d6721a3864d2e0c651d36698ca\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5dd4499c96-qfbh5" podUID="76773a34-db21-4354-a16e-e70ea0d6d63d" Oct 08 20:58:19 crc kubenswrapper[4669]: I1008 20:58:19.239135 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-775776c574-h2f5k" event={"ID":"58991c0e-b29b-4851-b5b0-8327380e1320","Type":"ContainerStarted","Data":"3f12fc12d616dd2e08127f2f7adcade252113d45d326ecd20c46b4ccb029625e"} Oct 08 20:58:19 crc kubenswrapper[4669]: E1008 20:58:19.240738 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:9d26476523320d70d6d457b91663e8c233ed320d77032a7c57a89ce1aedd3931\\\"\"" 
pod="openstack-operators/telemetry-operator-controller-manager-775776c574-h2f5k" podUID="58991c0e-b29b-4851-b5b0-8327380e1320" Oct 08 20:58:19 crc kubenswrapper[4669]: I1008 20:58:19.260244 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-664664cb68-bjfjf" event={"ID":"d4193662-5c4e-40fa-ac9e-495509e75c4a","Type":"ContainerStarted","Data":"469ae312ae350ac31b86d354aede09b020e35fe131b1f829ded6e318c97b5b12"} Oct 08 20:58:19 crc kubenswrapper[4669]: E1008 20:58:19.280836 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d33c1f507e1f5b9a4bf226ad98917e92101ac66b36e19d35cbe04ae7014f6bff\\\"\"" pod="openstack-operators/placement-operator-controller-manager-664664cb68-bjfjf" podUID="d4193662-5c4e-40fa-ac9e-495509e75c4a" Oct 08 20:58:19 crc kubenswrapper[4669]: I1008 20:58:19.295734 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6dd9d44468-66k2c" event={"ID":"eb2409f1-4af4-49a3-a453-29e8f447360e","Type":"ContainerStarted","Data":"53e4dc741e1e4ac329cea10178fb910df396aed95d1f29d54c90460ecd3450f9"} Oct 08 20:58:19 crc kubenswrapper[4669]: I1008 20:58:19.295783 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6dd9d44468-66k2c" event={"ID":"eb2409f1-4af4-49a3-a453-29e8f447360e","Type":"ContainerStarted","Data":"9329d98256c5fc66f949cf0ca2e3641ae90a9fc0ffc6602a2b9a85114dcf65cc"} Oct 08 20:58:19 crc kubenswrapper[4669]: I1008 20:58:19.296333 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6dd9d44468-66k2c" Oct 08 20:58:19 crc kubenswrapper[4669]: I1008 20:58:19.301688 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/neutron-operator-controller-manager-797d478b46-vl7tc" event={"ID":"a2b9cab5-b86d-4d9b-aa7c-d035d04d40a4","Type":"ContainerStarted","Data":"1e6735aab5319f97e3220d9400dee0dea5da9b9b9c80de583f7ab785a5aea88f"} Oct 08 20:58:19 crc kubenswrapper[4669]: E1008 20:58:19.309261 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:33652e75a03a058769019fe8d8c51585a6eeefef5e1ecb96f9965434117954f2\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-vl7tc" podUID="a2b9cab5-b86d-4d9b-aa7c-d035d04d40a4" Oct 08 20:58:19 crc kubenswrapper[4669]: I1008 20:58:19.325574 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757ddwpl8" event={"ID":"d44ffb7b-c761-48b2-be7c-a5af13e2a59b","Type":"ContainerStarted","Data":"f542866d5b6f94ad10a38ddeb8b46d2ddd5284530677f6db5147f6abac552f28"} Oct 08 20:58:19 crc kubenswrapper[4669]: E1008 20:58:19.328104 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:a17fc270857869fd1efe5020b2a1cb8c2abbd838f08de88f3a6a59e8754ec351\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757ddwpl8" podUID="d44ffb7b-c761-48b2-be7c-a5af13e2a59b" Oct 08 20:58:19 crc kubenswrapper[4669]: I1008 20:58:19.436138 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6dd9d44468-66k2c" podStartSLOduration=3.436118029 podStartE2EDuration="3.436118029s" podCreationTimestamp="2025-10-08 20:58:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2025-10-08 20:58:19.42995356 +0000 UTC m=+819.122764233" watchObservedRunningTime="2025-10-08 20:58:19.436118029 +0000 UTC m=+819.128928712" Oct 08 20:58:20 crc kubenswrapper[4669]: E1008 20:58:20.336116 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:33652e75a03a058769019fe8d8c51585a6eeefef5e1ecb96f9965434117954f2\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-vl7tc" podUID="a2b9cab5-b86d-4d9b-aa7c-d035d04d40a4" Oct 08 20:58:20 crc kubenswrapper[4669]: E1008 20:58:20.337388 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:551b59e107c9812f7ad7aa06577376b0dcb58ff9498a41d5d5273e60e20ba7e4\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-gc5qn" podUID="92d7bcb6-b09e-4605-87a9-9cdaedb40c74" Oct 08 20:58:20 crc kubenswrapper[4669]: E1008 20:58:20.337404 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d33c1f507e1f5b9a4bf226ad98917e92101ac66b36e19d35cbe04ae7014f6bff\\\"\"" pod="openstack-operators/placement-operator-controller-manager-664664cb68-bjfjf" podUID="d4193662-5c4e-40fa-ac9e-495509e75c4a" Oct 08 20:58:20 crc kubenswrapper[4669]: E1008 20:58:20.337392 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:a17fc270857869fd1efe5020b2a1cb8c2abbd838f08de88f3a6a59e8754ec351\\\"\"" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757ddwpl8" podUID="d44ffb7b-c761-48b2-be7c-a5af13e2a59b" Oct 08 20:58:20 crc kubenswrapper[4669]: E1008 20:58:20.337597 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:e4ae07e859166fc5e2cb4f8e0e2c3358b9d2e2d6721a3864d2e0c651d36698ca\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5dd4499c96-qfbh5" podUID="76773a34-db21-4354-a16e-e70ea0d6d63d" Oct 08 20:58:20 crc kubenswrapper[4669]: E1008 20:58:20.337641 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:efa8fb78cffb573d299ffcc7bab1099affd2dbbab222152092b313074306e0a9\\\"\"" pod="openstack-operators/test-operator-controller-manager-74665f6cdc-tw98s" podUID="19754c8c-8fbf-4b8e-b673-462e22ec11d1" Oct 08 20:58:20 crc kubenswrapper[4669]: E1008 20:58:20.339926 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:9d26476523320d70d6d457b91663e8c233ed320d77032a7c57a89ce1aedd3931\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-775776c574-h2f5k" podUID="58991c0e-b29b-4851-b5b0-8327380e1320" Oct 08 20:58:27 crc kubenswrapper[4669]: I1008 20:58:27.297625 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6dd9d44468-66k2c" Oct 08 20:58:29 crc kubenswrapper[4669]: I1008 20:58:29.432992 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-5rs85" 
event={"ID":"1af3f5fc-da55-4c9e-ac87-30363f6cf741","Type":"ContainerStarted","Data":"2b8433d3418d40de3c52fb9863a5a7bb939cbdc05047fc66873d32d333f0b49d"} Oct 08 20:58:29 crc kubenswrapper[4669]: I1008 20:58:29.439549 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-9vh5w" event={"ID":"d4ee300d-b78b-4052-83a9-4ab8ca569886","Type":"ContainerStarted","Data":"883d0cf671ae0af932d928bf0a87fa5c4bf4c6cb1df720a944f2b006c42c6985"} Oct 08 20:58:29 crc kubenswrapper[4669]: I1008 20:58:29.440937 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-59578bc799-frtrx" event={"ID":"4ee291fa-b998-4bfc-a689-fb66e345bcaa","Type":"ContainerStarted","Data":"e4b8ea0037a595bc2027ccef701d48969e9d099efb4c604bb0dc8ec52ba77dd5"} Oct 08 20:58:29 crc kubenswrapper[4669]: I1008 20:58:29.446476 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-c2w55" event={"ID":"01d6ae8d-9f65-4a30-80fb-135e4eba5a10","Type":"ContainerStarted","Data":"1c57783714e829a828b662df17f6e5891681a03864f5ce58b14007564d250b3e"} Oct 08 20:58:29 crc kubenswrapper[4669]: I1008 20:58:29.449012 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-m6ldz" event={"ID":"46a0519d-251f-48e0-9d65-4ca08c627195","Type":"ContainerStarted","Data":"1938a3a0384e08dbe5060335339326c2f548fd7d7e363f7db0f1b0d81cefad11"} Oct 08 20:58:29 crc kubenswrapper[4669]: I1008 20:58:29.456127 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-rtmfb" event={"ID":"5317b6da-a1f3-4a2c-85e4-3ef8fcb018d5","Type":"ContainerStarted","Data":"f32f0eba5bc7a771b6a5a9c36a4e5fd0eb20aab2af8f01adad4f088bc3c5a9f9"} Oct 08 20:58:30 crc kubenswrapper[4669]: I1008 20:58:30.462777 4669 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-w2mfj" event={"ID":"7185d03c-648b-488d-b1d0-842f8b72e0ff","Type":"ContainerStarted","Data":"6731c91bf8c63e8dcaff7b5b028d0f381d010467879c574a9b575b3b449a94bd"} Oct 08 20:58:30 crc kubenswrapper[4669]: I1008 20:58:30.464242 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-59578bc799-frtrx" event={"ID":"4ee291fa-b998-4bfc-a689-fb66e345bcaa","Type":"ContainerStarted","Data":"2c4b33b8a284ee98e7f20a699bcd3f08a47fe84b22f6d0cca9b62a9df4bf6f64"} Oct 08 20:58:30 crc kubenswrapper[4669]: I1008 20:58:30.465157 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-59578bc799-frtrx" Oct 08 20:58:30 crc kubenswrapper[4669]: I1008 20:58:30.467401 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-9vh5w" event={"ID":"d4ee300d-b78b-4052-83a9-4ab8ca569886","Type":"ContainerStarted","Data":"066521ad9f757e801d05cb5c55649ea0278d7f3c1f76b51c8d18e86ff7cc25a5"} Oct 08 20:58:30 crc kubenswrapper[4669]: I1008 20:58:30.467590 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-9vh5w" Oct 08 20:58:30 crc kubenswrapper[4669]: I1008 20:58:30.468717 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-6mn5l" event={"ID":"b43bc083-637e-4c93-a024-a47cecaade29","Type":"ContainerStarted","Data":"fe2ee023275b982342080a758a1141eafd82f67458ee29844768bd30694af1e6"} Oct 08 20:58:30 crc kubenswrapper[4669]: I1008 20:58:30.470167 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-c2w55" 
event={"ID":"01d6ae8d-9f65-4a30-80fb-135e4eba5a10","Type":"ContainerStarted","Data":"e4dd8bf18400472192aa901c6266199cc1e3476e7b30dfc51b26ce10c38ca1f9"} Oct 08 20:58:30 crc kubenswrapper[4669]: I1008 20:58:30.470671 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-c2w55" Oct 08 20:58:30 crc kubenswrapper[4669]: I1008 20:58:30.473610 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-6sdlr" event={"ID":"b7c50f2e-fa5d-4b03-be3a-cfd5ecc63b45","Type":"ContainerStarted","Data":"1f6c3743404632b1fb99960d1c8804bf2b758ccea105c443bce7eb78e8d2ee7a"} Oct 08 20:58:30 crc kubenswrapper[4669]: I1008 20:58:30.476438 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-mpbbj" event={"ID":"0a7f72b2-7ef9-492e-ada0-a7e64d8fcb7a","Type":"ContainerStarted","Data":"edd8751fd1de0141bb0c3d69ce74fcdec00c9dbc05620213d296bfbe589d166f"} Oct 08 20:58:30 crc kubenswrapper[4669]: I1008 20:58:30.478281 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-pb4g2" event={"ID":"5fceefe8-bf0f-4f2d-9e14-2208f38b73d7","Type":"ContainerStarted","Data":"b124a19b607e7da0a7468049df61d9dbd8a7420c594c4d2c8b14033e4a3da365"} Oct 08 20:58:30 crc kubenswrapper[4669]: I1008 20:58:30.479793 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-h7k72" event={"ID":"3dce5b40-fa36-4a03-bea2-a19e6267ecec","Type":"ContainerStarted","Data":"e0cba9a0442fce76ae22921946687a0a6bd473ac7d67cc392a513b01b83cbcf4"} Oct 08 20:58:30 crc kubenswrapper[4669]: I1008 20:58:30.483251 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-m9v76" 
event={"ID":"5f0a0ada-acd3-452b-bd50-b5d634b906c4","Type":"ContainerStarted","Data":"2a7d7f6f5c1b7bf410147aa7cb526b63f917876832ce5f2343cdadcadd1ab616"} Oct 08 20:58:30 crc kubenswrapper[4669]: I1008 20:58:30.483339 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-m9v76" event={"ID":"5f0a0ada-acd3-452b-bd50-b5d634b906c4","Type":"ContainerStarted","Data":"85779cf4655c4995869225c1b225fb5d807619556ce175a886490d6decc1df8e"} Oct 08 20:58:30 crc kubenswrapper[4669]: I1008 20:58:30.483769 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-m9v76" Oct 08 20:58:30 crc kubenswrapper[4669]: I1008 20:58:30.488295 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-59578bc799-frtrx" podStartSLOduration=4.098133382 podStartE2EDuration="15.488284907s" podCreationTimestamp="2025-10-08 20:58:15 +0000 UTC" firstStartedPulling="2025-10-08 20:58:17.498966404 +0000 UTC m=+817.191777077" lastFinishedPulling="2025-10-08 20:58:28.889117929 +0000 UTC m=+828.581928602" observedRunningTime="2025-10-08 20:58:30.483043902 +0000 UTC m=+830.175854575" watchObservedRunningTime="2025-10-08 20:58:30.488284907 +0000 UTC m=+830.181095580" Oct 08 20:58:30 crc kubenswrapper[4669]: I1008 20:58:30.490795 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-m6ldz" event={"ID":"46a0519d-251f-48e0-9d65-4ca08c627195","Type":"ContainerStarted","Data":"c37791a02d17989c27c99e4f51936a631ba34b71585e32de3cc782f0a867b803"} Oct 08 20:58:30 crc kubenswrapper[4669]: I1008 20:58:30.490866 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-m6ldz" Oct 08 20:58:30 crc kubenswrapper[4669]: I1008 20:58:30.493543 4669 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-pjtc2" event={"ID":"dc0d4c88-6c32-4498-8025-de3c8b59eaea","Type":"ContainerStarted","Data":"2dd47af814e005a0bfacbde0f842e2b04567b9bc3fd23c2b9f4b0523e3bafe1d"} Oct 08 20:58:30 crc kubenswrapper[4669]: I1008 20:58:30.502595 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-rtmfb" event={"ID":"5317b6da-a1f3-4a2c-85e4-3ef8fcb018d5","Type":"ContainerStarted","Data":"51c3bbfa910fd6120bd24663c5076bbd43eccc5b8b437f0080fa4da51714c762"} Oct 08 20:58:30 crc kubenswrapper[4669]: I1008 20:58:30.503227 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-rtmfb" Oct 08 20:58:30 crc kubenswrapper[4669]: I1008 20:58:30.512383 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-hwmh2" event={"ID":"f92c3530-f73f-45fe-84f5-bea451e1aaba","Type":"ContainerStarted","Data":"7ce57a32c0cd7f3f849e381d1275a0dc7869f941803e0c82af631422e85e68f3"} Oct 08 20:58:30 crc kubenswrapper[4669]: I1008 20:58:30.512880 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-c2w55" podStartSLOduration=4.578366688 podStartE2EDuration="15.512869371s" podCreationTimestamp="2025-10-08 20:58:15 +0000 UTC" firstStartedPulling="2025-10-08 20:58:17.93584677 +0000 UTC m=+817.628657453" lastFinishedPulling="2025-10-08 20:58:28.870349463 +0000 UTC m=+828.563160136" observedRunningTime="2025-10-08 20:58:30.509446717 +0000 UTC m=+830.202257390" watchObservedRunningTime="2025-10-08 20:58:30.512869371 +0000 UTC m=+830.205680044" Oct 08 20:58:30 crc kubenswrapper[4669]: I1008 20:58:30.513051 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-hwmh2" Oct 08 20:58:30 crc kubenswrapper[4669]: I1008 20:58:30.523588 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-5rs85" event={"ID":"1af3f5fc-da55-4c9e-ac87-30363f6cf741","Type":"ContainerStarted","Data":"b392fa410c7ec073147f06da7a1415a4f7415065a56d503ee1b21b1b58004e37"} Oct 08 20:58:30 crc kubenswrapper[4669]: I1008 20:58:30.524195 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-5rs85" Oct 08 20:58:30 crc kubenswrapper[4669]: I1008 20:58:30.569196 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-h7k72" podStartSLOduration=3.469811945 podStartE2EDuration="14.569182719s" podCreationTimestamp="2025-10-08 20:58:16 +0000 UTC" firstStartedPulling="2025-10-08 20:58:17.809407876 +0000 UTC m=+817.502218549" lastFinishedPulling="2025-10-08 20:58:28.90877863 +0000 UTC m=+828.601589323" observedRunningTime="2025-10-08 20:58:30.565073746 +0000 UTC m=+830.257884419" watchObservedRunningTime="2025-10-08 20:58:30.569182719 +0000 UTC m=+830.261993392" Oct 08 20:58:30 crc kubenswrapper[4669]: I1008 20:58:30.571422 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-9vh5w" podStartSLOduration=4.180228496 podStartE2EDuration="15.57141721s" podCreationTimestamp="2025-10-08 20:58:15 +0000 UTC" firstStartedPulling="2025-10-08 20:58:17.494959514 +0000 UTC m=+817.187770187" lastFinishedPulling="2025-10-08 20:58:28.886148208 +0000 UTC m=+828.578958901" observedRunningTime="2025-10-08 20:58:30.542080355 +0000 UTC m=+830.234891028" watchObservedRunningTime="2025-10-08 20:58:30.57141721 +0000 UTC m=+830.264227873" Oct 08 20:58:30 crc kubenswrapper[4669]: I1008 
20:58:30.593725 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-rtmfb" podStartSLOduration=3.303573146 podStartE2EDuration="15.593709893s" podCreationTimestamp="2025-10-08 20:58:15 +0000 UTC" firstStartedPulling="2025-10-08 20:58:16.557096381 +0000 UTC m=+816.249907054" lastFinishedPulling="2025-10-08 20:58:28.847233128 +0000 UTC m=+828.540043801" observedRunningTime="2025-10-08 20:58:30.588590242 +0000 UTC m=+830.281400915" watchObservedRunningTime="2025-10-08 20:58:30.593709893 +0000 UTC m=+830.286520566" Oct 08 20:58:30 crc kubenswrapper[4669]: I1008 20:58:30.615827 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-m6ldz" podStartSLOduration=4.531360007 podStartE2EDuration="15.615808371s" podCreationTimestamp="2025-10-08 20:58:15 +0000 UTC" firstStartedPulling="2025-10-08 20:58:17.780627504 +0000 UTC m=+817.473438177" lastFinishedPulling="2025-10-08 20:58:28.865075868 +0000 UTC m=+828.557886541" observedRunningTime="2025-10-08 20:58:30.610903645 +0000 UTC m=+830.303714318" watchObservedRunningTime="2025-10-08 20:58:30.615808371 +0000 UTC m=+830.308619044" Oct 08 20:58:30 crc kubenswrapper[4669]: I1008 20:58:30.639890 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-m9v76" podStartSLOduration=3.8694970079999997 podStartE2EDuration="15.639875812s" podCreationTimestamp="2025-10-08 20:58:15 +0000 UTC" firstStartedPulling="2025-10-08 20:58:17.117005097 +0000 UTC m=+816.809815770" lastFinishedPulling="2025-10-08 20:58:28.887383901 +0000 UTC m=+828.580194574" observedRunningTime="2025-10-08 20:58:30.636119378 +0000 UTC m=+830.328930051" watchObservedRunningTime="2025-10-08 20:58:30.639875812 +0000 UTC m=+830.332686485" Oct 08 20:58:30 crc kubenswrapper[4669]: I1008 20:58:30.686063 4669 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-5rs85" podStartSLOduration=4.290311582 podStartE2EDuration="15.686045311s" podCreationTimestamp="2025-10-08 20:58:15 +0000 UTC" firstStartedPulling="2025-10-08 20:58:17.486548302 +0000 UTC m=+817.179358975" lastFinishedPulling="2025-10-08 20:58:28.882282031 +0000 UTC m=+828.575092704" observedRunningTime="2025-10-08 20:58:30.685057554 +0000 UTC m=+830.377868227" watchObservedRunningTime="2025-10-08 20:58:30.686045311 +0000 UTC m=+830.378855984" Oct 08 20:58:30 crc kubenswrapper[4669]: I1008 20:58:30.689372 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-hwmh2" podStartSLOduration=3.913107816 podStartE2EDuration="15.689358772s" podCreationTimestamp="2025-10-08 20:58:15 +0000 UTC" firstStartedPulling="2025-10-08 20:58:17.110902359 +0000 UTC m=+816.803713032" lastFinishedPulling="2025-10-08 20:58:28.887153305 +0000 UTC m=+828.579963988" observedRunningTime="2025-10-08 20:58:30.663700487 +0000 UTC m=+830.356511160" watchObservedRunningTime="2025-10-08 20:58:30.689358772 +0000 UTC m=+830.382169445" Oct 08 20:58:31 crc kubenswrapper[4669]: I1008 20:58:31.532542 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-pjtc2" event={"ID":"dc0d4c88-6c32-4498-8025-de3c8b59eaea","Type":"ContainerStarted","Data":"f000e9c27851ae56036aa7912bae092e5a7a2032c7ba789cffeafe9f39c3a437"} Oct 08 20:58:31 crc kubenswrapper[4669]: I1008 20:58:31.533013 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-pjtc2" Oct 08 20:58:31 crc kubenswrapper[4669]: I1008 20:58:31.534303 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-hwmh2" 
event={"ID":"f92c3530-f73f-45fe-84f5-bea451e1aaba","Type":"ContainerStarted","Data":"6fb5bb41e0b15116112f8c28ef1655f2d77cbbc8a8c7066748dcbd2312d5ef00"} Oct 08 20:58:31 crc kubenswrapper[4669]: I1008 20:58:31.536609 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-mpbbj" event={"ID":"0a7f72b2-7ef9-492e-ada0-a7e64d8fcb7a","Type":"ContainerStarted","Data":"b68ac462ac5882a125ca24e62042ce3ea8b460446439bff8daf5b37e3e01c71b"} Oct 08 20:58:31 crc kubenswrapper[4669]: I1008 20:58:31.536772 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-mpbbj" Oct 08 20:58:31 crc kubenswrapper[4669]: I1008 20:58:31.539223 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-pb4g2" event={"ID":"5fceefe8-bf0f-4f2d-9e14-2208f38b73d7","Type":"ContainerStarted","Data":"7ebd8dfddf4b2e088d0b7c49c2d8523d75fbae45bf27b4bc3fa790ce7619baca"} Oct 08 20:58:31 crc kubenswrapper[4669]: I1008 20:58:31.539380 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-pb4g2" Oct 08 20:58:31 crc kubenswrapper[4669]: I1008 20:58:31.554886 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-6mn5l" event={"ID":"b43bc083-637e-4c93-a024-a47cecaade29","Type":"ContainerStarted","Data":"0f0736578b32bfd5276d6a20726d7f7b18e9f7ec313fcc3a79a06c1fa6c3c721"} Oct 08 20:58:31 crc kubenswrapper[4669]: I1008 20:58:31.555867 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-6mn5l" Oct 08 20:58:31 crc kubenswrapper[4669]: I1008 20:58:31.559982 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-w2mfj" event={"ID":"7185d03c-648b-488d-b1d0-842f8b72e0ff","Type":"ContainerStarted","Data":"a753400e4f8d28a6d60b2ae9689569d39fdd007147467808b82578d725996add"} Oct 08 20:58:31 crc kubenswrapper[4669]: I1008 20:58:31.560247 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-w2mfj" Oct 08 20:58:31 crc kubenswrapper[4669]: I1008 20:58:31.562157 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-pjtc2" podStartSLOduration=5.138709918 podStartE2EDuration="16.562130817s" podCreationTimestamp="2025-10-08 20:58:15 +0000 UTC" firstStartedPulling="2025-10-08 20:58:17.486488581 +0000 UTC m=+817.179299254" lastFinishedPulling="2025-10-08 20:58:28.90990948 +0000 UTC m=+828.602720153" observedRunningTime="2025-10-08 20:58:31.555989148 +0000 UTC m=+831.248799821" watchObservedRunningTime="2025-10-08 20:58:31.562130817 +0000 UTC m=+831.254941490" Oct 08 20:58:31 crc kubenswrapper[4669]: I1008 20:58:31.563182 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-6sdlr" event={"ID":"b7c50f2e-fa5d-4b03-be3a-cfd5ecc63b45","Type":"ContainerStarted","Data":"1a25c0b4a8fa12387d70d894c1ee1e08f0878901f93f0ab9e6b7716403d0503d"} Oct 08 20:58:31 crc kubenswrapper[4669]: I1008 20:58:31.567707 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-6sdlr" Oct 08 20:58:31 crc kubenswrapper[4669]: I1008 20:58:31.599437 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-mpbbj" podStartSLOduration=5.18249541 podStartE2EDuration="16.5993875s" podCreationTimestamp="2025-10-08 20:58:15 +0000 UTC" 
firstStartedPulling="2025-10-08 20:58:17.49519016 +0000 UTC m=+817.188000833" lastFinishedPulling="2025-10-08 20:58:28.91208223 +0000 UTC m=+828.604892923" observedRunningTime="2025-10-08 20:58:31.574671591 +0000 UTC m=+831.267482284" watchObservedRunningTime="2025-10-08 20:58:31.5993875 +0000 UTC m=+831.292198173" Oct 08 20:58:31 crc kubenswrapper[4669]: I1008 20:58:31.607944 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-6mn5l" podStartSLOduration=5.235016144 podStartE2EDuration="16.607925805s" podCreationTimestamp="2025-10-08 20:58:15 +0000 UTC" firstStartedPulling="2025-10-08 20:58:17.514770008 +0000 UTC m=+817.207580681" lastFinishedPulling="2025-10-08 20:58:28.887679649 +0000 UTC m=+828.580490342" observedRunningTime="2025-10-08 20:58:31.588355087 +0000 UTC m=+831.281165790" watchObservedRunningTime="2025-10-08 20:58:31.607925805 +0000 UTC m=+831.300736478" Oct 08 20:58:31 crc kubenswrapper[4669]: I1008 20:58:31.613618 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-pb4g2" podStartSLOduration=5.221349168 podStartE2EDuration="16.613606741s" podCreationTimestamp="2025-10-08 20:58:15 +0000 UTC" firstStartedPulling="2025-10-08 20:58:17.494165962 +0000 UTC m=+817.186976635" lastFinishedPulling="2025-10-08 20:58:28.886423535 +0000 UTC m=+828.579234208" observedRunningTime="2025-10-08 20:58:31.601391966 +0000 UTC m=+831.294202649" watchObservedRunningTime="2025-10-08 20:58:31.613606741 +0000 UTC m=+831.306417414" Oct 08 20:58:31 crc kubenswrapper[4669]: I1008 20:58:31.622096 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-w2mfj" podStartSLOduration=5.20939836 podStartE2EDuration="16.622050043s" podCreationTimestamp="2025-10-08 20:58:15 +0000 UTC" firstStartedPulling="2025-10-08 
20:58:17.49737264 +0000 UTC m=+817.190183313" lastFinishedPulling="2025-10-08 20:58:28.910024323 +0000 UTC m=+828.602834996" observedRunningTime="2025-10-08 20:58:31.613275492 +0000 UTC m=+831.306086175" watchObservedRunningTime="2025-10-08 20:58:31.622050043 +0000 UTC m=+831.314860726" Oct 08 20:58:31 crc kubenswrapper[4669]: I1008 20:58:31.636228 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-6sdlr" podStartSLOduration=4.86550581 podStartE2EDuration="16.636205622s" podCreationTimestamp="2025-10-08 20:58:15 +0000 UTC" firstStartedPulling="2025-10-08 20:58:17.111476216 +0000 UTC m=+816.804286889" lastFinishedPulling="2025-10-08 20:58:28.882176028 +0000 UTC m=+828.574986701" observedRunningTime="2025-10-08 20:58:31.630616178 +0000 UTC m=+831.323426861" watchObservedRunningTime="2025-10-08 20:58:31.636205622 +0000 UTC m=+831.329016295" Oct 08 20:58:33 crc kubenswrapper[4669]: I1008 20:58:33.581202 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-74665f6cdc-tw98s" event={"ID":"19754c8c-8fbf-4b8e-b673-462e22ec11d1","Type":"ContainerStarted","Data":"a3f1b88f608951312f66b9923982855d8256f096ec548a3eda5272e1a426e0f6"} Oct 08 20:58:33 crc kubenswrapper[4669]: I1008 20:58:33.582069 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-74665f6cdc-tw98s" Oct 08 20:58:33 crc kubenswrapper[4669]: I1008 20:58:33.608981 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-74665f6cdc-tw98s" podStartSLOduration=3.423696027 podStartE2EDuration="18.608959046s" podCreationTimestamp="2025-10-08 20:58:15 +0000 UTC" firstStartedPulling="2025-10-08 20:58:17.936783236 +0000 UTC m=+817.629593909" lastFinishedPulling="2025-10-08 20:58:33.122046255 +0000 UTC m=+832.814856928" 
observedRunningTime="2025-10-08 20:58:33.596814732 +0000 UTC m=+833.289625445" watchObservedRunningTime="2025-10-08 20:58:33.608959046 +0000 UTC m=+833.301769719" Oct 08 20:58:35 crc kubenswrapper[4669]: I1008 20:58:35.690185 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-rtmfb" Oct 08 20:58:35 crc kubenswrapper[4669]: I1008 20:58:35.710016 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-6sdlr" Oct 08 20:58:35 crc kubenswrapper[4669]: I1008 20:58:35.732554 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-m9v76" Oct 08 20:58:35 crc kubenswrapper[4669]: I1008 20:58:35.758273 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-pjtc2" Oct 08 20:58:35 crc kubenswrapper[4669]: I1008 20:58:35.813961 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-w2mfj" Oct 08 20:58:35 crc kubenswrapper[4669]: I1008 20:58:35.849816 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-mpbbj" Oct 08 20:58:35 crc kubenswrapper[4669]: I1008 20:58:35.994684 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-hwmh2" Oct 08 20:58:36 crc kubenswrapper[4669]: I1008 20:58:36.066981 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-6mn5l" Oct 08 20:58:36 crc kubenswrapper[4669]: I1008 20:58:36.117853 4669 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-59578bc799-frtrx" Oct 08 20:58:36 crc kubenswrapper[4669]: I1008 20:58:36.137636 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-5rs85" Oct 08 20:58:36 crc kubenswrapper[4669]: I1008 20:58:36.205272 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-pb4g2" Oct 08 20:58:36 crc kubenswrapper[4669]: I1008 20:58:36.259730 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-9vh5w" Oct 08 20:58:36 crc kubenswrapper[4669]: I1008 20:58:36.390126 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-m6ldz" Oct 08 20:58:36 crc kubenswrapper[4669]: I1008 20:58:36.501059 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-c2w55" Oct 08 20:58:36 crc kubenswrapper[4669]: I1008 20:58:36.606843 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-775776c574-h2f5k" event={"ID":"58991c0e-b29b-4851-b5b0-8327380e1320","Type":"ContainerStarted","Data":"9a2bfdfc6b7011bdb5e4f6f3c527b3b3ef6deb4756f515429b97d87694dc5af5"} Oct 08 20:58:36 crc kubenswrapper[4669]: I1008 20:58:36.607181 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-775776c574-h2f5k" Oct 08 20:58:36 crc kubenswrapper[4669]: I1008 20:58:36.609106 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-664664cb68-bjfjf" 
event={"ID":"d4193662-5c4e-40fa-ac9e-495509e75c4a","Type":"ContainerStarted","Data":"7b49f1744347e1d25589375a955603b921e608f9ee16fbdb02236f4825174688"} Oct 08 20:58:36 crc kubenswrapper[4669]: I1008 20:58:36.609341 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-664664cb68-bjfjf" Oct 08 20:58:36 crc kubenswrapper[4669]: I1008 20:58:36.610954 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-vl7tc" event={"ID":"a2b9cab5-b86d-4d9b-aa7c-d035d04d40a4","Type":"ContainerStarted","Data":"21df07d7009145eda004d4e9eea864c204800c6e664dbe29751538f817885cec"} Oct 08 20:58:36 crc kubenswrapper[4669]: I1008 20:58:36.611083 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-vl7tc" Oct 08 20:58:36 crc kubenswrapper[4669]: I1008 20:58:36.612784 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757ddwpl8" event={"ID":"d44ffb7b-c761-48b2-be7c-a5af13e2a59b","Type":"ContainerStarted","Data":"87a7d132b6f71e8a6b099195153d7a4c720cc346c930c1073fe57ae4b74a92fa"} Oct 08 20:58:36 crc kubenswrapper[4669]: I1008 20:58:36.612963 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757ddwpl8" Oct 08 20:58:36 crc kubenswrapper[4669]: I1008 20:58:36.624163 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-775776c574-h2f5k" podStartSLOduration=3.846441514 podStartE2EDuration="21.624147347s" podCreationTimestamp="2025-10-08 20:58:15 +0000 UTC" firstStartedPulling="2025-10-08 20:58:17.838040882 +0000 UTC m=+817.530851555" lastFinishedPulling="2025-10-08 20:58:35.615746715 +0000 UTC 
m=+835.308557388" observedRunningTime="2025-10-08 20:58:36.621159635 +0000 UTC m=+836.313970308" watchObservedRunningTime="2025-10-08 20:58:36.624147347 +0000 UTC m=+836.316958020" Oct 08 20:58:36 crc kubenswrapper[4669]: I1008 20:58:36.655258 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757ddwpl8" podStartSLOduration=3.820568963 podStartE2EDuration="21.655241831s" podCreationTimestamp="2025-10-08 20:58:15 +0000 UTC" firstStartedPulling="2025-10-08 20:58:17.858100363 +0000 UTC m=+817.550911036" lastFinishedPulling="2025-10-08 20:58:35.692773221 +0000 UTC m=+835.385583904" observedRunningTime="2025-10-08 20:58:36.646661756 +0000 UTC m=+836.339472439" watchObservedRunningTime="2025-10-08 20:58:36.655241831 +0000 UTC m=+836.348052504" Oct 08 20:58:36 crc kubenswrapper[4669]: I1008 20:58:36.691490 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-664664cb68-bjfjf" podStartSLOduration=3.8993590989999998 podStartE2EDuration="21.691471327s" podCreationTimestamp="2025-10-08 20:58:15 +0000 UTC" firstStartedPulling="2025-10-08 20:58:17.825185729 +0000 UTC m=+817.517996392" lastFinishedPulling="2025-10-08 20:58:35.617297947 +0000 UTC m=+835.310108620" observedRunningTime="2025-10-08 20:58:36.688506336 +0000 UTC m=+836.381317019" watchObservedRunningTime="2025-10-08 20:58:36.691471327 +0000 UTC m=+836.384282010" Oct 08 20:58:36 crc kubenswrapper[4669]: I1008 20:58:36.693837 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-vl7tc" podStartSLOduration=4.053252788 podStartE2EDuration="21.693828112s" podCreationTimestamp="2025-10-08 20:58:15 +0000 UTC" firstStartedPulling="2025-10-08 20:58:17.974741009 +0000 UTC m=+817.667551682" lastFinishedPulling="2025-10-08 20:58:35.615316323 +0000 UTC 
m=+835.308127006" observedRunningTime="2025-10-08 20:58:36.667856208 +0000 UTC m=+836.360666891" watchObservedRunningTime="2025-10-08 20:58:36.693828112 +0000 UTC m=+836.386638795" Oct 08 20:58:37 crc kubenswrapper[4669]: I1008 20:58:37.626484 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-gc5qn" event={"ID":"92d7bcb6-b09e-4605-87a9-9cdaedb40c74","Type":"ContainerStarted","Data":"7bbc2b2d1cb9c4e9f035f2689a2b10ae579f24e07fb05d0b48f75cde36cae917"} Oct 08 20:58:37 crc kubenswrapper[4669]: I1008 20:58:37.627109 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-gc5qn" Oct 08 20:58:37 crc kubenswrapper[4669]: I1008 20:58:37.629889 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5dd4499c96-qfbh5" event={"ID":"76773a34-db21-4354-a16e-e70ea0d6d63d","Type":"ContainerStarted","Data":"0630220e1e432ffc6e3eddd8f284b76a2873e4372e3feb1b6f34f83eb61b3f89"} Oct 08 20:58:37 crc kubenswrapper[4669]: I1008 20:58:37.660523 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-gc5qn" podStartSLOduration=3.252182713 podStartE2EDuration="22.660493527s" podCreationTimestamp="2025-10-08 20:58:15 +0000 UTC" firstStartedPulling="2025-10-08 20:58:17.857944889 +0000 UTC m=+817.550755562" lastFinishedPulling="2025-10-08 20:58:37.266255693 +0000 UTC m=+836.959066376" observedRunningTime="2025-10-08 20:58:37.653370611 +0000 UTC m=+837.346181334" watchObservedRunningTime="2025-10-08 20:58:37.660493527 +0000 UTC m=+837.353304250" Oct 08 20:58:37 crc kubenswrapper[4669]: I1008 20:58:37.679915 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-5dd4499c96-qfbh5" podStartSLOduration=3.232080651 
podStartE2EDuration="22.67988873s" podCreationTimestamp="2025-10-08 20:58:15 +0000 UTC" firstStartedPulling="2025-10-08 20:58:17.838021522 +0000 UTC m=+817.530832195" lastFinishedPulling="2025-10-08 20:58:37.285829581 +0000 UTC m=+836.978640274" observedRunningTime="2025-10-08 20:58:37.678911053 +0000 UTC m=+837.371721766" watchObservedRunningTime="2025-10-08 20:58:37.67988873 +0000 UTC m=+837.372699443" Oct 08 20:58:46 crc kubenswrapper[4669]: I1008 20:58:46.202799 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-vl7tc" Oct 08 20:58:46 crc kubenswrapper[4669]: I1008 20:58:46.289697 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-gc5qn" Oct 08 20:58:46 crc kubenswrapper[4669]: I1008 20:58:46.405603 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-664664cb68-bjfjf" Oct 08 20:58:46 crc kubenswrapper[4669]: I1008 20:58:46.569869 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-775776c574-h2f5k" Oct 08 20:58:46 crc kubenswrapper[4669]: I1008 20:58:46.586599 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-74665f6cdc-tw98s" Oct 08 20:58:46 crc kubenswrapper[4669]: I1008 20:58:46.628470 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-5dd4499c96-qfbh5" Oct 08 20:58:46 crc kubenswrapper[4669]: I1008 20:58:46.634330 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-5dd4499c96-qfbh5" Oct 08 20:58:46 crc kubenswrapper[4669]: I1008 20:58:46.923606 4669 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757ddwpl8" Oct 08 20:59:03 crc kubenswrapper[4669]: I1008 20:59:03.754056 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-wt5ln"] Oct 08 20:59:03 crc kubenswrapper[4669]: I1008 20:59:03.757078 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-wt5ln" Oct 08 20:59:03 crc kubenswrapper[4669]: I1008 20:59:03.765216 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 08 20:59:03 crc kubenswrapper[4669]: I1008 20:59:03.767765 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 08 20:59:03 crc kubenswrapper[4669]: I1008 20:59:03.767797 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-wj5xh" Oct 08 20:59:03 crc kubenswrapper[4669]: I1008 20:59:03.767832 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 08 20:59:03 crc kubenswrapper[4669]: I1008 20:59:03.781958 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-wt5ln"] Oct 08 20:59:03 crc kubenswrapper[4669]: I1008 20:59:03.819275 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-jksdh"] Oct 08 20:59:03 crc kubenswrapper[4669]: I1008 20:59:03.821947 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-jksdh" Oct 08 20:59:03 crc kubenswrapper[4669]: I1008 20:59:03.825111 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 08 20:59:03 crc kubenswrapper[4669]: I1008 20:59:03.832934 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-jksdh"] Oct 08 20:59:03 crc kubenswrapper[4669]: I1008 20:59:03.880251 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9229843b-95c5-4135-8383-7617689bd4ad-config\") pod \"dnsmasq-dns-78dd6ddcc-jksdh\" (UID: \"9229843b-95c5-4135-8383-7617689bd4ad\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jksdh" Oct 08 20:59:03 crc kubenswrapper[4669]: I1008 20:59:03.880314 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbzn7\" (UniqueName: \"kubernetes.io/projected/0ab22cf7-c83f-4a32-abc1-b53715898e21-kube-api-access-pbzn7\") pod \"dnsmasq-dns-675f4bcbfc-wt5ln\" (UID: \"0ab22cf7-c83f-4a32-abc1-b53715898e21\") " pod="openstack/dnsmasq-dns-675f4bcbfc-wt5ln" Oct 08 20:59:03 crc kubenswrapper[4669]: I1008 20:59:03.880350 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ab22cf7-c83f-4a32-abc1-b53715898e21-config\") pod \"dnsmasq-dns-675f4bcbfc-wt5ln\" (UID: \"0ab22cf7-c83f-4a32-abc1-b53715898e21\") " pod="openstack/dnsmasq-dns-675f4bcbfc-wt5ln" Oct 08 20:59:03 crc kubenswrapper[4669]: I1008 20:59:03.880413 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9229843b-95c5-4135-8383-7617689bd4ad-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-jksdh\" (UID: \"9229843b-95c5-4135-8383-7617689bd4ad\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jksdh" Oct 08 20:59:03 
crc kubenswrapper[4669]: I1008 20:59:03.880440 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fwmc\" (UniqueName: \"kubernetes.io/projected/9229843b-95c5-4135-8383-7617689bd4ad-kube-api-access-9fwmc\") pod \"dnsmasq-dns-78dd6ddcc-jksdh\" (UID: \"9229843b-95c5-4135-8383-7617689bd4ad\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jksdh" Oct 08 20:59:03 crc kubenswrapper[4669]: I1008 20:59:03.981140 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9229843b-95c5-4135-8383-7617689bd4ad-config\") pod \"dnsmasq-dns-78dd6ddcc-jksdh\" (UID: \"9229843b-95c5-4135-8383-7617689bd4ad\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jksdh" Oct 08 20:59:03 crc kubenswrapper[4669]: I1008 20:59:03.981198 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbzn7\" (UniqueName: \"kubernetes.io/projected/0ab22cf7-c83f-4a32-abc1-b53715898e21-kube-api-access-pbzn7\") pod \"dnsmasq-dns-675f4bcbfc-wt5ln\" (UID: \"0ab22cf7-c83f-4a32-abc1-b53715898e21\") " pod="openstack/dnsmasq-dns-675f4bcbfc-wt5ln" Oct 08 20:59:03 crc kubenswrapper[4669]: I1008 20:59:03.981220 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ab22cf7-c83f-4a32-abc1-b53715898e21-config\") pod \"dnsmasq-dns-675f4bcbfc-wt5ln\" (UID: \"0ab22cf7-c83f-4a32-abc1-b53715898e21\") " pod="openstack/dnsmasq-dns-675f4bcbfc-wt5ln" Oct 08 20:59:03 crc kubenswrapper[4669]: I1008 20:59:03.981265 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9229843b-95c5-4135-8383-7617689bd4ad-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-jksdh\" (UID: \"9229843b-95c5-4135-8383-7617689bd4ad\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jksdh" Oct 08 20:59:03 crc kubenswrapper[4669]: I1008 20:59:03.981287 4669 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fwmc\" (UniqueName: \"kubernetes.io/projected/9229843b-95c5-4135-8383-7617689bd4ad-kube-api-access-9fwmc\") pod \"dnsmasq-dns-78dd6ddcc-jksdh\" (UID: \"9229843b-95c5-4135-8383-7617689bd4ad\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jksdh" Oct 08 20:59:03 crc kubenswrapper[4669]: I1008 20:59:03.982249 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9229843b-95c5-4135-8383-7617689bd4ad-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-jksdh\" (UID: \"9229843b-95c5-4135-8383-7617689bd4ad\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jksdh" Oct 08 20:59:03 crc kubenswrapper[4669]: I1008 20:59:03.982483 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9229843b-95c5-4135-8383-7617689bd4ad-config\") pod \"dnsmasq-dns-78dd6ddcc-jksdh\" (UID: \"9229843b-95c5-4135-8383-7617689bd4ad\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jksdh" Oct 08 20:59:03 crc kubenswrapper[4669]: I1008 20:59:03.982512 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ab22cf7-c83f-4a32-abc1-b53715898e21-config\") pod \"dnsmasq-dns-675f4bcbfc-wt5ln\" (UID: \"0ab22cf7-c83f-4a32-abc1-b53715898e21\") " pod="openstack/dnsmasq-dns-675f4bcbfc-wt5ln" Oct 08 20:59:04 crc kubenswrapper[4669]: I1008 20:59:04.001082 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fwmc\" (UniqueName: \"kubernetes.io/projected/9229843b-95c5-4135-8383-7617689bd4ad-kube-api-access-9fwmc\") pod \"dnsmasq-dns-78dd6ddcc-jksdh\" (UID: \"9229843b-95c5-4135-8383-7617689bd4ad\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jksdh" Oct 08 20:59:04 crc kubenswrapper[4669]: I1008 20:59:04.002484 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbzn7\" (UniqueName: 
\"kubernetes.io/projected/0ab22cf7-c83f-4a32-abc1-b53715898e21-kube-api-access-pbzn7\") pod \"dnsmasq-dns-675f4bcbfc-wt5ln\" (UID: \"0ab22cf7-c83f-4a32-abc1-b53715898e21\") " pod="openstack/dnsmasq-dns-675f4bcbfc-wt5ln" Oct 08 20:59:04 crc kubenswrapper[4669]: I1008 20:59:04.082223 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-wt5ln" Oct 08 20:59:04 crc kubenswrapper[4669]: I1008 20:59:04.136778 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-jksdh" Oct 08 20:59:04 crc kubenswrapper[4669]: I1008 20:59:04.525888 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-wt5ln"] Oct 08 20:59:04 crc kubenswrapper[4669]: I1008 20:59:04.605347 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-jksdh"] Oct 08 20:59:04 crc kubenswrapper[4669]: W1008 20:59:04.611449 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9229843b_95c5_4135_8383_7617689bd4ad.slice/crio-89a081775d908810d9b37a922af65962e1f558e4b2ccf7e5ecc43e762906878a WatchSource:0}: Error finding container 89a081775d908810d9b37a922af65962e1f558e4b2ccf7e5ecc43e762906878a: Status 404 returned error can't find the container with id 89a081775d908810d9b37a922af65962e1f558e4b2ccf7e5ecc43e762906878a Oct 08 20:59:04 crc kubenswrapper[4669]: I1008 20:59:04.865090 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-jksdh" event={"ID":"9229843b-95c5-4135-8383-7617689bd4ad","Type":"ContainerStarted","Data":"89a081775d908810d9b37a922af65962e1f558e4b2ccf7e5ecc43e762906878a"} Oct 08 20:59:04 crc kubenswrapper[4669]: I1008 20:59:04.866826 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-wt5ln" 
event={"ID":"0ab22cf7-c83f-4a32-abc1-b53715898e21","Type":"ContainerStarted","Data":"855de2f75c497b09624143ad062c87428033a0cf8db944564aa250e19969ba31"} Oct 08 20:59:06 crc kubenswrapper[4669]: I1008 20:59:06.633280 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-wt5ln"] Oct 08 20:59:06 crc kubenswrapper[4669]: I1008 20:59:06.650982 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-nfnp7"] Oct 08 20:59:06 crc kubenswrapper[4669]: I1008 20:59:06.652127 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-nfnp7" Oct 08 20:59:06 crc kubenswrapper[4669]: I1008 20:59:06.673171 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-nfnp7"] Oct 08 20:59:06 crc kubenswrapper[4669]: I1008 20:59:06.722433 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f476231d-6e83-47b6-9fa4-4714af712b0c-dns-svc\") pod \"dnsmasq-dns-666b6646f7-nfnp7\" (UID: \"f476231d-6e83-47b6-9fa4-4714af712b0c\") " pod="openstack/dnsmasq-dns-666b6646f7-nfnp7" Oct 08 20:59:06 crc kubenswrapper[4669]: I1008 20:59:06.722517 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crrgz\" (UniqueName: \"kubernetes.io/projected/f476231d-6e83-47b6-9fa4-4714af712b0c-kube-api-access-crrgz\") pod \"dnsmasq-dns-666b6646f7-nfnp7\" (UID: \"f476231d-6e83-47b6-9fa4-4714af712b0c\") " pod="openstack/dnsmasq-dns-666b6646f7-nfnp7" Oct 08 20:59:06 crc kubenswrapper[4669]: I1008 20:59:06.722600 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f476231d-6e83-47b6-9fa4-4714af712b0c-config\") pod \"dnsmasq-dns-666b6646f7-nfnp7\" (UID: \"f476231d-6e83-47b6-9fa4-4714af712b0c\") " 
pod="openstack/dnsmasq-dns-666b6646f7-nfnp7" Oct 08 20:59:06 crc kubenswrapper[4669]: I1008 20:59:06.824246 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f476231d-6e83-47b6-9fa4-4714af712b0c-dns-svc\") pod \"dnsmasq-dns-666b6646f7-nfnp7\" (UID: \"f476231d-6e83-47b6-9fa4-4714af712b0c\") " pod="openstack/dnsmasq-dns-666b6646f7-nfnp7" Oct 08 20:59:06 crc kubenswrapper[4669]: I1008 20:59:06.824326 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crrgz\" (UniqueName: \"kubernetes.io/projected/f476231d-6e83-47b6-9fa4-4714af712b0c-kube-api-access-crrgz\") pod \"dnsmasq-dns-666b6646f7-nfnp7\" (UID: \"f476231d-6e83-47b6-9fa4-4714af712b0c\") " pod="openstack/dnsmasq-dns-666b6646f7-nfnp7" Oct 08 20:59:06 crc kubenswrapper[4669]: I1008 20:59:06.824383 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f476231d-6e83-47b6-9fa4-4714af712b0c-config\") pod \"dnsmasq-dns-666b6646f7-nfnp7\" (UID: \"f476231d-6e83-47b6-9fa4-4714af712b0c\") " pod="openstack/dnsmasq-dns-666b6646f7-nfnp7" Oct 08 20:59:06 crc kubenswrapper[4669]: I1008 20:59:06.825310 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f476231d-6e83-47b6-9fa4-4714af712b0c-config\") pod \"dnsmasq-dns-666b6646f7-nfnp7\" (UID: \"f476231d-6e83-47b6-9fa4-4714af712b0c\") " pod="openstack/dnsmasq-dns-666b6646f7-nfnp7" Oct 08 20:59:06 crc kubenswrapper[4669]: I1008 20:59:06.825424 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f476231d-6e83-47b6-9fa4-4714af712b0c-dns-svc\") pod \"dnsmasq-dns-666b6646f7-nfnp7\" (UID: \"f476231d-6e83-47b6-9fa4-4714af712b0c\") " pod="openstack/dnsmasq-dns-666b6646f7-nfnp7" Oct 08 20:59:06 crc kubenswrapper[4669]: I1008 20:59:06.843709 4669 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crrgz\" (UniqueName: \"kubernetes.io/projected/f476231d-6e83-47b6-9fa4-4714af712b0c-kube-api-access-crrgz\") pod \"dnsmasq-dns-666b6646f7-nfnp7\" (UID: \"f476231d-6e83-47b6-9fa4-4714af712b0c\") " pod="openstack/dnsmasq-dns-666b6646f7-nfnp7" Oct 08 20:59:06 crc kubenswrapper[4669]: I1008 20:59:06.970863 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-jksdh"] Oct 08 20:59:06 crc kubenswrapper[4669]: I1008 20:59:06.984075 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-nfnp7" Oct 08 20:59:07 crc kubenswrapper[4669]: I1008 20:59:07.019776 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-9lfvd"] Oct 08 20:59:07 crc kubenswrapper[4669]: I1008 20:59:07.023219 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-9lfvd" Oct 08 20:59:07 crc kubenswrapper[4669]: I1008 20:59:07.032496 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-9lfvd"] Oct 08 20:59:07 crc kubenswrapper[4669]: I1008 20:59:07.130064 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b60a2dd2-7028-42a7-8ef9-38099b6f8817-config\") pod \"dnsmasq-dns-57d769cc4f-9lfvd\" (UID: \"b60a2dd2-7028-42a7-8ef9-38099b6f8817\") " pod="openstack/dnsmasq-dns-57d769cc4f-9lfvd" Oct 08 20:59:07 crc kubenswrapper[4669]: I1008 20:59:07.130137 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b60a2dd2-7028-42a7-8ef9-38099b6f8817-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-9lfvd\" (UID: \"b60a2dd2-7028-42a7-8ef9-38099b6f8817\") " pod="openstack/dnsmasq-dns-57d769cc4f-9lfvd" Oct 08 20:59:07 crc 
kubenswrapper[4669]: I1008 20:59:07.130169 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpnth\" (UniqueName: \"kubernetes.io/projected/b60a2dd2-7028-42a7-8ef9-38099b6f8817-kube-api-access-qpnth\") pod \"dnsmasq-dns-57d769cc4f-9lfvd\" (UID: \"b60a2dd2-7028-42a7-8ef9-38099b6f8817\") " pod="openstack/dnsmasq-dns-57d769cc4f-9lfvd" Oct 08 20:59:07 crc kubenswrapper[4669]: I1008 20:59:07.231589 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b60a2dd2-7028-42a7-8ef9-38099b6f8817-config\") pod \"dnsmasq-dns-57d769cc4f-9lfvd\" (UID: \"b60a2dd2-7028-42a7-8ef9-38099b6f8817\") " pod="openstack/dnsmasq-dns-57d769cc4f-9lfvd" Oct 08 20:59:07 crc kubenswrapper[4669]: I1008 20:59:07.231657 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b60a2dd2-7028-42a7-8ef9-38099b6f8817-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-9lfvd\" (UID: \"b60a2dd2-7028-42a7-8ef9-38099b6f8817\") " pod="openstack/dnsmasq-dns-57d769cc4f-9lfvd" Oct 08 20:59:07 crc kubenswrapper[4669]: I1008 20:59:07.231689 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpnth\" (UniqueName: \"kubernetes.io/projected/b60a2dd2-7028-42a7-8ef9-38099b6f8817-kube-api-access-qpnth\") pod \"dnsmasq-dns-57d769cc4f-9lfvd\" (UID: \"b60a2dd2-7028-42a7-8ef9-38099b6f8817\") " pod="openstack/dnsmasq-dns-57d769cc4f-9lfvd" Oct 08 20:59:07 crc kubenswrapper[4669]: I1008 20:59:07.232792 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b60a2dd2-7028-42a7-8ef9-38099b6f8817-config\") pod \"dnsmasq-dns-57d769cc4f-9lfvd\" (UID: \"b60a2dd2-7028-42a7-8ef9-38099b6f8817\") " pod="openstack/dnsmasq-dns-57d769cc4f-9lfvd" Oct 08 20:59:07 crc kubenswrapper[4669]: I1008 20:59:07.233259 4669 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b60a2dd2-7028-42a7-8ef9-38099b6f8817-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-9lfvd\" (UID: \"b60a2dd2-7028-42a7-8ef9-38099b6f8817\") " pod="openstack/dnsmasq-dns-57d769cc4f-9lfvd" Oct 08 20:59:07 crc kubenswrapper[4669]: I1008 20:59:07.264208 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpnth\" (UniqueName: \"kubernetes.io/projected/b60a2dd2-7028-42a7-8ef9-38099b6f8817-kube-api-access-qpnth\") pod \"dnsmasq-dns-57d769cc4f-9lfvd\" (UID: \"b60a2dd2-7028-42a7-8ef9-38099b6f8817\") " pod="openstack/dnsmasq-dns-57d769cc4f-9lfvd" Oct 08 20:59:07 crc kubenswrapper[4669]: I1008 20:59:07.349625 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-9lfvd" Oct 08 20:59:07 crc kubenswrapper[4669]: I1008 20:59:07.806589 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 20:59:07 crc kubenswrapper[4669]: I1008 20:59:07.814741 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 08 20:59:07 crc kubenswrapper[4669]: I1008 20:59:07.819902 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 08 20:59:07 crc kubenswrapper[4669]: I1008 20:59:07.819999 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 08 20:59:07 crc kubenswrapper[4669]: I1008 20:59:07.820034 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 08 20:59:07 crc kubenswrapper[4669]: I1008 20:59:07.823624 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 08 20:59:07 crc kubenswrapper[4669]: I1008 20:59:07.823698 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-zwm8p" Oct 08 20:59:07 crc kubenswrapper[4669]: I1008 20:59:07.823703 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 08 20:59:07 crc kubenswrapper[4669]: I1008 20:59:07.823745 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 08 20:59:07 crc kubenswrapper[4669]: I1008 20:59:07.839177 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 20:59:07 crc kubenswrapper[4669]: I1008 20:59:07.943267 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a63d3545-a64d-4c9a-9198-bf11fc782cc6-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"a63d3545-a64d-4c9a-9198-bf11fc782cc6\") " pod="openstack/rabbitmq-server-0" Oct 08 20:59:07 crc kubenswrapper[4669]: I1008 20:59:07.943335 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/a63d3545-a64d-4c9a-9198-bf11fc782cc6-server-conf\") pod \"rabbitmq-server-0\" (UID: \"a63d3545-a64d-4c9a-9198-bf11fc782cc6\") " pod="openstack/rabbitmq-server-0" Oct 08 20:59:07 crc kubenswrapper[4669]: I1008 20:59:07.943361 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a63d3545-a64d-4c9a-9198-bf11fc782cc6-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"a63d3545-a64d-4c9a-9198-bf11fc782cc6\") " pod="openstack/rabbitmq-server-0" Oct 08 20:59:07 crc kubenswrapper[4669]: I1008 20:59:07.943387 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"a63d3545-a64d-4c9a-9198-bf11fc782cc6\") " pod="openstack/rabbitmq-server-0" Oct 08 20:59:07 crc kubenswrapper[4669]: I1008 20:59:07.943413 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a63d3545-a64d-4c9a-9198-bf11fc782cc6-config-data\") pod \"rabbitmq-server-0\" (UID: \"a63d3545-a64d-4c9a-9198-bf11fc782cc6\") " pod="openstack/rabbitmq-server-0" Oct 08 20:59:07 crc kubenswrapper[4669]: I1008 20:59:07.943436 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a63d3545-a64d-4c9a-9198-bf11fc782cc6-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"a63d3545-a64d-4c9a-9198-bf11fc782cc6\") " pod="openstack/rabbitmq-server-0" Oct 08 20:59:07 crc kubenswrapper[4669]: I1008 20:59:07.943471 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a63d3545-a64d-4c9a-9198-bf11fc782cc6-plugins-conf\") pod 
\"rabbitmq-server-0\" (UID: \"a63d3545-a64d-4c9a-9198-bf11fc782cc6\") " pod="openstack/rabbitmq-server-0" Oct 08 20:59:07 crc kubenswrapper[4669]: I1008 20:59:07.943492 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a63d3545-a64d-4c9a-9198-bf11fc782cc6-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"a63d3545-a64d-4c9a-9198-bf11fc782cc6\") " pod="openstack/rabbitmq-server-0" Oct 08 20:59:07 crc kubenswrapper[4669]: I1008 20:59:07.943513 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a63d3545-a64d-4c9a-9198-bf11fc782cc6-pod-info\") pod \"rabbitmq-server-0\" (UID: \"a63d3545-a64d-4c9a-9198-bf11fc782cc6\") " pod="openstack/rabbitmq-server-0" Oct 08 20:59:07 crc kubenswrapper[4669]: I1008 20:59:07.943591 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a63d3545-a64d-4c9a-9198-bf11fc782cc6-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"a63d3545-a64d-4c9a-9198-bf11fc782cc6\") " pod="openstack/rabbitmq-server-0" Oct 08 20:59:07 crc kubenswrapper[4669]: I1008 20:59:07.943619 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l5w7\" (UniqueName: \"kubernetes.io/projected/a63d3545-a64d-4c9a-9198-bf11fc782cc6-kube-api-access-5l5w7\") pod \"rabbitmq-server-0\" (UID: \"a63d3545-a64d-4c9a-9198-bf11fc782cc6\") " pod="openstack/rabbitmq-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.045310 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a63d3545-a64d-4c9a-9198-bf11fc782cc6-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"a63d3545-a64d-4c9a-9198-bf11fc782cc6\") " 
pod="openstack/rabbitmq-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.045363 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5l5w7\" (UniqueName: \"kubernetes.io/projected/a63d3545-a64d-4c9a-9198-bf11fc782cc6-kube-api-access-5l5w7\") pod \"rabbitmq-server-0\" (UID: \"a63d3545-a64d-4c9a-9198-bf11fc782cc6\") " pod="openstack/rabbitmq-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.045411 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a63d3545-a64d-4c9a-9198-bf11fc782cc6-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"a63d3545-a64d-4c9a-9198-bf11fc782cc6\") " pod="openstack/rabbitmq-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.045436 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a63d3545-a64d-4c9a-9198-bf11fc782cc6-server-conf\") pod \"rabbitmq-server-0\" (UID: \"a63d3545-a64d-4c9a-9198-bf11fc782cc6\") " pod="openstack/rabbitmq-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.045457 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a63d3545-a64d-4c9a-9198-bf11fc782cc6-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"a63d3545-a64d-4c9a-9198-bf11fc782cc6\") " pod="openstack/rabbitmq-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.045477 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"a63d3545-a64d-4c9a-9198-bf11fc782cc6\") " pod="openstack/rabbitmq-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.045495 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a63d3545-a64d-4c9a-9198-bf11fc782cc6-config-data\") pod \"rabbitmq-server-0\" (UID: \"a63d3545-a64d-4c9a-9198-bf11fc782cc6\") " pod="openstack/rabbitmq-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.045514 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a63d3545-a64d-4c9a-9198-bf11fc782cc6-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"a63d3545-a64d-4c9a-9198-bf11fc782cc6\") " pod="openstack/rabbitmq-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.045555 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a63d3545-a64d-4c9a-9198-bf11fc782cc6-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"a63d3545-a64d-4c9a-9198-bf11fc782cc6\") " pod="openstack/rabbitmq-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.045569 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a63d3545-a64d-4c9a-9198-bf11fc782cc6-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"a63d3545-a64d-4c9a-9198-bf11fc782cc6\") " pod="openstack/rabbitmq-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.045582 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a63d3545-a64d-4c9a-9198-bf11fc782cc6-pod-info\") pod \"rabbitmq-server-0\" (UID: \"a63d3545-a64d-4c9a-9198-bf11fc782cc6\") " pod="openstack/rabbitmq-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.045991 4669 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: 
\"a63d3545-a64d-4c9a-9198-bf11fc782cc6\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/rabbitmq-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.046472 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a63d3545-a64d-4c9a-9198-bf11fc782cc6-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"a63d3545-a64d-4c9a-9198-bf11fc782cc6\") " pod="openstack/rabbitmq-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.046597 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a63d3545-a64d-4c9a-9198-bf11fc782cc6-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"a63d3545-a64d-4c9a-9198-bf11fc782cc6\") " pod="openstack/rabbitmq-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.046972 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a63d3545-a64d-4c9a-9198-bf11fc782cc6-config-data\") pod \"rabbitmq-server-0\" (UID: \"a63d3545-a64d-4c9a-9198-bf11fc782cc6\") " pod="openstack/rabbitmq-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.047056 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a63d3545-a64d-4c9a-9198-bf11fc782cc6-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"a63d3545-a64d-4c9a-9198-bf11fc782cc6\") " pod="openstack/rabbitmq-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.047615 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a63d3545-a64d-4c9a-9198-bf11fc782cc6-server-conf\") pod \"rabbitmq-server-0\" (UID: \"a63d3545-a64d-4c9a-9198-bf11fc782cc6\") " pod="openstack/rabbitmq-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.049788 4669 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a63d3545-a64d-4c9a-9198-bf11fc782cc6-pod-info\") pod \"rabbitmq-server-0\" (UID: \"a63d3545-a64d-4c9a-9198-bf11fc782cc6\") " pod="openstack/rabbitmq-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.050311 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a63d3545-a64d-4c9a-9198-bf11fc782cc6-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"a63d3545-a64d-4c9a-9198-bf11fc782cc6\") " pod="openstack/rabbitmq-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.058074 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a63d3545-a64d-4c9a-9198-bf11fc782cc6-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"a63d3545-a64d-4c9a-9198-bf11fc782cc6\") " pod="openstack/rabbitmq-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.059095 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a63d3545-a64d-4c9a-9198-bf11fc782cc6-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"a63d3545-a64d-4c9a-9198-bf11fc782cc6\") " pod="openstack/rabbitmq-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.062101 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l5w7\" (UniqueName: \"kubernetes.io/projected/a63d3545-a64d-4c9a-9198-bf11fc782cc6-kube-api-access-5l5w7\") pod \"rabbitmq-server-0\" (UID: \"a63d3545-a64d-4c9a-9198-bf11fc782cc6\") " pod="openstack/rabbitmq-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.069817 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: 
\"a63d3545-a64d-4c9a-9198-bf11fc782cc6\") " pod="openstack/rabbitmq-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.113109 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.115107 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.116812 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.117854 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.117876 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-xqglp" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.118350 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.118489 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.118619 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.119447 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.129632 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.142890 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.247782 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzdxt\" (UniqueName: \"kubernetes.io/projected/b4f648df-77f7-4480-8b46-3f776880db17-kube-api-access-qzdxt\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4f648df-77f7-4480-8b46-3f776880db17\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.247826 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b4f648df-77f7-4480-8b46-3f776880db17-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4f648df-77f7-4480-8b46-3f776880db17\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.247869 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b4f648df-77f7-4480-8b46-3f776880db17-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4f648df-77f7-4480-8b46-3f776880db17\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.247889 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b4f648df-77f7-4480-8b46-3f776880db17-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4f648df-77f7-4480-8b46-3f776880db17\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.247942 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b4f648df-77f7-4480-8b46-3f776880db17-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"b4f648df-77f7-4480-8b46-3f776880db17\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.248000 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b4f648df-77f7-4480-8b46-3f776880db17-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4f648df-77f7-4480-8b46-3f776880db17\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.248039 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b4f648df-77f7-4480-8b46-3f776880db17-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4f648df-77f7-4480-8b46-3f776880db17\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.248058 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b4f648df-77f7-4480-8b46-3f776880db17-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4f648df-77f7-4480-8b46-3f776880db17\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.248097 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b4f648df-77f7-4480-8b46-3f776880db17-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4f648df-77f7-4480-8b46-3f776880db17\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.248152 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b4f648df-77f7-4480-8b46-3f776880db17-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"b4f648df-77f7-4480-8b46-3f776880db17\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.248185 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4f648df-77f7-4480-8b46-3f776880db17\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.349826 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzdxt\" (UniqueName: \"kubernetes.io/projected/b4f648df-77f7-4480-8b46-3f776880db17-kube-api-access-qzdxt\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4f648df-77f7-4480-8b46-3f776880db17\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.349872 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b4f648df-77f7-4480-8b46-3f776880db17-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4f648df-77f7-4480-8b46-3f776880db17\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.349895 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b4f648df-77f7-4480-8b46-3f776880db17-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4f648df-77f7-4480-8b46-3f776880db17\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.349911 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b4f648df-77f7-4480-8b46-3f776880db17-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4f648df-77f7-4480-8b46-3f776880db17\") " pod="openstack/rabbitmq-cell1-server-0" Oct 
08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.349947 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b4f648df-77f7-4480-8b46-3f776880db17-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4f648df-77f7-4480-8b46-3f776880db17\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.349972 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b4f648df-77f7-4480-8b46-3f776880db17-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4f648df-77f7-4480-8b46-3f776880db17\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.349994 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b4f648df-77f7-4480-8b46-3f776880db17-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4f648df-77f7-4480-8b46-3f776880db17\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.350013 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b4f648df-77f7-4480-8b46-3f776880db17-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4f648df-77f7-4480-8b46-3f776880db17\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.350039 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b4f648df-77f7-4480-8b46-3f776880db17-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4f648df-77f7-4480-8b46-3f776880db17\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.350064 4669 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b4f648df-77f7-4480-8b46-3f776880db17-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4f648df-77f7-4480-8b46-3f776880db17\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.350082 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4f648df-77f7-4480-8b46-3f776880db17\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.351089 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b4f648df-77f7-4480-8b46-3f776880db17-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4f648df-77f7-4480-8b46-3f776880db17\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.350312 4669 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4f648df-77f7-4480-8b46-3f776880db17\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-cell1-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.351556 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b4f648df-77f7-4480-8b46-3f776880db17-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4f648df-77f7-4480-8b46-3f776880db17\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.351816 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/b4f648df-77f7-4480-8b46-3f776880db17-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4f648df-77f7-4480-8b46-3f776880db17\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.352365 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b4f648df-77f7-4480-8b46-3f776880db17-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4f648df-77f7-4480-8b46-3f776880db17\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.353088 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b4f648df-77f7-4480-8b46-3f776880db17-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4f648df-77f7-4480-8b46-3f776880db17\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.357088 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b4f648df-77f7-4480-8b46-3f776880db17-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4f648df-77f7-4480-8b46-3f776880db17\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.360596 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b4f648df-77f7-4480-8b46-3f776880db17-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4f648df-77f7-4480-8b46-3f776880db17\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.363407 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b4f648df-77f7-4480-8b46-3f776880db17-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"b4f648df-77f7-4480-8b46-3f776880db17\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.363901 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b4f648df-77f7-4480-8b46-3f776880db17-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4f648df-77f7-4480-8b46-3f776880db17\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.368044 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzdxt\" (UniqueName: \"kubernetes.io/projected/b4f648df-77f7-4480-8b46-3f776880db17-kube-api-access-qzdxt\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4f648df-77f7-4480-8b46-3f776880db17\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.377439 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4f648df-77f7-4480-8b46-3f776880db17\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 20:59:08 crc kubenswrapper[4669]: I1008 20:59:08.454089 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 08 20:59:10 crc kubenswrapper[4669]: I1008 20:59:10.785317 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 08 20:59:10 crc kubenswrapper[4669]: I1008 20:59:10.786718 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 08 20:59:10 crc kubenswrapper[4669]: I1008 20:59:10.798542 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 08 20:59:10 crc kubenswrapper[4669]: I1008 20:59:10.798618 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 08 20:59:10 crc kubenswrapper[4669]: I1008 20:59:10.798798 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 08 20:59:10 crc kubenswrapper[4669]: I1008 20:59:10.804365 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-98kmw" Oct 08 20:59:10 crc kubenswrapper[4669]: I1008 20:59:10.804682 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 08 20:59:10 crc kubenswrapper[4669]: I1008 20:59:10.810080 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 08 20:59:10 crc kubenswrapper[4669]: I1008 20:59:10.812652 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 08 20:59:10 crc kubenswrapper[4669]: I1008 20:59:10.901375 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33dac706-1170-45a3-8151-a6ee9bce8005-operator-scripts\") pod \"openstack-galera-0\" (UID: \"33dac706-1170-45a3-8151-a6ee9bce8005\") " pod="openstack/openstack-galera-0" Oct 08 20:59:10 crc kubenswrapper[4669]: I1008 20:59:10.901512 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/33dac706-1170-45a3-8151-a6ee9bce8005-secrets\") pod \"openstack-galera-0\" (UID: \"33dac706-1170-45a3-8151-a6ee9bce8005\") " pod="openstack/openstack-galera-0" Oct 08 20:59:10 crc 
kubenswrapper[4669]: I1008 20:59:10.901572 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33dac706-1170-45a3-8151-a6ee9bce8005-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"33dac706-1170-45a3-8151-a6ee9bce8005\") " pod="openstack/openstack-galera-0" Oct 08 20:59:10 crc kubenswrapper[4669]: I1008 20:59:10.901635 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"33dac706-1170-45a3-8151-a6ee9bce8005\") " pod="openstack/openstack-galera-0" Oct 08 20:59:10 crc kubenswrapper[4669]: I1008 20:59:10.901662 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgmlk\" (UniqueName: \"kubernetes.io/projected/33dac706-1170-45a3-8151-a6ee9bce8005-kube-api-access-bgmlk\") pod \"openstack-galera-0\" (UID: \"33dac706-1170-45a3-8151-a6ee9bce8005\") " pod="openstack/openstack-galera-0" Oct 08 20:59:10 crc kubenswrapper[4669]: I1008 20:59:10.901753 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/33dac706-1170-45a3-8151-a6ee9bce8005-config-data-default\") pod \"openstack-galera-0\" (UID: \"33dac706-1170-45a3-8151-a6ee9bce8005\") " pod="openstack/openstack-galera-0" Oct 08 20:59:10 crc kubenswrapper[4669]: I1008 20:59:10.901784 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/33dac706-1170-45a3-8151-a6ee9bce8005-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"33dac706-1170-45a3-8151-a6ee9bce8005\") " pod="openstack/openstack-galera-0" Oct 08 20:59:10 crc kubenswrapper[4669]: I1008 20:59:10.901974 4669 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/33dac706-1170-45a3-8151-a6ee9bce8005-config-data-generated\") pod \"openstack-galera-0\" (UID: \"33dac706-1170-45a3-8151-a6ee9bce8005\") " pod="openstack/openstack-galera-0" Oct 08 20:59:10 crc kubenswrapper[4669]: I1008 20:59:10.902057 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/33dac706-1170-45a3-8151-a6ee9bce8005-kolla-config\") pod \"openstack-galera-0\" (UID: \"33dac706-1170-45a3-8151-a6ee9bce8005\") " pod="openstack/openstack-galera-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.003474 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/33dac706-1170-45a3-8151-a6ee9bce8005-secrets\") pod \"openstack-galera-0\" (UID: \"33dac706-1170-45a3-8151-a6ee9bce8005\") " pod="openstack/openstack-galera-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.003629 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33dac706-1170-45a3-8151-a6ee9bce8005-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"33dac706-1170-45a3-8151-a6ee9bce8005\") " pod="openstack/openstack-galera-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.003685 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"33dac706-1170-45a3-8151-a6ee9bce8005\") " pod="openstack/openstack-galera-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.003721 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgmlk\" (UniqueName: 
\"kubernetes.io/projected/33dac706-1170-45a3-8151-a6ee9bce8005-kube-api-access-bgmlk\") pod \"openstack-galera-0\" (UID: \"33dac706-1170-45a3-8151-a6ee9bce8005\") " pod="openstack/openstack-galera-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.003775 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/33dac706-1170-45a3-8151-a6ee9bce8005-config-data-default\") pod \"openstack-galera-0\" (UID: \"33dac706-1170-45a3-8151-a6ee9bce8005\") " pod="openstack/openstack-galera-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.003813 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/33dac706-1170-45a3-8151-a6ee9bce8005-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"33dac706-1170-45a3-8151-a6ee9bce8005\") " pod="openstack/openstack-galera-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.003870 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/33dac706-1170-45a3-8151-a6ee9bce8005-config-data-generated\") pod \"openstack-galera-0\" (UID: \"33dac706-1170-45a3-8151-a6ee9bce8005\") " pod="openstack/openstack-galera-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.003915 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/33dac706-1170-45a3-8151-a6ee9bce8005-kolla-config\") pod \"openstack-galera-0\" (UID: \"33dac706-1170-45a3-8151-a6ee9bce8005\") " pod="openstack/openstack-galera-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.003987 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33dac706-1170-45a3-8151-a6ee9bce8005-operator-scripts\") pod \"openstack-galera-0\" (UID: 
\"33dac706-1170-45a3-8151-a6ee9bce8005\") " pod="openstack/openstack-galera-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.004155 4669 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"33dac706-1170-45a3-8151-a6ee9bce8005\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/openstack-galera-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.004479 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/33dac706-1170-45a3-8151-a6ee9bce8005-config-data-generated\") pod \"openstack-galera-0\" (UID: \"33dac706-1170-45a3-8151-a6ee9bce8005\") " pod="openstack/openstack-galera-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.005235 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/33dac706-1170-45a3-8151-a6ee9bce8005-kolla-config\") pod \"openstack-galera-0\" (UID: \"33dac706-1170-45a3-8151-a6ee9bce8005\") " pod="openstack/openstack-galera-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.005576 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/33dac706-1170-45a3-8151-a6ee9bce8005-config-data-default\") pod \"openstack-galera-0\" (UID: \"33dac706-1170-45a3-8151-a6ee9bce8005\") " pod="openstack/openstack-galera-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.005927 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33dac706-1170-45a3-8151-a6ee9bce8005-operator-scripts\") pod \"openstack-galera-0\" (UID: \"33dac706-1170-45a3-8151-a6ee9bce8005\") " pod="openstack/openstack-galera-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.007906 4669 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/33dac706-1170-45a3-8151-a6ee9bce8005-secrets\") pod \"openstack-galera-0\" (UID: \"33dac706-1170-45a3-8151-a6ee9bce8005\") " pod="openstack/openstack-galera-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.009343 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/33dac706-1170-45a3-8151-a6ee9bce8005-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"33dac706-1170-45a3-8151-a6ee9bce8005\") " pod="openstack/openstack-galera-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.014862 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33dac706-1170-45a3-8151-a6ee9bce8005-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"33dac706-1170-45a3-8151-a6ee9bce8005\") " pod="openstack/openstack-galera-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.022519 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgmlk\" (UniqueName: \"kubernetes.io/projected/33dac706-1170-45a3-8151-a6ee9bce8005-kube-api-access-bgmlk\") pod \"openstack-galera-0\" (UID: \"33dac706-1170-45a3-8151-a6ee9bce8005\") " pod="openstack/openstack-galera-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.031897 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"33dac706-1170-45a3-8151-a6ee9bce8005\") " pod="openstack/openstack-galera-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.116102 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.125732 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.127513 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.133620 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-qdvj5" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.133924 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.134211 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.134421 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.135296 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.206577 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/53fa562b-eb62-487e-8b82-3da0799fae19-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"53fa562b-eb62-487e-8b82-3da0799fae19\") " pod="openstack/openstack-cell1-galera-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.206628 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/53fa562b-eb62-487e-8b82-3da0799fae19-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: 
\"53fa562b-eb62-487e-8b82-3da0799fae19\") " pod="openstack/openstack-cell1-galera-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.206687 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/53fa562b-eb62-487e-8b82-3da0799fae19-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"53fa562b-eb62-487e-8b82-3da0799fae19\") " pod="openstack/openstack-cell1-galera-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.206760 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/53fa562b-eb62-487e-8b82-3da0799fae19-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"53fa562b-eb62-487e-8b82-3da0799fae19\") " pod="openstack/openstack-cell1-galera-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.206791 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53fa562b-eb62-487e-8b82-3da0799fae19-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"53fa562b-eb62-487e-8b82-3da0799fae19\") " pod="openstack/openstack-cell1-galera-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.206858 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/53fa562b-eb62-487e-8b82-3da0799fae19-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"53fa562b-eb62-487e-8b82-3da0799fae19\") " pod="openstack/openstack-cell1-galera-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.206974 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zztk\" (UniqueName: \"kubernetes.io/projected/53fa562b-eb62-487e-8b82-3da0799fae19-kube-api-access-2zztk\") pod 
\"openstack-cell1-galera-0\" (UID: \"53fa562b-eb62-487e-8b82-3da0799fae19\") " pod="openstack/openstack-cell1-galera-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.207006 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"53fa562b-eb62-487e-8b82-3da0799fae19\") " pod="openstack/openstack-cell1-galera-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.207028 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53fa562b-eb62-487e-8b82-3da0799fae19-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"53fa562b-eb62-487e-8b82-3da0799fae19\") " pod="openstack/openstack-cell1-galera-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.308475 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53fa562b-eb62-487e-8b82-3da0799fae19-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"53fa562b-eb62-487e-8b82-3da0799fae19\") " pod="openstack/openstack-cell1-galera-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.308551 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/53fa562b-eb62-487e-8b82-3da0799fae19-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"53fa562b-eb62-487e-8b82-3da0799fae19\") " pod="openstack/openstack-cell1-galera-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.308613 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zztk\" (UniqueName: \"kubernetes.io/projected/53fa562b-eb62-487e-8b82-3da0799fae19-kube-api-access-2zztk\") pod \"openstack-cell1-galera-0\" (UID: 
\"53fa562b-eb62-487e-8b82-3da0799fae19\") " pod="openstack/openstack-cell1-galera-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.308650 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"53fa562b-eb62-487e-8b82-3da0799fae19\") " pod="openstack/openstack-cell1-galera-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.308668 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53fa562b-eb62-487e-8b82-3da0799fae19-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"53fa562b-eb62-487e-8b82-3da0799fae19\") " pod="openstack/openstack-cell1-galera-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.308716 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/53fa562b-eb62-487e-8b82-3da0799fae19-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"53fa562b-eb62-487e-8b82-3da0799fae19\") " pod="openstack/openstack-cell1-galera-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.308732 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/53fa562b-eb62-487e-8b82-3da0799fae19-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"53fa562b-eb62-487e-8b82-3da0799fae19\") " pod="openstack/openstack-cell1-galera-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.308762 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/53fa562b-eb62-487e-8b82-3da0799fae19-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"53fa562b-eb62-487e-8b82-3da0799fae19\") " pod="openstack/openstack-cell1-galera-0" Oct 08 20:59:11 
crc kubenswrapper[4669]: I1008 20:59:11.308782 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/53fa562b-eb62-487e-8b82-3da0799fae19-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"53fa562b-eb62-487e-8b82-3da0799fae19\") " pod="openstack/openstack-cell1-galera-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.309564 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/53fa562b-eb62-487e-8b82-3da0799fae19-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"53fa562b-eb62-487e-8b82-3da0799fae19\") " pod="openstack/openstack-cell1-galera-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.309755 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/53fa562b-eb62-487e-8b82-3da0799fae19-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"53fa562b-eb62-487e-8b82-3da0799fae19\") " pod="openstack/openstack-cell1-galera-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.309806 4669 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"53fa562b-eb62-487e-8b82-3da0799fae19\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/openstack-cell1-galera-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.310616 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/53fa562b-eb62-487e-8b82-3da0799fae19-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"53fa562b-eb62-487e-8b82-3da0799fae19\") " pod="openstack/openstack-cell1-galera-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.310700 4669 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53fa562b-eb62-487e-8b82-3da0799fae19-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"53fa562b-eb62-487e-8b82-3da0799fae19\") " pod="openstack/openstack-cell1-galera-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.312110 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/53fa562b-eb62-487e-8b82-3da0799fae19-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"53fa562b-eb62-487e-8b82-3da0799fae19\") " pod="openstack/openstack-cell1-galera-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.318928 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/53fa562b-eb62-487e-8b82-3da0799fae19-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"53fa562b-eb62-487e-8b82-3da0799fae19\") " pod="openstack/openstack-cell1-galera-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.326780 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zztk\" (UniqueName: \"kubernetes.io/projected/53fa562b-eb62-487e-8b82-3da0799fae19-kube-api-access-2zztk\") pod \"openstack-cell1-galera-0\" (UID: \"53fa562b-eb62-487e-8b82-3da0799fae19\") " pod="openstack/openstack-cell1-galera-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.326938 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53fa562b-eb62-487e-8b82-3da0799fae19-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"53fa562b-eb62-487e-8b82-3da0799fae19\") " pod="openstack/openstack-cell1-galera-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.337814 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod 
\"openstack-cell1-galera-0\" (UID: \"53fa562b-eb62-487e-8b82-3da0799fae19\") " pod="openstack/openstack-cell1-galera-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.448971 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.635656 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.636866 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.638976 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.639132 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-bx6zb" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.639204 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.646478 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.714551 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb42938c-1da6-4d93-b7be-ff78d294ebf1-memcached-tls-certs\") pod \"memcached-0\" (UID: \"cb42938c-1da6-4d93-b7be-ff78d294ebf1\") " pod="openstack/memcached-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.714629 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cb42938c-1da6-4d93-b7be-ff78d294ebf1-kolla-config\") pod \"memcached-0\" (UID: \"cb42938c-1da6-4d93-b7be-ff78d294ebf1\") " 
pod="openstack/memcached-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.714721 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb42938c-1da6-4d93-b7be-ff78d294ebf1-combined-ca-bundle\") pod \"memcached-0\" (UID: \"cb42938c-1da6-4d93-b7be-ff78d294ebf1\") " pod="openstack/memcached-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.714786 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dql4q\" (UniqueName: \"kubernetes.io/projected/cb42938c-1da6-4d93-b7be-ff78d294ebf1-kube-api-access-dql4q\") pod \"memcached-0\" (UID: \"cb42938c-1da6-4d93-b7be-ff78d294ebf1\") " pod="openstack/memcached-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.714842 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cb42938c-1da6-4d93-b7be-ff78d294ebf1-config-data\") pod \"memcached-0\" (UID: \"cb42938c-1da6-4d93-b7be-ff78d294ebf1\") " pod="openstack/memcached-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.816327 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb42938c-1da6-4d93-b7be-ff78d294ebf1-memcached-tls-certs\") pod \"memcached-0\" (UID: \"cb42938c-1da6-4d93-b7be-ff78d294ebf1\") " pod="openstack/memcached-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.816435 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cb42938c-1da6-4d93-b7be-ff78d294ebf1-kolla-config\") pod \"memcached-0\" (UID: \"cb42938c-1da6-4d93-b7be-ff78d294ebf1\") " pod="openstack/memcached-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.816474 4669 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb42938c-1da6-4d93-b7be-ff78d294ebf1-combined-ca-bundle\") pod \"memcached-0\" (UID: \"cb42938c-1da6-4d93-b7be-ff78d294ebf1\") " pod="openstack/memcached-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.816743 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dql4q\" (UniqueName: \"kubernetes.io/projected/cb42938c-1da6-4d93-b7be-ff78d294ebf1-kube-api-access-dql4q\") pod \"memcached-0\" (UID: \"cb42938c-1da6-4d93-b7be-ff78d294ebf1\") " pod="openstack/memcached-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.816834 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cb42938c-1da6-4d93-b7be-ff78d294ebf1-config-data\") pod \"memcached-0\" (UID: \"cb42938c-1da6-4d93-b7be-ff78d294ebf1\") " pod="openstack/memcached-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.817757 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cb42938c-1da6-4d93-b7be-ff78d294ebf1-config-data\") pod \"memcached-0\" (UID: \"cb42938c-1da6-4d93-b7be-ff78d294ebf1\") " pod="openstack/memcached-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.818372 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cb42938c-1da6-4d93-b7be-ff78d294ebf1-kolla-config\") pod \"memcached-0\" (UID: \"cb42938c-1da6-4d93-b7be-ff78d294ebf1\") " pod="openstack/memcached-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.821193 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb42938c-1da6-4d93-b7be-ff78d294ebf1-memcached-tls-certs\") pod \"memcached-0\" (UID: \"cb42938c-1da6-4d93-b7be-ff78d294ebf1\") " 
pod="openstack/memcached-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.827147 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb42938c-1da6-4d93-b7be-ff78d294ebf1-combined-ca-bundle\") pod \"memcached-0\" (UID: \"cb42938c-1da6-4d93-b7be-ff78d294ebf1\") " pod="openstack/memcached-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.836288 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dql4q\" (UniqueName: \"kubernetes.io/projected/cb42938c-1da6-4d93-b7be-ff78d294ebf1-kube-api-access-dql4q\") pod \"memcached-0\" (UID: \"cb42938c-1da6-4d93-b7be-ff78d294ebf1\") " pod="openstack/memcached-0" Oct 08 20:59:11 crc kubenswrapper[4669]: I1008 20:59:11.956341 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 08 20:59:13 crc kubenswrapper[4669]: I1008 20:59:13.185996 4669 patch_prober.go:28] interesting pod/machine-config-daemon-hw2kf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 20:59:13 crc kubenswrapper[4669]: I1008 20:59:13.186655 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 20:59:13 crc kubenswrapper[4669]: I1008 20:59:13.231880 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 08 20:59:13 crc kubenswrapper[4669]: I1008 20:59:13.396280 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 20:59:13 crc 
kubenswrapper[4669]: I1008 20:59:13.397302 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 08 20:59:13 crc kubenswrapper[4669]: I1008 20:59:13.400281 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-cc789" Oct 08 20:59:13 crc kubenswrapper[4669]: I1008 20:59:13.409052 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 20:59:13 crc kubenswrapper[4669]: I1008 20:59:13.448229 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krzg5\" (UniqueName: \"kubernetes.io/projected/1e47bce4-587b-4864-86ea-a1e2a7987779-kube-api-access-krzg5\") pod \"kube-state-metrics-0\" (UID: \"1e47bce4-587b-4864-86ea-a1e2a7987779\") " pod="openstack/kube-state-metrics-0" Oct 08 20:59:13 crc kubenswrapper[4669]: I1008 20:59:13.549327 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krzg5\" (UniqueName: \"kubernetes.io/projected/1e47bce4-587b-4864-86ea-a1e2a7987779-kube-api-access-krzg5\") pod \"kube-state-metrics-0\" (UID: \"1e47bce4-587b-4864-86ea-a1e2a7987779\") " pod="openstack/kube-state-metrics-0" Oct 08 20:59:13 crc kubenswrapper[4669]: I1008 20:59:13.573375 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krzg5\" (UniqueName: \"kubernetes.io/projected/1e47bce4-587b-4864-86ea-a1e2a7987779-kube-api-access-krzg5\") pod \"kube-state-metrics-0\" (UID: \"1e47bce4-587b-4864-86ea-a1e2a7987779\") " pod="openstack/kube-state-metrics-0" Oct 08 20:59:13 crc kubenswrapper[4669]: I1008 20:59:13.725463 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 08 20:59:16 crc kubenswrapper[4669]: I1008 20:59:16.985214 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-2mvd2"] Oct 08 20:59:16 crc kubenswrapper[4669]: I1008 20:59:16.986352 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-2mvd2" Oct 08 20:59:16 crc kubenswrapper[4669]: I1008 20:59:16.987944 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 08 20:59:16 crc kubenswrapper[4669]: I1008 20:59:16.988639 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-ds7wn" Oct 08 20:59:16 crc kubenswrapper[4669]: I1008 20:59:16.988855 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 08 20:59:16 crc kubenswrapper[4669]: I1008 20:59:16.993423 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-wnkk4"] Oct 08 20:59:16 crc kubenswrapper[4669]: I1008 20:59:16.994935 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-wnkk4" Oct 08 20:59:17 crc kubenswrapper[4669]: I1008 20:59:17.011731 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-2mvd2"] Oct 08 20:59:17 crc kubenswrapper[4669]: I1008 20:59:17.028608 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-wnkk4"] Oct 08 20:59:17 crc kubenswrapper[4669]: I1008 20:59:17.104922 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/d1b7a290-4c6b-48ff-b23b-7fe2ba609dcc-var-log\") pod \"ovn-controller-ovs-wnkk4\" (UID: \"d1b7a290-4c6b-48ff-b23b-7fe2ba609dcc\") " pod="openstack/ovn-controller-ovs-wnkk4" Oct 08 20:59:17 crc kubenswrapper[4669]: I1008 20:59:17.104965 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk78f\" (UniqueName: \"kubernetes.io/projected/2e96f6f2-b5f3-49e7-8d84-15d5535963a2-kube-api-access-zk78f\") pod \"ovn-controller-2mvd2\" (UID: \"2e96f6f2-b5f3-49e7-8d84-15d5535963a2\") " pod="openstack/ovn-controller-2mvd2" Oct 08 20:59:17 crc kubenswrapper[4669]: I1008 20:59:17.104999 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e96f6f2-b5f3-49e7-8d84-15d5535963a2-ovn-controller-tls-certs\") pod \"ovn-controller-2mvd2\" (UID: \"2e96f6f2-b5f3-49e7-8d84-15d5535963a2\") " pod="openstack/ovn-controller-2mvd2" Oct 08 20:59:17 crc kubenswrapper[4669]: I1008 20:59:17.105013 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e96f6f2-b5f3-49e7-8d84-15d5535963a2-combined-ca-bundle\") pod \"ovn-controller-2mvd2\" (UID: \"2e96f6f2-b5f3-49e7-8d84-15d5535963a2\") " pod="openstack/ovn-controller-2mvd2" Oct 08 20:59:17 crc 
kubenswrapper[4669]: I1008 20:59:17.105185 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2e96f6f2-b5f3-49e7-8d84-15d5535963a2-var-run-ovn\") pod \"ovn-controller-2mvd2\" (UID: \"2e96f6f2-b5f3-49e7-8d84-15d5535963a2\") " pod="openstack/ovn-controller-2mvd2" Oct 08 20:59:17 crc kubenswrapper[4669]: I1008 20:59:17.105266 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/d1b7a290-4c6b-48ff-b23b-7fe2ba609dcc-etc-ovs\") pod \"ovn-controller-ovs-wnkk4\" (UID: \"d1b7a290-4c6b-48ff-b23b-7fe2ba609dcc\") " pod="openstack/ovn-controller-ovs-wnkk4" Oct 08 20:59:17 crc kubenswrapper[4669]: I1008 20:59:17.105290 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2e96f6f2-b5f3-49e7-8d84-15d5535963a2-var-log-ovn\") pod \"ovn-controller-2mvd2\" (UID: \"2e96f6f2-b5f3-49e7-8d84-15d5535963a2\") " pod="openstack/ovn-controller-2mvd2" Oct 08 20:59:17 crc kubenswrapper[4669]: I1008 20:59:17.105329 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2e96f6f2-b5f3-49e7-8d84-15d5535963a2-scripts\") pod \"ovn-controller-2mvd2\" (UID: \"2e96f6f2-b5f3-49e7-8d84-15d5535963a2\") " pod="openstack/ovn-controller-2mvd2" Oct 08 20:59:17 crc kubenswrapper[4669]: I1008 20:59:17.105390 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/d1b7a290-4c6b-48ff-b23b-7fe2ba609dcc-var-lib\") pod \"ovn-controller-ovs-wnkk4\" (UID: \"d1b7a290-4c6b-48ff-b23b-7fe2ba609dcc\") " pod="openstack/ovn-controller-ovs-wnkk4" Oct 08 20:59:17 crc kubenswrapper[4669]: I1008 20:59:17.105470 4669 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d1b7a290-4c6b-48ff-b23b-7fe2ba609dcc-scripts\") pod \"ovn-controller-ovs-wnkk4\" (UID: \"d1b7a290-4c6b-48ff-b23b-7fe2ba609dcc\") " pod="openstack/ovn-controller-ovs-wnkk4" Oct 08 20:59:17 crc kubenswrapper[4669]: I1008 20:59:17.105489 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2e96f6f2-b5f3-49e7-8d84-15d5535963a2-var-run\") pod \"ovn-controller-2mvd2\" (UID: \"2e96f6f2-b5f3-49e7-8d84-15d5535963a2\") " pod="openstack/ovn-controller-2mvd2" Oct 08 20:59:17 crc kubenswrapper[4669]: I1008 20:59:17.105517 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d1b7a290-4c6b-48ff-b23b-7fe2ba609dcc-var-run\") pod \"ovn-controller-ovs-wnkk4\" (UID: \"d1b7a290-4c6b-48ff-b23b-7fe2ba609dcc\") " pod="openstack/ovn-controller-ovs-wnkk4" Oct 08 20:59:17 crc kubenswrapper[4669]: I1008 20:59:17.105585 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgzhh\" (UniqueName: \"kubernetes.io/projected/d1b7a290-4c6b-48ff-b23b-7fe2ba609dcc-kube-api-access-dgzhh\") pod \"ovn-controller-ovs-wnkk4\" (UID: \"d1b7a290-4c6b-48ff-b23b-7fe2ba609dcc\") " pod="openstack/ovn-controller-ovs-wnkk4" Oct 08 20:59:17 crc kubenswrapper[4669]: I1008 20:59:17.206842 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d1b7a290-4c6b-48ff-b23b-7fe2ba609dcc-scripts\") pod \"ovn-controller-ovs-wnkk4\" (UID: \"d1b7a290-4c6b-48ff-b23b-7fe2ba609dcc\") " pod="openstack/ovn-controller-ovs-wnkk4" Oct 08 20:59:17 crc kubenswrapper[4669]: I1008 20:59:17.209428 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" 
(UniqueName: \"kubernetes.io/host-path/2e96f6f2-b5f3-49e7-8d84-15d5535963a2-var-run\") pod \"ovn-controller-2mvd2\" (UID: \"2e96f6f2-b5f3-49e7-8d84-15d5535963a2\") " pod="openstack/ovn-controller-2mvd2" Oct 08 20:59:17 crc kubenswrapper[4669]: I1008 20:59:17.209462 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d1b7a290-4c6b-48ff-b23b-7fe2ba609dcc-var-run\") pod \"ovn-controller-ovs-wnkk4\" (UID: \"d1b7a290-4c6b-48ff-b23b-7fe2ba609dcc\") " pod="openstack/ovn-controller-ovs-wnkk4" Oct 08 20:59:17 crc kubenswrapper[4669]: I1008 20:59:17.209498 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgzhh\" (UniqueName: \"kubernetes.io/projected/d1b7a290-4c6b-48ff-b23b-7fe2ba609dcc-kube-api-access-dgzhh\") pod \"ovn-controller-ovs-wnkk4\" (UID: \"d1b7a290-4c6b-48ff-b23b-7fe2ba609dcc\") " pod="openstack/ovn-controller-ovs-wnkk4" Oct 08 20:59:17 crc kubenswrapper[4669]: I1008 20:59:17.209548 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/d1b7a290-4c6b-48ff-b23b-7fe2ba609dcc-var-log\") pod \"ovn-controller-ovs-wnkk4\" (UID: \"d1b7a290-4c6b-48ff-b23b-7fe2ba609dcc\") " pod="openstack/ovn-controller-ovs-wnkk4" Oct 08 20:59:17 crc kubenswrapper[4669]: I1008 20:59:17.209562 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk78f\" (UniqueName: \"kubernetes.io/projected/2e96f6f2-b5f3-49e7-8d84-15d5535963a2-kube-api-access-zk78f\") pod \"ovn-controller-2mvd2\" (UID: \"2e96f6f2-b5f3-49e7-8d84-15d5535963a2\") " pod="openstack/ovn-controller-2mvd2" Oct 08 20:59:17 crc kubenswrapper[4669]: I1008 20:59:17.209590 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e96f6f2-b5f3-49e7-8d84-15d5535963a2-ovn-controller-tls-certs\") pod 
\"ovn-controller-2mvd2\" (UID: \"2e96f6f2-b5f3-49e7-8d84-15d5535963a2\") " pod="openstack/ovn-controller-2mvd2" Oct 08 20:59:17 crc kubenswrapper[4669]: I1008 20:59:17.209606 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e96f6f2-b5f3-49e7-8d84-15d5535963a2-combined-ca-bundle\") pod \"ovn-controller-2mvd2\" (UID: \"2e96f6f2-b5f3-49e7-8d84-15d5535963a2\") " pod="openstack/ovn-controller-2mvd2" Oct 08 20:59:17 crc kubenswrapper[4669]: I1008 20:59:17.209625 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2e96f6f2-b5f3-49e7-8d84-15d5535963a2-var-run-ovn\") pod \"ovn-controller-2mvd2\" (UID: \"2e96f6f2-b5f3-49e7-8d84-15d5535963a2\") " pod="openstack/ovn-controller-2mvd2" Oct 08 20:59:17 crc kubenswrapper[4669]: I1008 20:59:17.209655 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/d1b7a290-4c6b-48ff-b23b-7fe2ba609dcc-etc-ovs\") pod \"ovn-controller-ovs-wnkk4\" (UID: \"d1b7a290-4c6b-48ff-b23b-7fe2ba609dcc\") " pod="openstack/ovn-controller-ovs-wnkk4" Oct 08 20:59:17 crc kubenswrapper[4669]: I1008 20:59:17.209671 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2e96f6f2-b5f3-49e7-8d84-15d5535963a2-var-log-ovn\") pod \"ovn-controller-2mvd2\" (UID: \"2e96f6f2-b5f3-49e7-8d84-15d5535963a2\") " pod="openstack/ovn-controller-2mvd2" Oct 08 20:59:17 crc kubenswrapper[4669]: I1008 20:59:17.209695 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2e96f6f2-b5f3-49e7-8d84-15d5535963a2-scripts\") pod \"ovn-controller-2mvd2\" (UID: \"2e96f6f2-b5f3-49e7-8d84-15d5535963a2\") " pod="openstack/ovn-controller-2mvd2" Oct 08 20:59:17 crc kubenswrapper[4669]: I1008 
20:59:17.209727 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/d1b7a290-4c6b-48ff-b23b-7fe2ba609dcc-var-lib\") pod \"ovn-controller-ovs-wnkk4\" (UID: \"d1b7a290-4c6b-48ff-b23b-7fe2ba609dcc\") " pod="openstack/ovn-controller-ovs-wnkk4" Oct 08 20:59:17 crc kubenswrapper[4669]: I1008 20:59:17.210142 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/d1b7a290-4c6b-48ff-b23b-7fe2ba609dcc-var-lib\") pod \"ovn-controller-ovs-wnkk4\" (UID: \"d1b7a290-4c6b-48ff-b23b-7fe2ba609dcc\") " pod="openstack/ovn-controller-ovs-wnkk4" Oct 08 20:59:17 crc kubenswrapper[4669]: I1008 20:59:17.209392 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d1b7a290-4c6b-48ff-b23b-7fe2ba609dcc-scripts\") pod \"ovn-controller-ovs-wnkk4\" (UID: \"d1b7a290-4c6b-48ff-b23b-7fe2ba609dcc\") " pod="openstack/ovn-controller-ovs-wnkk4" Oct 08 20:59:17 crc kubenswrapper[4669]: I1008 20:59:17.210274 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2e96f6f2-b5f3-49e7-8d84-15d5535963a2-var-run\") pod \"ovn-controller-2mvd2\" (UID: \"2e96f6f2-b5f3-49e7-8d84-15d5535963a2\") " pod="openstack/ovn-controller-2mvd2" Oct 08 20:59:17 crc kubenswrapper[4669]: I1008 20:59:17.210315 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d1b7a290-4c6b-48ff-b23b-7fe2ba609dcc-var-run\") pod \"ovn-controller-ovs-wnkk4\" (UID: \"d1b7a290-4c6b-48ff-b23b-7fe2ba609dcc\") " pod="openstack/ovn-controller-ovs-wnkk4" Oct 08 20:59:17 crc kubenswrapper[4669]: I1008 20:59:17.210676 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2e96f6f2-b5f3-49e7-8d84-15d5535963a2-var-run-ovn\") pod 
\"ovn-controller-2mvd2\" (UID: \"2e96f6f2-b5f3-49e7-8d84-15d5535963a2\") " pod="openstack/ovn-controller-2mvd2" Oct 08 20:59:17 crc kubenswrapper[4669]: I1008 20:59:17.210821 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/d1b7a290-4c6b-48ff-b23b-7fe2ba609dcc-var-log\") pod \"ovn-controller-ovs-wnkk4\" (UID: \"d1b7a290-4c6b-48ff-b23b-7fe2ba609dcc\") " pod="openstack/ovn-controller-ovs-wnkk4" Oct 08 20:59:17 crc kubenswrapper[4669]: I1008 20:59:17.210949 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2e96f6f2-b5f3-49e7-8d84-15d5535963a2-var-log-ovn\") pod \"ovn-controller-2mvd2\" (UID: \"2e96f6f2-b5f3-49e7-8d84-15d5535963a2\") " pod="openstack/ovn-controller-2mvd2" Oct 08 20:59:17 crc kubenswrapper[4669]: I1008 20:59:17.211665 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/d1b7a290-4c6b-48ff-b23b-7fe2ba609dcc-etc-ovs\") pod \"ovn-controller-ovs-wnkk4\" (UID: \"d1b7a290-4c6b-48ff-b23b-7fe2ba609dcc\") " pod="openstack/ovn-controller-ovs-wnkk4" Oct 08 20:59:17 crc kubenswrapper[4669]: I1008 20:59:17.214312 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2e96f6f2-b5f3-49e7-8d84-15d5535963a2-scripts\") pod \"ovn-controller-2mvd2\" (UID: \"2e96f6f2-b5f3-49e7-8d84-15d5535963a2\") " pod="openstack/ovn-controller-2mvd2" Oct 08 20:59:17 crc kubenswrapper[4669]: I1008 20:59:17.215511 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e96f6f2-b5f3-49e7-8d84-15d5535963a2-ovn-controller-tls-certs\") pod \"ovn-controller-2mvd2\" (UID: \"2e96f6f2-b5f3-49e7-8d84-15d5535963a2\") " pod="openstack/ovn-controller-2mvd2" Oct 08 20:59:17 crc kubenswrapper[4669]: I1008 20:59:17.226772 4669 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e96f6f2-b5f3-49e7-8d84-15d5535963a2-combined-ca-bundle\") pod \"ovn-controller-2mvd2\" (UID: \"2e96f6f2-b5f3-49e7-8d84-15d5535963a2\") " pod="openstack/ovn-controller-2mvd2" Oct 08 20:59:17 crc kubenswrapper[4669]: I1008 20:59:17.231157 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk78f\" (UniqueName: \"kubernetes.io/projected/2e96f6f2-b5f3-49e7-8d84-15d5535963a2-kube-api-access-zk78f\") pod \"ovn-controller-2mvd2\" (UID: \"2e96f6f2-b5f3-49e7-8d84-15d5535963a2\") " pod="openstack/ovn-controller-2mvd2" Oct 08 20:59:17 crc kubenswrapper[4669]: I1008 20:59:17.233540 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgzhh\" (UniqueName: \"kubernetes.io/projected/d1b7a290-4c6b-48ff-b23b-7fe2ba609dcc-kube-api-access-dgzhh\") pod \"ovn-controller-ovs-wnkk4\" (UID: \"d1b7a290-4c6b-48ff-b23b-7fe2ba609dcc\") " pod="openstack/ovn-controller-ovs-wnkk4" Oct 08 20:59:17 crc kubenswrapper[4669]: I1008 20:59:17.315336 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-2mvd2" Oct 08 20:59:17 crc kubenswrapper[4669]: I1008 20:59:17.334456 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-wnkk4" Oct 08 20:59:17 crc kubenswrapper[4669]: I1008 20:59:17.469998 4669 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 20:59:18 crc kubenswrapper[4669]: I1008 20:59:18.006477 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b4f648df-77f7-4480-8b46-3f776880db17","Type":"ContainerStarted","Data":"9f3f60c96a2147ab91100ddcdad902c4ffab6669f24bc9e5d263269918f80e82"} Oct 08 20:59:18 crc kubenswrapper[4669]: E1008 20:59:18.332362 4669 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 08 20:59:18 crc kubenswrapper[4669]: E1008 20:59:18.332565 4669 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pbzn7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-wt5ln_openstack(0ab22cf7-c83f-4a32-abc1-b53715898e21): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 08 20:59:18 crc kubenswrapper[4669]: E1008 20:59:18.333759 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-wt5ln" podUID="0ab22cf7-c83f-4a32-abc1-b53715898e21" Oct 08 20:59:18 crc kubenswrapper[4669]: E1008 20:59:18.352481 4669 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 08 20:59:18 crc kubenswrapper[4669]: E1008 20:59:18.352652 4669 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9fwmc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePul
lPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-jksdh_openstack(9229843b-95c5-4135-8383-7617689bd4ad): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 08 20:59:18 crc kubenswrapper[4669]: E1008 20:59:18.353850 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-jksdh" podUID="9229843b-95c5-4135-8383-7617689bd4ad" Oct 08 20:59:18 crc kubenswrapper[4669]: I1008 20:59:18.740742 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-9lfvd"] Oct 08 20:59:18 crc kubenswrapper[4669]: W1008 20:59:18.743505 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb60a2dd2_7028_42a7_8ef9_38099b6f8817.slice/crio-a316e4dad3c7e2e2bcca472ea76c94486f335e79f605040debcfb59d42306ab5 WatchSource:0}: Error finding container a316e4dad3c7e2e2bcca472ea76c94486f335e79f605040debcfb59d42306ab5: Status 404 returned error can't find the container with id a316e4dad3c7e2e2bcca472ea76c94486f335e79f605040debcfb59d42306ab5 Oct 08 20:59:18 crc kubenswrapper[4669]: I1008 20:59:18.764353 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 08 20:59:18 
crc kubenswrapper[4669]: I1008 20:59:18.770208 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 08 20:59:18 crc kubenswrapper[4669]: I1008 20:59:18.774770 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 08 20:59:18 crc kubenswrapper[4669]: I1008 20:59:18.777585 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 08 20:59:18 crc kubenswrapper[4669]: I1008 20:59:18.777818 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 08 20:59:18 crc kubenswrapper[4669]: I1008 20:59:18.778052 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 08 20:59:18 crc kubenswrapper[4669]: I1008 20:59:18.778175 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-4tngv" Oct 08 20:59:18 crc kubenswrapper[4669]: I1008 20:59:18.782360 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 08 20:59:18 crc kubenswrapper[4669]: I1008 20:59:18.836399 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"07bcdc9f-6970-49af-8620-63f8ed43845b\") " pod="openstack/ovsdbserver-sb-0" Oct 08 20:59:18 crc kubenswrapper[4669]: I1008 20:59:18.836472 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktjkk\" (UniqueName: \"kubernetes.io/projected/07bcdc9f-6970-49af-8620-63f8ed43845b-kube-api-access-ktjkk\") pod \"ovsdbserver-sb-0\" (UID: \"07bcdc9f-6970-49af-8620-63f8ed43845b\") " pod="openstack/ovsdbserver-sb-0" Oct 08 20:59:18 crc kubenswrapper[4669]: I1008 20:59:18.836516 4669 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07bcdc9f-6970-49af-8620-63f8ed43845b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"07bcdc9f-6970-49af-8620-63f8ed43845b\") " pod="openstack/ovsdbserver-sb-0" Oct 08 20:59:18 crc kubenswrapper[4669]: I1008 20:59:18.836585 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/07bcdc9f-6970-49af-8620-63f8ed43845b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"07bcdc9f-6970-49af-8620-63f8ed43845b\") " pod="openstack/ovsdbserver-sb-0" Oct 08 20:59:18 crc kubenswrapper[4669]: I1008 20:59:18.836639 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/07bcdc9f-6970-49af-8620-63f8ed43845b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"07bcdc9f-6970-49af-8620-63f8ed43845b\") " pod="openstack/ovsdbserver-sb-0" Oct 08 20:59:18 crc kubenswrapper[4669]: I1008 20:59:18.836664 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/07bcdc9f-6970-49af-8620-63f8ed43845b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"07bcdc9f-6970-49af-8620-63f8ed43845b\") " pod="openstack/ovsdbserver-sb-0" Oct 08 20:59:18 crc kubenswrapper[4669]: I1008 20:59:18.836716 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/07bcdc9f-6970-49af-8620-63f8ed43845b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"07bcdc9f-6970-49af-8620-63f8ed43845b\") " pod="openstack/ovsdbserver-sb-0" Oct 08 20:59:18 crc kubenswrapper[4669]: I1008 20:59:18.836754 4669 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07bcdc9f-6970-49af-8620-63f8ed43845b-config\") pod \"ovsdbserver-sb-0\" (UID: \"07bcdc9f-6970-49af-8620-63f8ed43845b\") " pod="openstack/ovsdbserver-sb-0" Oct 08 20:59:18 crc kubenswrapper[4669]: I1008 20:59:18.856346 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 20:59:18 crc kubenswrapper[4669]: W1008 20:59:18.856867 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda63d3545_a64d_4c9a_9198_bf11fc782cc6.slice/crio-ea510e3f21b6ae24cc778d1fc614b2526a2beb3039e7dcbadc25581d5d0b5656 WatchSource:0}: Error finding container ea510e3f21b6ae24cc778d1fc614b2526a2beb3039e7dcbadc25581d5d0b5656: Status 404 returned error can't find the container with id ea510e3f21b6ae24cc778d1fc614b2526a2beb3039e7dcbadc25581d5d0b5656 Oct 08 20:59:18 crc kubenswrapper[4669]: I1008 20:59:18.864750 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-nfnp7"] Oct 08 20:59:18 crc kubenswrapper[4669]: I1008 20:59:18.941743 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/07bcdc9f-6970-49af-8620-63f8ed43845b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"07bcdc9f-6970-49af-8620-63f8ed43845b\") " pod="openstack/ovsdbserver-sb-0" Oct 08 20:59:18 crc kubenswrapper[4669]: I1008 20:59:18.941785 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/07bcdc9f-6970-49af-8620-63f8ed43845b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"07bcdc9f-6970-49af-8620-63f8ed43845b\") " pod="openstack/ovsdbserver-sb-0" Oct 08 20:59:18 crc kubenswrapper[4669]: I1008 20:59:18.941828 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/07bcdc9f-6970-49af-8620-63f8ed43845b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"07bcdc9f-6970-49af-8620-63f8ed43845b\") " pod="openstack/ovsdbserver-sb-0" Oct 08 20:59:18 crc kubenswrapper[4669]: I1008 20:59:18.941853 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07bcdc9f-6970-49af-8620-63f8ed43845b-config\") pod \"ovsdbserver-sb-0\" (UID: \"07bcdc9f-6970-49af-8620-63f8ed43845b\") " pod="openstack/ovsdbserver-sb-0" Oct 08 20:59:18 crc kubenswrapper[4669]: I1008 20:59:18.941873 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"07bcdc9f-6970-49af-8620-63f8ed43845b\") " pod="openstack/ovsdbserver-sb-0" Oct 08 20:59:18 crc kubenswrapper[4669]: I1008 20:59:18.941901 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktjkk\" (UniqueName: \"kubernetes.io/projected/07bcdc9f-6970-49af-8620-63f8ed43845b-kube-api-access-ktjkk\") pod \"ovsdbserver-sb-0\" (UID: \"07bcdc9f-6970-49af-8620-63f8ed43845b\") " pod="openstack/ovsdbserver-sb-0" Oct 08 20:59:18 crc kubenswrapper[4669]: I1008 20:59:18.941927 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07bcdc9f-6970-49af-8620-63f8ed43845b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"07bcdc9f-6970-49af-8620-63f8ed43845b\") " pod="openstack/ovsdbserver-sb-0" Oct 08 20:59:18 crc kubenswrapper[4669]: I1008 20:59:18.941961 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/07bcdc9f-6970-49af-8620-63f8ed43845b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: 
\"07bcdc9f-6970-49af-8620-63f8ed43845b\") " pod="openstack/ovsdbserver-sb-0" Oct 08 20:59:18 crc kubenswrapper[4669]: I1008 20:59:18.942402 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/07bcdc9f-6970-49af-8620-63f8ed43845b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"07bcdc9f-6970-49af-8620-63f8ed43845b\") " pod="openstack/ovsdbserver-sb-0" Oct 08 20:59:18 crc kubenswrapper[4669]: I1008 20:59:18.943899 4669 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"07bcdc9f-6970-49af-8620-63f8ed43845b\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/ovsdbserver-sb-0" Oct 08 20:59:18 crc kubenswrapper[4669]: I1008 20:59:18.944049 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07bcdc9f-6970-49af-8620-63f8ed43845b-config\") pod \"ovsdbserver-sb-0\" (UID: \"07bcdc9f-6970-49af-8620-63f8ed43845b\") " pod="openstack/ovsdbserver-sb-0" Oct 08 20:59:18 crc kubenswrapper[4669]: I1008 20:59:18.946120 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 08 20:59:18 crc kubenswrapper[4669]: I1008 20:59:18.946516 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/07bcdc9f-6970-49af-8620-63f8ed43845b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"07bcdc9f-6970-49af-8620-63f8ed43845b\") " pod="openstack/ovsdbserver-sb-0" Oct 08 20:59:18 crc kubenswrapper[4669]: I1008 20:59:18.952782 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/07bcdc9f-6970-49af-8620-63f8ed43845b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"07bcdc9f-6970-49af-8620-63f8ed43845b\") " 
pod="openstack/ovsdbserver-sb-0" Oct 08 20:59:18 crc kubenswrapper[4669]: I1008 20:59:18.953694 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/07bcdc9f-6970-49af-8620-63f8ed43845b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"07bcdc9f-6970-49af-8620-63f8ed43845b\") " pod="openstack/ovsdbserver-sb-0" Oct 08 20:59:18 crc kubenswrapper[4669]: I1008 20:59:18.969274 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07bcdc9f-6970-49af-8620-63f8ed43845b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"07bcdc9f-6970-49af-8620-63f8ed43845b\") " pod="openstack/ovsdbserver-sb-0" Oct 08 20:59:18 crc kubenswrapper[4669]: I1008 20:59:18.980038 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktjkk\" (UniqueName: \"kubernetes.io/projected/07bcdc9f-6970-49af-8620-63f8ed43845b-kube-api-access-ktjkk\") pod \"ovsdbserver-sb-0\" (UID: \"07bcdc9f-6970-49af-8620-63f8ed43845b\") " pod="openstack/ovsdbserver-sb-0" Oct 08 20:59:18 crc kubenswrapper[4669]: I1008 20:59:18.985729 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"07bcdc9f-6970-49af-8620-63f8ed43845b\") " pod="openstack/ovsdbserver-sb-0" Oct 08 20:59:19 crc kubenswrapper[4669]: I1008 20:59:19.027649 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"cb42938c-1da6-4d93-b7be-ff78d294ebf1","Type":"ContainerStarted","Data":"1310939a24df139b68b824b011ff160dad43bce8aa38ff1480b6c1261d32eb07"} Oct 08 20:59:19 crc kubenswrapper[4669]: I1008 20:59:19.029452 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"a63d3545-a64d-4c9a-9198-bf11fc782cc6","Type":"ContainerStarted","Data":"ea510e3f21b6ae24cc778d1fc614b2526a2beb3039e7dcbadc25581d5d0b5656"} Oct 08 20:59:19 crc kubenswrapper[4669]: I1008 20:59:19.030453 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-9lfvd" event={"ID":"b60a2dd2-7028-42a7-8ef9-38099b6f8817","Type":"ContainerStarted","Data":"a316e4dad3c7e2e2bcca472ea76c94486f335e79f605040debcfb59d42306ab5"} Oct 08 20:59:19 crc kubenswrapper[4669]: I1008 20:59:19.031866 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-nfnp7" event={"ID":"f476231d-6e83-47b6-9fa4-4714af712b0c","Type":"ContainerStarted","Data":"ab7e4a1d6e3889140728c8cfb251d30127caf71b83c2427e3c2890b98d3f2970"} Oct 08 20:59:19 crc kubenswrapper[4669]: I1008 20:59:19.050242 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-2mvd2"] Oct 08 20:59:19 crc kubenswrapper[4669]: I1008 20:59:19.061010 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 08 20:59:19 crc kubenswrapper[4669]: I1008 20:59:19.066479 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 20:59:19 crc kubenswrapper[4669]: I1008 20:59:19.071107 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-wnkk4"] Oct 08 20:59:19 crc kubenswrapper[4669]: W1008 20:59:19.075872 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e47bce4_587b_4864_86ea_a1e2a7987779.slice/crio-48ddd5abea97bcb5f4d2c16fdb1a4d9b716a85e6a003098401c763ffd93c92d7 WatchSource:0}: Error finding container 48ddd5abea97bcb5f4d2c16fdb1a4d9b716a85e6a003098401c763ffd93c92d7: Status 404 returned error can't find the container with id 48ddd5abea97bcb5f4d2c16fdb1a4d9b716a85e6a003098401c763ffd93c92d7 Oct 08 20:59:19 crc kubenswrapper[4669]: W1008 
20:59:19.085298 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1b7a290_4c6b_48ff_b23b_7fe2ba609dcc.slice/crio-9e2c30ec628b2799064868124c6be1e4ad7ac65b4a86a9ad0822699c52fc9cdd WatchSource:0}: Error finding container 9e2c30ec628b2799064868124c6be1e4ad7ac65b4a86a9ad0822699c52fc9cdd: Status 404 returned error can't find the container with id 9e2c30ec628b2799064868124c6be1e4ad7ac65b4a86a9ad0822699c52fc9cdd Oct 08 20:59:19 crc kubenswrapper[4669]: I1008 20:59:19.089247 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 08 20:59:19 crc kubenswrapper[4669]: W1008 20:59:19.095627 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53fa562b_eb62_487e_8b82_3da0799fae19.slice/crio-1e5af1e47f7b0c3641e82364f72af8b464739fe38cb4adcbb1059d435fdaf564 WatchSource:0}: Error finding container 1e5af1e47f7b0c3641e82364f72af8b464739fe38cb4adcbb1059d435fdaf564: Status 404 returned error can't find the container with id 1e5af1e47f7b0c3641e82364f72af8b464739fe38cb4adcbb1059d435fdaf564 Oct 08 20:59:19 crc kubenswrapper[4669]: I1008 20:59:19.100451 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 08 20:59:19 crc kubenswrapper[4669]: I1008 20:59:19.384574 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-jksdh" Oct 08 20:59:19 crc kubenswrapper[4669]: I1008 20:59:19.448309 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-wt5ln" Oct 08 20:59:19 crc kubenswrapper[4669]: I1008 20:59:19.448463 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9229843b-95c5-4135-8383-7617689bd4ad-dns-svc\") pod \"9229843b-95c5-4135-8383-7617689bd4ad\" (UID: \"9229843b-95c5-4135-8383-7617689bd4ad\") " Oct 08 20:59:19 crc kubenswrapper[4669]: I1008 20:59:19.448511 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fwmc\" (UniqueName: \"kubernetes.io/projected/9229843b-95c5-4135-8383-7617689bd4ad-kube-api-access-9fwmc\") pod \"9229843b-95c5-4135-8383-7617689bd4ad\" (UID: \"9229843b-95c5-4135-8383-7617689bd4ad\") " Oct 08 20:59:19 crc kubenswrapper[4669]: I1008 20:59:19.448599 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9229843b-95c5-4135-8383-7617689bd4ad-config\") pod \"9229843b-95c5-4135-8383-7617689bd4ad\" (UID: \"9229843b-95c5-4135-8383-7617689bd4ad\") " Oct 08 20:59:19 crc kubenswrapper[4669]: I1008 20:59:19.449160 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9229843b-95c5-4135-8383-7617689bd4ad-config" (OuterVolumeSpecName: "config") pod "9229843b-95c5-4135-8383-7617689bd4ad" (UID: "9229843b-95c5-4135-8383-7617689bd4ad"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:59:19 crc kubenswrapper[4669]: I1008 20:59:19.449167 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9229843b-95c5-4135-8383-7617689bd4ad-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9229843b-95c5-4135-8383-7617689bd4ad" (UID: "9229843b-95c5-4135-8383-7617689bd4ad"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:59:19 crc kubenswrapper[4669]: I1008 20:59:19.453203 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9229843b-95c5-4135-8383-7617689bd4ad-kube-api-access-9fwmc" (OuterVolumeSpecName: "kube-api-access-9fwmc") pod "9229843b-95c5-4135-8383-7617689bd4ad" (UID: "9229843b-95c5-4135-8383-7617689bd4ad"). InnerVolumeSpecName "kube-api-access-9fwmc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:59:19 crc kubenswrapper[4669]: I1008 20:59:19.550422 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbzn7\" (UniqueName: \"kubernetes.io/projected/0ab22cf7-c83f-4a32-abc1-b53715898e21-kube-api-access-pbzn7\") pod \"0ab22cf7-c83f-4a32-abc1-b53715898e21\" (UID: \"0ab22cf7-c83f-4a32-abc1-b53715898e21\") " Oct 08 20:59:19 crc kubenswrapper[4669]: I1008 20:59:19.550617 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ab22cf7-c83f-4a32-abc1-b53715898e21-config\") pod \"0ab22cf7-c83f-4a32-abc1-b53715898e21\" (UID: \"0ab22cf7-c83f-4a32-abc1-b53715898e21\") " Oct 08 20:59:19 crc kubenswrapper[4669]: I1008 20:59:19.550979 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9229843b-95c5-4135-8383-7617689bd4ad-config\") on node \"crc\" DevicePath \"\"" Oct 08 20:59:19 crc kubenswrapper[4669]: I1008 20:59:19.550994 4669 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9229843b-95c5-4135-8383-7617689bd4ad-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 20:59:19 crc kubenswrapper[4669]: I1008 20:59:19.551007 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fwmc\" (UniqueName: \"kubernetes.io/projected/9229843b-95c5-4135-8383-7617689bd4ad-kube-api-access-9fwmc\") on node \"crc\" 
DevicePath \"\"" Oct 08 20:59:19 crc kubenswrapper[4669]: I1008 20:59:19.551127 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ab22cf7-c83f-4a32-abc1-b53715898e21-config" (OuterVolumeSpecName: "config") pod "0ab22cf7-c83f-4a32-abc1-b53715898e21" (UID: "0ab22cf7-c83f-4a32-abc1-b53715898e21"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:59:19 crc kubenswrapper[4669]: I1008 20:59:19.555106 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ab22cf7-c83f-4a32-abc1-b53715898e21-kube-api-access-pbzn7" (OuterVolumeSpecName: "kube-api-access-pbzn7") pod "0ab22cf7-c83f-4a32-abc1-b53715898e21" (UID: "0ab22cf7-c83f-4a32-abc1-b53715898e21"). InnerVolumeSpecName "kube-api-access-pbzn7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:59:19 crc kubenswrapper[4669]: I1008 20:59:19.652851 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ab22cf7-c83f-4a32-abc1-b53715898e21-config\") on node \"crc\" DevicePath \"\"" Oct 08 20:59:19 crc kubenswrapper[4669]: I1008 20:59:19.652888 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbzn7\" (UniqueName: \"kubernetes.io/projected/0ab22cf7-c83f-4a32-abc1-b53715898e21-kube-api-access-pbzn7\") on node \"crc\" DevicePath \"\"" Oct 08 20:59:19 crc kubenswrapper[4669]: I1008 20:59:19.671615 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 08 20:59:19 crc kubenswrapper[4669]: I1008 20:59:19.911105 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-l4snx"] Oct 08 20:59:19 crc kubenswrapper[4669]: I1008 20:59:19.912235 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-l4snx" Oct 08 20:59:19 crc kubenswrapper[4669]: I1008 20:59:19.915521 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 08 20:59:19 crc kubenswrapper[4669]: I1008 20:59:19.954225 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-l4snx"] Oct 08 20:59:19 crc kubenswrapper[4669]: I1008 20:59:19.958843 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bc4b231-fe5a-4712-855f-3d56029e240b-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-l4snx\" (UID: \"2bc4b231-fe5a-4712-855f-3d56029e240b\") " pod="openstack/ovn-controller-metrics-l4snx" Oct 08 20:59:19 crc kubenswrapper[4669]: I1008 20:59:19.958918 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/2bc4b231-fe5a-4712-855f-3d56029e240b-ovn-rundir\") pod \"ovn-controller-metrics-l4snx\" (UID: \"2bc4b231-fe5a-4712-855f-3d56029e240b\") " pod="openstack/ovn-controller-metrics-l4snx" Oct 08 20:59:19 crc kubenswrapper[4669]: I1008 20:59:19.959003 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/2bc4b231-fe5a-4712-855f-3d56029e240b-ovs-rundir\") pod \"ovn-controller-metrics-l4snx\" (UID: \"2bc4b231-fe5a-4712-855f-3d56029e240b\") " pod="openstack/ovn-controller-metrics-l4snx" Oct 08 20:59:19 crc kubenswrapper[4669]: I1008 20:59:19.959059 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bc4b231-fe5a-4712-855f-3d56029e240b-combined-ca-bundle\") pod \"ovn-controller-metrics-l4snx\" (UID: \"2bc4b231-fe5a-4712-855f-3d56029e240b\") 
" pod="openstack/ovn-controller-metrics-l4snx" Oct 08 20:59:19 crc kubenswrapper[4669]: I1008 20:59:19.959130 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk29p\" (UniqueName: \"kubernetes.io/projected/2bc4b231-fe5a-4712-855f-3d56029e240b-kube-api-access-rk29p\") pod \"ovn-controller-metrics-l4snx\" (UID: \"2bc4b231-fe5a-4712-855f-3d56029e240b\") " pod="openstack/ovn-controller-metrics-l4snx" Oct 08 20:59:19 crc kubenswrapper[4669]: I1008 20:59:19.959182 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bc4b231-fe5a-4712-855f-3d56029e240b-config\") pod \"ovn-controller-metrics-l4snx\" (UID: \"2bc4b231-fe5a-4712-855f-3d56029e240b\") " pod="openstack/ovn-controller-metrics-l4snx" Oct 08 20:59:20 crc kubenswrapper[4669]: I1008 20:59:20.059137 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-wt5ln" event={"ID":"0ab22cf7-c83f-4a32-abc1-b53715898e21","Type":"ContainerDied","Data":"855de2f75c497b09624143ad062c87428033a0cf8db944564aa250e19969ba31"} Oct 08 20:59:20 crc kubenswrapper[4669]: I1008 20:59:20.059252 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-wt5ln" Oct 08 20:59:20 crc kubenswrapper[4669]: I1008 20:59:20.060116 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/2bc4b231-fe5a-4712-855f-3d56029e240b-ovs-rundir\") pod \"ovn-controller-metrics-l4snx\" (UID: \"2bc4b231-fe5a-4712-855f-3d56029e240b\") " pod="openstack/ovn-controller-metrics-l4snx" Oct 08 20:59:20 crc kubenswrapper[4669]: I1008 20:59:20.060173 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bc4b231-fe5a-4712-855f-3d56029e240b-combined-ca-bundle\") pod \"ovn-controller-metrics-l4snx\" (UID: \"2bc4b231-fe5a-4712-855f-3d56029e240b\") " pod="openstack/ovn-controller-metrics-l4snx" Oct 08 20:59:20 crc kubenswrapper[4669]: I1008 20:59:20.060199 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk29p\" (UniqueName: \"kubernetes.io/projected/2bc4b231-fe5a-4712-855f-3d56029e240b-kube-api-access-rk29p\") pod \"ovn-controller-metrics-l4snx\" (UID: \"2bc4b231-fe5a-4712-855f-3d56029e240b\") " pod="openstack/ovn-controller-metrics-l4snx" Oct 08 20:59:20 crc kubenswrapper[4669]: I1008 20:59:20.060238 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bc4b231-fe5a-4712-855f-3d56029e240b-config\") pod \"ovn-controller-metrics-l4snx\" (UID: \"2bc4b231-fe5a-4712-855f-3d56029e240b\") " pod="openstack/ovn-controller-metrics-l4snx" Oct 08 20:59:20 crc kubenswrapper[4669]: I1008 20:59:20.060284 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bc4b231-fe5a-4712-855f-3d56029e240b-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-l4snx\" (UID: \"2bc4b231-fe5a-4712-855f-3d56029e240b\") " 
pod="openstack/ovn-controller-metrics-l4snx" Oct 08 20:59:20 crc kubenswrapper[4669]: I1008 20:59:20.060314 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/2bc4b231-fe5a-4712-855f-3d56029e240b-ovn-rundir\") pod \"ovn-controller-metrics-l4snx\" (UID: \"2bc4b231-fe5a-4712-855f-3d56029e240b\") " pod="openstack/ovn-controller-metrics-l4snx" Oct 08 20:59:20 crc kubenswrapper[4669]: I1008 20:59:20.060679 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/2bc4b231-fe5a-4712-855f-3d56029e240b-ovn-rundir\") pod \"ovn-controller-metrics-l4snx\" (UID: \"2bc4b231-fe5a-4712-855f-3d56029e240b\") " pod="openstack/ovn-controller-metrics-l4snx" Oct 08 20:59:20 crc kubenswrapper[4669]: I1008 20:59:20.064842 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/2bc4b231-fe5a-4712-855f-3d56029e240b-ovs-rundir\") pod \"ovn-controller-metrics-l4snx\" (UID: \"2bc4b231-fe5a-4712-855f-3d56029e240b\") " pod="openstack/ovn-controller-metrics-l4snx" Oct 08 20:59:20 crc kubenswrapper[4669]: I1008 20:59:20.066019 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bc4b231-fe5a-4712-855f-3d56029e240b-config\") pod \"ovn-controller-metrics-l4snx\" (UID: \"2bc4b231-fe5a-4712-855f-3d56029e240b\") " pod="openstack/ovn-controller-metrics-l4snx" Oct 08 20:59:20 crc kubenswrapper[4669]: I1008 20:59:20.076505 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bc4b231-fe5a-4712-855f-3d56029e240b-combined-ca-bundle\") pod \"ovn-controller-metrics-l4snx\" (UID: \"2bc4b231-fe5a-4712-855f-3d56029e240b\") " pod="openstack/ovn-controller-metrics-l4snx" Oct 08 20:59:20 crc kubenswrapper[4669]: I1008 20:59:20.079904 4669 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bc4b231-fe5a-4712-855f-3d56029e240b-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-l4snx\" (UID: \"2bc4b231-fe5a-4712-855f-3d56029e240b\") " pod="openstack/ovn-controller-metrics-l4snx" Oct 08 20:59:20 crc kubenswrapper[4669]: I1008 20:59:20.084374 4669 generic.go:334] "Generic (PLEG): container finished" podID="b60a2dd2-7028-42a7-8ef9-38099b6f8817" containerID="88b3c6fbf0c3c8a46df051f8123f25ba55c6300188ce4a9b648e554445b4a09b" exitCode=0 Oct 08 20:59:20 crc kubenswrapper[4669]: I1008 20:59:20.084462 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-9lfvd" event={"ID":"b60a2dd2-7028-42a7-8ef9-38099b6f8817","Type":"ContainerDied","Data":"88b3c6fbf0c3c8a46df051f8123f25ba55c6300188ce4a9b648e554445b4a09b"} Oct 08 20:59:20 crc kubenswrapper[4669]: I1008 20:59:20.093123 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk29p\" (UniqueName: \"kubernetes.io/projected/2bc4b231-fe5a-4712-855f-3d56029e240b-kube-api-access-rk29p\") pod \"ovn-controller-metrics-l4snx\" (UID: \"2bc4b231-fe5a-4712-855f-3d56029e240b\") " pod="openstack/ovn-controller-metrics-l4snx" Oct 08 20:59:20 crc kubenswrapper[4669]: I1008 20:59:20.102860 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"07bcdc9f-6970-49af-8620-63f8ed43845b","Type":"ContainerStarted","Data":"4b2b18f8f85df11c59f9c08e83b464f8b779c4f122f1e659bb25d407576e925b"} Oct 08 20:59:20 crc kubenswrapper[4669]: I1008 20:59:20.124488 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"53fa562b-eb62-487e-8b82-3da0799fae19","Type":"ContainerStarted","Data":"1e5af1e47f7b0c3641e82364f72af8b464739fe38cb4adcbb1059d435fdaf564"} Oct 08 20:59:20 crc kubenswrapper[4669]: I1008 20:59:20.135777 4669 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-jksdh" event={"ID":"9229843b-95c5-4135-8383-7617689bd4ad","Type":"ContainerDied","Data":"89a081775d908810d9b37a922af65962e1f558e4b2ccf7e5ecc43e762906878a"} Oct 08 20:59:20 crc kubenswrapper[4669]: I1008 20:59:20.135792 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-jksdh" Oct 08 20:59:20 crc kubenswrapper[4669]: I1008 20:59:20.142093 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1e47bce4-587b-4864-86ea-a1e2a7987779","Type":"ContainerStarted","Data":"48ddd5abea97bcb5f4d2c16fdb1a4d9b716a85e6a003098401c763ffd93c92d7"} Oct 08 20:59:20 crc kubenswrapper[4669]: I1008 20:59:20.157404 4669 generic.go:334] "Generic (PLEG): container finished" podID="f476231d-6e83-47b6-9fa4-4714af712b0c" containerID="cdac45d16a2827604a6475332032ea2893e79804f8116aeee4188c2489bb56cf" exitCode=0 Oct 08 20:59:20 crc kubenswrapper[4669]: I1008 20:59:20.157486 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-nfnp7" event={"ID":"f476231d-6e83-47b6-9fa4-4714af712b0c","Type":"ContainerDied","Data":"cdac45d16a2827604a6475332032ea2893e79804f8116aeee4188c2489bb56cf"} Oct 08 20:59:20 crc kubenswrapper[4669]: I1008 20:59:20.171615 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"33dac706-1170-45a3-8151-a6ee9bce8005","Type":"ContainerStarted","Data":"d40d954dfc450c41050e2aa5a4d523e78b711a253bd67273178c8c7d3cf9304d"} Oct 08 20:59:20 crc kubenswrapper[4669]: I1008 20:59:20.183683 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2mvd2" event={"ID":"2e96f6f2-b5f3-49e7-8d84-15d5535963a2","Type":"ContainerStarted","Data":"685b1de48ff9ec5be7637c86c8e2ba1263fb9b04fbecdde865e1ef28df0f80d9"} Oct 08 20:59:20 crc kubenswrapper[4669]: I1008 20:59:20.197561 4669 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ovn-controller-ovs-wnkk4" event={"ID":"d1b7a290-4c6b-48ff-b23b-7fe2ba609dcc","Type":"ContainerStarted","Data":"9e2c30ec628b2799064868124c6be1e4ad7ac65b4a86a9ad0822699c52fc9cdd"} Oct 08 20:59:20 crc kubenswrapper[4669]: I1008 20:59:20.203712 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-wt5ln"] Oct 08 20:59:20 crc kubenswrapper[4669]: I1008 20:59:20.240160 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-wt5ln"] Oct 08 20:59:20 crc kubenswrapper[4669]: I1008 20:59:20.262098 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-l4snx" Oct 08 20:59:20 crc kubenswrapper[4669]: I1008 20:59:20.288268 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-jksdh"] Oct 08 20:59:20 crc kubenswrapper[4669]: I1008 20:59:20.301043 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-jksdh"] Oct 08 20:59:20 crc kubenswrapper[4669]: I1008 20:59:20.350762 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-9lfvd"] Oct 08 20:59:20 crc kubenswrapper[4669]: I1008 20:59:20.363830 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-5cgh5"] Oct 08 20:59:20 crc kubenswrapper[4669]: I1008 20:59:20.365202 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-5cgh5" Oct 08 20:59:20 crc kubenswrapper[4669]: I1008 20:59:20.368684 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 08 20:59:20 crc kubenswrapper[4669]: I1008 20:59:20.371951 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-5cgh5"] Oct 08 20:59:20 crc kubenswrapper[4669]: I1008 20:59:20.473897 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2032cb15-6b45-4eca-8ba6-ae485a95a244-config\") pod \"dnsmasq-dns-6bc7876d45-5cgh5\" (UID: \"2032cb15-6b45-4eca-8ba6-ae485a95a244\") " pod="openstack/dnsmasq-dns-6bc7876d45-5cgh5" Oct 08 20:59:20 crc kubenswrapper[4669]: I1008 20:59:20.473992 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2032cb15-6b45-4eca-8ba6-ae485a95a244-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-5cgh5\" (UID: \"2032cb15-6b45-4eca-8ba6-ae485a95a244\") " pod="openstack/dnsmasq-dns-6bc7876d45-5cgh5" Oct 08 20:59:20 crc kubenswrapper[4669]: I1008 20:59:20.474165 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2032cb15-6b45-4eca-8ba6-ae485a95a244-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-5cgh5\" (UID: \"2032cb15-6b45-4eca-8ba6-ae485a95a244\") " pod="openstack/dnsmasq-dns-6bc7876d45-5cgh5" Oct 08 20:59:20 crc kubenswrapper[4669]: I1008 20:59:20.474210 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmbkg\" (UniqueName: \"kubernetes.io/projected/2032cb15-6b45-4eca-8ba6-ae485a95a244-kube-api-access-nmbkg\") pod \"dnsmasq-dns-6bc7876d45-5cgh5\" (UID: \"2032cb15-6b45-4eca-8ba6-ae485a95a244\") " 
pod="openstack/dnsmasq-dns-6bc7876d45-5cgh5" Oct 08 20:59:20 crc kubenswrapper[4669]: E1008 20:59:20.530564 4669 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Oct 08 20:59:20 crc kubenswrapper[4669]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/f476231d-6e83-47b6-9fa4-4714af712b0c/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 08 20:59:20 crc kubenswrapper[4669]: > podSandboxID="ab7e4a1d6e3889140728c8cfb251d30127caf71b83c2427e3c2890b98d3f2970" Oct 08 20:59:20 crc kubenswrapper[4669]: E1008 20:59:20.531461 4669 kuberuntime_manager.go:1274] "Unhandled Error" err=< Oct 08 20:59:20 crc kubenswrapper[4669]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-crrgz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-nfnp7_openstack(f476231d-6e83-47b6-9fa4-4714af712b0c): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/f476231d-6e83-47b6-9fa4-4714af712b0c/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 08 20:59:20 crc kubenswrapper[4669]: > logger="UnhandledError" Oct 08 20:59:20 crc kubenswrapper[4669]: E1008 20:59:20.533408 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/f476231d-6e83-47b6-9fa4-4714af712b0c/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-666b6646f7-nfnp7" podUID="f476231d-6e83-47b6-9fa4-4714af712b0c" Oct 08 20:59:20 crc kubenswrapper[4669]: I1008 20:59:20.575341 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2032cb15-6b45-4eca-8ba6-ae485a95a244-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-5cgh5\" (UID: 
\"2032cb15-6b45-4eca-8ba6-ae485a95a244\") " pod="openstack/dnsmasq-dns-6bc7876d45-5cgh5" Oct 08 20:59:20 crc kubenswrapper[4669]: I1008 20:59:20.575434 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2032cb15-6b45-4eca-8ba6-ae485a95a244-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-5cgh5\" (UID: \"2032cb15-6b45-4eca-8ba6-ae485a95a244\") " pod="openstack/dnsmasq-dns-6bc7876d45-5cgh5" Oct 08 20:59:20 crc kubenswrapper[4669]: I1008 20:59:20.575512 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmbkg\" (UniqueName: \"kubernetes.io/projected/2032cb15-6b45-4eca-8ba6-ae485a95a244-kube-api-access-nmbkg\") pod \"dnsmasq-dns-6bc7876d45-5cgh5\" (UID: \"2032cb15-6b45-4eca-8ba6-ae485a95a244\") " pod="openstack/dnsmasq-dns-6bc7876d45-5cgh5" Oct 08 20:59:20 crc kubenswrapper[4669]: I1008 20:59:20.575624 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2032cb15-6b45-4eca-8ba6-ae485a95a244-config\") pod \"dnsmasq-dns-6bc7876d45-5cgh5\" (UID: \"2032cb15-6b45-4eca-8ba6-ae485a95a244\") " pod="openstack/dnsmasq-dns-6bc7876d45-5cgh5" Oct 08 20:59:20 crc kubenswrapper[4669]: I1008 20:59:20.577568 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2032cb15-6b45-4eca-8ba6-ae485a95a244-config\") pod \"dnsmasq-dns-6bc7876d45-5cgh5\" (UID: \"2032cb15-6b45-4eca-8ba6-ae485a95a244\") " pod="openstack/dnsmasq-dns-6bc7876d45-5cgh5" Oct 08 20:59:20 crc kubenswrapper[4669]: I1008 20:59:20.578898 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2032cb15-6b45-4eca-8ba6-ae485a95a244-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-5cgh5\" (UID: \"2032cb15-6b45-4eca-8ba6-ae485a95a244\") " pod="openstack/dnsmasq-dns-6bc7876d45-5cgh5" Oct 08 
20:59:20 crc kubenswrapper[4669]: I1008 20:59:20.579499 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2032cb15-6b45-4eca-8ba6-ae485a95a244-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-5cgh5\" (UID: \"2032cb15-6b45-4eca-8ba6-ae485a95a244\") " pod="openstack/dnsmasq-dns-6bc7876d45-5cgh5" Oct 08 20:59:20 crc kubenswrapper[4669]: I1008 20:59:20.613129 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmbkg\" (UniqueName: \"kubernetes.io/projected/2032cb15-6b45-4eca-8ba6-ae485a95a244-kube-api-access-nmbkg\") pod \"dnsmasq-dns-6bc7876d45-5cgh5\" (UID: \"2032cb15-6b45-4eca-8ba6-ae485a95a244\") " pod="openstack/dnsmasq-dns-6bc7876d45-5cgh5" Oct 08 20:59:20 crc kubenswrapper[4669]: I1008 20:59:20.735111 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-5cgh5" Oct 08 20:59:20 crc kubenswrapper[4669]: I1008 20:59:20.854049 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-l4snx"] Oct 08 20:59:21 crc kubenswrapper[4669]: I1008 20:59:21.105609 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 08 20:59:21 crc kubenswrapper[4669]: I1008 20:59:21.107206 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 08 20:59:21 crc kubenswrapper[4669]: I1008 20:59:21.109395 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 08 20:59:21 crc kubenswrapper[4669]: I1008 20:59:21.109393 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 08 20:59:21 crc kubenswrapper[4669]: I1008 20:59:21.109449 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-mbgts" Oct 08 20:59:21 crc kubenswrapper[4669]: I1008 20:59:21.110696 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 08 20:59:21 crc kubenswrapper[4669]: I1008 20:59:21.116957 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 08 20:59:21 crc kubenswrapper[4669]: I1008 20:59:21.211205 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-9lfvd" podUID="b60a2dd2-7028-42a7-8ef9-38099b6f8817" containerName="dnsmasq-dns" containerID="cri-o://c692662c852fdc96a22525d90a54f4c9658b9b0948b31cb34b7af6b3b8d42eb0" gracePeriod=10 Oct 08 20:59:21 crc kubenswrapper[4669]: I1008 20:59:21.211435 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-9lfvd" event={"ID":"b60a2dd2-7028-42a7-8ef9-38099b6f8817","Type":"ContainerStarted","Data":"c692662c852fdc96a22525d90a54f4c9658b9b0948b31cb34b7af6b3b8d42eb0"} Oct 08 20:59:21 crc kubenswrapper[4669]: I1008 20:59:21.211565 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-9lfvd" Oct 08 20:59:21 crc kubenswrapper[4669]: W1008 20:59:21.232585 4669 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2bc4b231_fe5a_4712_855f_3d56029e240b.slice/crio-de56d09c09f8197d329a4c1647511eaff37c937df2c843d826aec08ea241d81c WatchSource:0}: Error finding container de56d09c09f8197d329a4c1647511eaff37c937df2c843d826aec08ea241d81c: Status 404 returned error can't find the container with id de56d09c09f8197d329a4c1647511eaff37c937df2c843d826aec08ea241d81c Oct 08 20:59:21 crc kubenswrapper[4669]: I1008 20:59:21.288004 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q7ml\" (UniqueName: \"kubernetes.io/projected/70b1241c-d97a-4af6-9c95-dadad197012e-kube-api-access-9q7ml\") pod \"ovsdbserver-nb-0\" (UID: \"70b1241c-d97a-4af6-9c95-dadad197012e\") " pod="openstack/ovsdbserver-nb-0" Oct 08 20:59:21 crc kubenswrapper[4669]: I1008 20:59:21.288084 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70b1241c-d97a-4af6-9c95-dadad197012e-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"70b1241c-d97a-4af6-9c95-dadad197012e\") " pod="openstack/ovsdbserver-nb-0" Oct 08 20:59:21 crc kubenswrapper[4669]: I1008 20:59:21.288117 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70b1241c-d97a-4af6-9c95-dadad197012e-config\") pod \"ovsdbserver-nb-0\" (UID: \"70b1241c-d97a-4af6-9c95-dadad197012e\") " pod="openstack/ovsdbserver-nb-0" Oct 08 20:59:21 crc kubenswrapper[4669]: I1008 20:59:21.288146 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/70b1241c-d97a-4af6-9c95-dadad197012e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"70b1241c-d97a-4af6-9c95-dadad197012e\") " pod="openstack/ovsdbserver-nb-0" Oct 08 20:59:21 crc kubenswrapper[4669]: I1008 20:59:21.288171 4669 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/70b1241c-d97a-4af6-9c95-dadad197012e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"70b1241c-d97a-4af6-9c95-dadad197012e\") " pod="openstack/ovsdbserver-nb-0" Oct 08 20:59:21 crc kubenswrapper[4669]: I1008 20:59:21.288215 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/70b1241c-d97a-4af6-9c95-dadad197012e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"70b1241c-d97a-4af6-9c95-dadad197012e\") " pod="openstack/ovsdbserver-nb-0" Oct 08 20:59:21 crc kubenswrapper[4669]: I1008 20:59:21.288256 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"70b1241c-d97a-4af6-9c95-dadad197012e\") " pod="openstack/ovsdbserver-nb-0" Oct 08 20:59:21 crc kubenswrapper[4669]: I1008 20:59:21.288285 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70b1241c-d97a-4af6-9c95-dadad197012e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"70b1241c-d97a-4af6-9c95-dadad197012e\") " pod="openstack/ovsdbserver-nb-0" Oct 08 20:59:21 crc kubenswrapper[4669]: I1008 20:59:21.345259 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ab22cf7-c83f-4a32-abc1-b53715898e21" path="/var/lib/kubelet/pods/0ab22cf7-c83f-4a32-abc1-b53715898e21/volumes" Oct 08 20:59:21 crc kubenswrapper[4669]: I1008 20:59:21.345734 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9229843b-95c5-4135-8383-7617689bd4ad" path="/var/lib/kubelet/pods/9229843b-95c5-4135-8383-7617689bd4ad/volumes" Oct 08 20:59:21 
crc kubenswrapper[4669]: I1008 20:59:21.357960 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-9lfvd" podStartSLOduration=14.624671322 podStartE2EDuration="15.357941871s" podCreationTimestamp="2025-10-08 20:59:06 +0000 UTC" firstStartedPulling="2025-10-08 20:59:18.745994476 +0000 UTC m=+878.438805149" lastFinishedPulling="2025-10-08 20:59:19.479265025 +0000 UTC m=+879.172075698" observedRunningTime="2025-10-08 20:59:21.247630019 +0000 UTC m=+880.940440702" watchObservedRunningTime="2025-10-08 20:59:21.357941871 +0000 UTC m=+881.050752544" Oct 08 20:59:21 crc kubenswrapper[4669]: I1008 20:59:21.390618 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/70b1241c-d97a-4af6-9c95-dadad197012e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"70b1241c-d97a-4af6-9c95-dadad197012e\") " pod="openstack/ovsdbserver-nb-0" Oct 08 20:59:21 crc kubenswrapper[4669]: I1008 20:59:21.390702 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/70b1241c-d97a-4af6-9c95-dadad197012e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"70b1241c-d97a-4af6-9c95-dadad197012e\") " pod="openstack/ovsdbserver-nb-0" Oct 08 20:59:21 crc kubenswrapper[4669]: I1008 20:59:21.390739 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/70b1241c-d97a-4af6-9c95-dadad197012e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"70b1241c-d97a-4af6-9c95-dadad197012e\") " pod="openstack/ovsdbserver-nb-0" Oct 08 20:59:21 crc kubenswrapper[4669]: I1008 20:59:21.390785 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: 
\"70b1241c-d97a-4af6-9c95-dadad197012e\") " pod="openstack/ovsdbserver-nb-0" Oct 08 20:59:21 crc kubenswrapper[4669]: I1008 20:59:21.390814 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70b1241c-d97a-4af6-9c95-dadad197012e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"70b1241c-d97a-4af6-9c95-dadad197012e\") " pod="openstack/ovsdbserver-nb-0" Oct 08 20:59:21 crc kubenswrapper[4669]: I1008 20:59:21.391205 4669 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"70b1241c-d97a-4af6-9c95-dadad197012e\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-nb-0" Oct 08 20:59:21 crc kubenswrapper[4669]: I1008 20:59:21.391281 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/70b1241c-d97a-4af6-9c95-dadad197012e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"70b1241c-d97a-4af6-9c95-dadad197012e\") " pod="openstack/ovsdbserver-nb-0" Oct 08 20:59:21 crc kubenswrapper[4669]: I1008 20:59:21.391382 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9q7ml\" (UniqueName: \"kubernetes.io/projected/70b1241c-d97a-4af6-9c95-dadad197012e-kube-api-access-9q7ml\") pod \"ovsdbserver-nb-0\" (UID: \"70b1241c-d97a-4af6-9c95-dadad197012e\") " pod="openstack/ovsdbserver-nb-0" Oct 08 20:59:21 crc kubenswrapper[4669]: I1008 20:59:21.391606 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70b1241c-d97a-4af6-9c95-dadad197012e-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"70b1241c-d97a-4af6-9c95-dadad197012e\") " pod="openstack/ovsdbserver-nb-0" Oct 08 20:59:21 crc kubenswrapper[4669]: I1008 20:59:21.391639 4669 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70b1241c-d97a-4af6-9c95-dadad197012e-config\") pod \"ovsdbserver-nb-0\" (UID: \"70b1241c-d97a-4af6-9c95-dadad197012e\") " pod="openstack/ovsdbserver-nb-0" Oct 08 20:59:21 crc kubenswrapper[4669]: I1008 20:59:21.392443 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70b1241c-d97a-4af6-9c95-dadad197012e-config\") pod \"ovsdbserver-nb-0\" (UID: \"70b1241c-d97a-4af6-9c95-dadad197012e\") " pod="openstack/ovsdbserver-nb-0" Oct 08 20:59:21 crc kubenswrapper[4669]: I1008 20:59:21.393100 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70b1241c-d97a-4af6-9c95-dadad197012e-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"70b1241c-d97a-4af6-9c95-dadad197012e\") " pod="openstack/ovsdbserver-nb-0" Oct 08 20:59:21 crc kubenswrapper[4669]: I1008 20:59:21.397846 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/70b1241c-d97a-4af6-9c95-dadad197012e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"70b1241c-d97a-4af6-9c95-dadad197012e\") " pod="openstack/ovsdbserver-nb-0" Oct 08 20:59:21 crc kubenswrapper[4669]: I1008 20:59:21.400248 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70b1241c-d97a-4af6-9c95-dadad197012e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"70b1241c-d97a-4af6-9c95-dadad197012e\") " pod="openstack/ovsdbserver-nb-0" Oct 08 20:59:21 crc kubenswrapper[4669]: I1008 20:59:21.400904 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/70b1241c-d97a-4af6-9c95-dadad197012e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: 
\"70b1241c-d97a-4af6-9c95-dadad197012e\") " pod="openstack/ovsdbserver-nb-0" Oct 08 20:59:21 crc kubenswrapper[4669]: I1008 20:59:21.415107 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q7ml\" (UniqueName: \"kubernetes.io/projected/70b1241c-d97a-4af6-9c95-dadad197012e-kube-api-access-9q7ml\") pod \"ovsdbserver-nb-0\" (UID: \"70b1241c-d97a-4af6-9c95-dadad197012e\") " pod="openstack/ovsdbserver-nb-0" Oct 08 20:59:21 crc kubenswrapper[4669]: I1008 20:59:21.426925 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"70b1241c-d97a-4af6-9c95-dadad197012e\") " pod="openstack/ovsdbserver-nb-0" Oct 08 20:59:21 crc kubenswrapper[4669]: I1008 20:59:21.731489 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 08 20:59:22 crc kubenswrapper[4669]: I1008 20:59:22.222990 4669 generic.go:334] "Generic (PLEG): container finished" podID="b60a2dd2-7028-42a7-8ef9-38099b6f8817" containerID="c692662c852fdc96a22525d90a54f4c9658b9b0948b31cb34b7af6b3b8d42eb0" exitCode=0 Oct 08 20:59:22 crc kubenswrapper[4669]: I1008 20:59:22.223066 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-9lfvd" event={"ID":"b60a2dd2-7028-42a7-8ef9-38099b6f8817","Type":"ContainerDied","Data":"c692662c852fdc96a22525d90a54f4c9658b9b0948b31cb34b7af6b3b8d42eb0"} Oct 08 20:59:22 crc kubenswrapper[4669]: I1008 20:59:22.224366 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-l4snx" event={"ID":"2bc4b231-fe5a-4712-855f-3d56029e240b","Type":"ContainerStarted","Data":"de56d09c09f8197d329a4c1647511eaff37c937df2c843d826aec08ea241d81c"} Oct 08 20:59:26 crc kubenswrapper[4669]: I1008 20:59:26.542149 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-5cgh5"] 
Oct 08 20:59:27 crc kubenswrapper[4669]: I1008 20:59:27.275354 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-9lfvd" event={"ID":"b60a2dd2-7028-42a7-8ef9-38099b6f8817","Type":"ContainerDied","Data":"a316e4dad3c7e2e2bcca472ea76c94486f335e79f605040debcfb59d42306ab5"} Oct 08 20:59:27 crc kubenswrapper[4669]: I1008 20:59:27.275842 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a316e4dad3c7e2e2bcca472ea76c94486f335e79f605040debcfb59d42306ab5" Oct 08 20:59:27 crc kubenswrapper[4669]: I1008 20:59:27.296385 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-9lfvd" Oct 08 20:59:27 crc kubenswrapper[4669]: I1008 20:59:27.397811 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b60a2dd2-7028-42a7-8ef9-38099b6f8817-config\") pod \"b60a2dd2-7028-42a7-8ef9-38099b6f8817\" (UID: \"b60a2dd2-7028-42a7-8ef9-38099b6f8817\") " Oct 08 20:59:27 crc kubenswrapper[4669]: I1008 20:59:27.398011 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b60a2dd2-7028-42a7-8ef9-38099b6f8817-dns-svc\") pod \"b60a2dd2-7028-42a7-8ef9-38099b6f8817\" (UID: \"b60a2dd2-7028-42a7-8ef9-38099b6f8817\") " Oct 08 20:59:27 crc kubenswrapper[4669]: I1008 20:59:27.398566 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpnth\" (UniqueName: \"kubernetes.io/projected/b60a2dd2-7028-42a7-8ef9-38099b6f8817-kube-api-access-qpnth\") pod \"b60a2dd2-7028-42a7-8ef9-38099b6f8817\" (UID: \"b60a2dd2-7028-42a7-8ef9-38099b6f8817\") " Oct 08 20:59:27 crc kubenswrapper[4669]: I1008 20:59:27.406423 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b60a2dd2-7028-42a7-8ef9-38099b6f8817-kube-api-access-qpnth" 
(OuterVolumeSpecName: "kube-api-access-qpnth") pod "b60a2dd2-7028-42a7-8ef9-38099b6f8817" (UID: "b60a2dd2-7028-42a7-8ef9-38099b6f8817"). InnerVolumeSpecName "kube-api-access-qpnth". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:59:27 crc kubenswrapper[4669]: I1008 20:59:27.449034 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b60a2dd2-7028-42a7-8ef9-38099b6f8817-config" (OuterVolumeSpecName: "config") pod "b60a2dd2-7028-42a7-8ef9-38099b6f8817" (UID: "b60a2dd2-7028-42a7-8ef9-38099b6f8817"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:59:27 crc kubenswrapper[4669]: I1008 20:59:27.451469 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b60a2dd2-7028-42a7-8ef9-38099b6f8817-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b60a2dd2-7028-42a7-8ef9-38099b6f8817" (UID: "b60a2dd2-7028-42a7-8ef9-38099b6f8817"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:59:27 crc kubenswrapper[4669]: I1008 20:59:27.500556 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpnth\" (UniqueName: \"kubernetes.io/projected/b60a2dd2-7028-42a7-8ef9-38099b6f8817-kube-api-access-qpnth\") on node \"crc\" DevicePath \"\"" Oct 08 20:59:27 crc kubenswrapper[4669]: I1008 20:59:27.500592 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b60a2dd2-7028-42a7-8ef9-38099b6f8817-config\") on node \"crc\" DevicePath \"\"" Oct 08 20:59:27 crc kubenswrapper[4669]: I1008 20:59:27.500603 4669 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b60a2dd2-7028-42a7-8ef9-38099b6f8817-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 20:59:28 crc kubenswrapper[4669]: I1008 20:59:28.288156 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-5cgh5" event={"ID":"2032cb15-6b45-4eca-8ba6-ae485a95a244","Type":"ContainerStarted","Data":"76a507a5084cf8a057fe259b78ebc89f648c87c64fafa22ebdef93beba4f21cc"} Oct 08 20:59:28 crc kubenswrapper[4669]: I1008 20:59:28.288190 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-9lfvd" Oct 08 20:59:28 crc kubenswrapper[4669]: I1008 20:59:28.343322 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-9lfvd"] Oct 08 20:59:28 crc kubenswrapper[4669]: I1008 20:59:28.348021 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-9lfvd"] Oct 08 20:59:29 crc kubenswrapper[4669]: I1008 20:59:29.250611 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 08 20:59:29 crc kubenswrapper[4669]: I1008 20:59:29.340189 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b60a2dd2-7028-42a7-8ef9-38099b6f8817" path="/var/lib/kubelet/pods/b60a2dd2-7028-42a7-8ef9-38099b6f8817/volumes" Oct 08 20:59:30 crc kubenswrapper[4669]: I1008 20:59:30.309786 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"70b1241c-d97a-4af6-9c95-dadad197012e","Type":"ContainerStarted","Data":"a0a3438f096b52e54ffa6ad9b0849179c18d24e8030be170adf218519ea9a915"} Oct 08 20:59:33 crc kubenswrapper[4669]: E1008 20:59:33.337911 4669 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Oct 08 20:59:33 crc kubenswrapper[4669]: E1008 20:59:33.338251 4669 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Oct 08 20:59:33 crc kubenswrapper[4669]: E1008 20:59:33.338388 4669 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods 
--namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-krzg5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_openstack(1e47bce4-587b-4864-86ea-a1e2a7987779): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying 
config: context canceled" logger="UnhandledError" Oct 08 20:59:33 crc kubenswrapper[4669]: E1008 20:59:33.339479 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="1e47bce4-587b-4864-86ea-a1e2a7987779" Oct 08 20:59:34 crc kubenswrapper[4669]: I1008 20:59:34.370472 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-nfnp7" event={"ID":"f476231d-6e83-47b6-9fa4-4714af712b0c","Type":"ContainerStarted","Data":"2a27b8d7ff338f3937d827fcec80d41c971e2d26a178085ea55139046fc14df1"} Oct 08 20:59:34 crc kubenswrapper[4669]: I1008 20:59:34.370921 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-nfnp7" Oct 08 20:59:34 crc kubenswrapper[4669]: I1008 20:59:34.374241 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"cb42938c-1da6-4d93-b7be-ff78d294ebf1","Type":"ContainerStarted","Data":"0c5b1967b76f5d67eed1c02316ba79f57df183ca56a33f318b145e6c06561364"} Oct 08 20:59:34 crc kubenswrapper[4669]: I1008 20:59:34.374408 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 08 20:59:34 crc kubenswrapper[4669]: E1008 20:59:34.376071 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="openstack/kube-state-metrics-0" podUID="1e47bce4-587b-4864-86ea-a1e2a7987779" Oct 08 20:59:34 crc kubenswrapper[4669]: I1008 20:59:34.399768 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-nfnp7" podStartSLOduration=27.719095291 
podStartE2EDuration="28.399711572s" podCreationTimestamp="2025-10-08 20:59:06 +0000 UTC" firstStartedPulling="2025-10-08 20:59:18.860829583 +0000 UTC m=+878.553640256" lastFinishedPulling="2025-10-08 20:59:19.541445864 +0000 UTC m=+879.234256537" observedRunningTime="2025-10-08 20:59:34.397183193 +0000 UTC m=+894.089993936" watchObservedRunningTime="2025-10-08 20:59:34.399711572 +0000 UTC m=+894.092522255" Oct 08 20:59:34 crc kubenswrapper[4669]: I1008 20:59:34.425350 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=14.594290792 podStartE2EDuration="23.425325796s" podCreationTimestamp="2025-10-08 20:59:11 +0000 UTC" firstStartedPulling="2025-10-08 20:59:18.986152019 +0000 UTC m=+878.678962692" lastFinishedPulling="2025-10-08 20:59:27.817187013 +0000 UTC m=+887.509997696" observedRunningTime="2025-10-08 20:59:34.417957853 +0000 UTC m=+894.110768546" watchObservedRunningTime="2025-10-08 20:59:34.425325796 +0000 UTC m=+894.118136479" Oct 08 20:59:35 crc kubenswrapper[4669]: I1008 20:59:35.382369 4669 generic.go:334] "Generic (PLEG): container finished" podID="2032cb15-6b45-4eca-8ba6-ae485a95a244" containerID="746cffce80af09098020721cf59b702be337803c96c40fd6019c7160356728d1" exitCode=0 Oct 08 20:59:35 crc kubenswrapper[4669]: I1008 20:59:35.382428 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-5cgh5" event={"ID":"2032cb15-6b45-4eca-8ba6-ae485a95a244","Type":"ContainerDied","Data":"746cffce80af09098020721cf59b702be337803c96c40fd6019c7160356728d1"} Oct 08 20:59:35 crc kubenswrapper[4669]: I1008 20:59:35.384652 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"33dac706-1170-45a3-8151-a6ee9bce8005","Type":"ContainerStarted","Data":"043ce444338f037471ba1459b3bb0bb2d1d59a5079caf9fc3b1ac81599c388bb"} Oct 08 20:59:35 crc kubenswrapper[4669]: I1008 20:59:35.386874 4669 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a63d3545-a64d-4c9a-9198-bf11fc782cc6","Type":"ContainerStarted","Data":"75c93ee37543d1d800805efc4adf4932d8c7190981cfcea370d27111c6ead9ab"} Oct 08 20:59:35 crc kubenswrapper[4669]: I1008 20:59:35.388302 4669 generic.go:334] "Generic (PLEG): container finished" podID="d1b7a290-4c6b-48ff-b23b-7fe2ba609dcc" containerID="e1138ab5aa5f14ce68fb431c631eef5bc8f2c906940dd350b38088575cead85c" exitCode=0 Oct 08 20:59:35 crc kubenswrapper[4669]: I1008 20:59:35.388361 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-wnkk4" event={"ID":"d1b7a290-4c6b-48ff-b23b-7fe2ba609dcc","Type":"ContainerDied","Data":"e1138ab5aa5f14ce68fb431c631eef5bc8f2c906940dd350b38088575cead85c"} Oct 08 20:59:35 crc kubenswrapper[4669]: I1008 20:59:35.390267 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"70b1241c-d97a-4af6-9c95-dadad197012e","Type":"ContainerStarted","Data":"7979bdc4f9d088d5804b3146b8c76492b3697fb3866f398d950f3bb7c09736cf"} Oct 08 20:59:35 crc kubenswrapper[4669]: I1008 20:59:35.390302 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"70b1241c-d97a-4af6-9c95-dadad197012e","Type":"ContainerStarted","Data":"73bd24b29c57099ccb831c82263e45309a52724204ec1dfd5dd0e1677136520e"} Oct 08 20:59:35 crc kubenswrapper[4669]: I1008 20:59:35.392202 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"07bcdc9f-6970-49af-8620-63f8ed43845b","Type":"ContainerStarted","Data":"6297ac41611d8bc42d3774c1462102d53c8b361c005960d7fa25ccb4082cc468"} Oct 08 20:59:35 crc kubenswrapper[4669]: I1008 20:59:35.392781 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"07bcdc9f-6970-49af-8620-63f8ed43845b","Type":"ContainerStarted","Data":"04340b1ffcf9e5f258904e4c4e8563e9311f51e7d3e81e31813d04323507d631"} Oct 08 20:59:35 crc 
kubenswrapper[4669]: I1008 20:59:35.395692 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"53fa562b-eb62-487e-8b82-3da0799fae19","Type":"ContainerStarted","Data":"6653c1d3072e09074b40aefb4ef4dfd89410cf77e3c6890e952576e17366bfeb"} Oct 08 20:59:35 crc kubenswrapper[4669]: I1008 20:59:35.407556 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b4f648df-77f7-4480-8b46-3f776880db17","Type":"ContainerStarted","Data":"cd3fd70fc2d5d3707c6cba8aa9868955a85e808c3f7e26a6168a2bbcffde2d37"} Oct 08 20:59:35 crc kubenswrapper[4669]: I1008 20:59:35.410143 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2mvd2" event={"ID":"2e96f6f2-b5f3-49e7-8d84-15d5535963a2","Type":"ContainerStarted","Data":"c0ae2ab43e6bee047842bffd18ab4412af0b3216ceeedbdb87f7b7f6d4cbbbbf"} Oct 08 20:59:35 crc kubenswrapper[4669]: I1008 20:59:35.410724 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-2mvd2" Oct 08 20:59:35 crc kubenswrapper[4669]: I1008 20:59:35.412589 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-l4snx" event={"ID":"2bc4b231-fe5a-4712-855f-3d56029e240b","Type":"ContainerStarted","Data":"b3aeaf7d7efa1f206a0a526919439d389184621142750a86455228b09bdc5585"} Oct 08 20:59:35 crc kubenswrapper[4669]: I1008 20:59:35.538978 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=7.411967838 podStartE2EDuration="18.53895663s" podCreationTimestamp="2025-10-08 20:59:17 +0000 UTC" firstStartedPulling="2025-10-08 20:59:19.789688629 +0000 UTC m=+879.482499292" lastFinishedPulling="2025-10-08 20:59:30.916677361 +0000 UTC m=+890.609488084" observedRunningTime="2025-10-08 20:59:35.527419283 +0000 UTC m=+895.220229956" watchObservedRunningTime="2025-10-08 20:59:35.53895663 +0000 UTC 
m=+895.231767313" Oct 08 20:59:35 crc kubenswrapper[4669]: I1008 20:59:35.569393 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-2mvd2" podStartSLOduration=5.32589472 podStartE2EDuration="19.569374547s" podCreationTimestamp="2025-10-08 20:59:16 +0000 UTC" firstStartedPulling="2025-10-08 20:59:19.075930767 +0000 UTC m=+878.768741440" lastFinishedPulling="2025-10-08 20:59:33.319410594 +0000 UTC m=+893.012221267" observedRunningTime="2025-10-08 20:59:35.565135541 +0000 UTC m=+895.257946214" watchObservedRunningTime="2025-10-08 20:59:35.569374547 +0000 UTC m=+895.262185220" Oct 08 20:59:35 crc kubenswrapper[4669]: I1008 20:59:35.585035 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-l4snx" podStartSLOduration=4.18610019 podStartE2EDuration="16.584989697s" podCreationTimestamp="2025-10-08 20:59:19 +0000 UTC" firstStartedPulling="2025-10-08 20:59:21.234467997 +0000 UTC m=+880.927278670" lastFinishedPulling="2025-10-08 20:59:33.633357504 +0000 UTC m=+893.326168177" observedRunningTime="2025-10-08 20:59:35.583448024 +0000 UTC m=+895.276258697" watchObservedRunningTime="2025-10-08 20:59:35.584989697 +0000 UTC m=+895.277800370" Oct 08 20:59:36 crc kubenswrapper[4669]: I1008 20:59:36.058311 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-nfnp7"] Oct 08 20:59:36 crc kubenswrapper[4669]: I1008 20:59:36.092546 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-zvpft"] Oct 08 20:59:36 crc kubenswrapper[4669]: E1008 20:59:36.092940 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b60a2dd2-7028-42a7-8ef9-38099b6f8817" containerName="dnsmasq-dns" Oct 08 20:59:36 crc kubenswrapper[4669]: I1008 20:59:36.092958 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="b60a2dd2-7028-42a7-8ef9-38099b6f8817" containerName="dnsmasq-dns" Oct 08 20:59:36 crc 
kubenswrapper[4669]: E1008 20:59:36.092986 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b60a2dd2-7028-42a7-8ef9-38099b6f8817" containerName="init"
Oct 08 20:59:36 crc kubenswrapper[4669]: I1008 20:59:36.092994 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="b60a2dd2-7028-42a7-8ef9-38099b6f8817" containerName="init"
Oct 08 20:59:36 crc kubenswrapper[4669]: I1008 20:59:36.093177 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="b60a2dd2-7028-42a7-8ef9-38099b6f8817" containerName="dnsmasq-dns"
Oct 08 20:59:36 crc kubenswrapper[4669]: I1008 20:59:36.094187 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-zvpft"
Oct 08 20:59:36 crc kubenswrapper[4669]: I1008 20:59:36.096645 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Oct 08 20:59:36 crc kubenswrapper[4669]: I1008 20:59:36.102571 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-zvpft"]
Oct 08 20:59:36 crc kubenswrapper[4669]: I1008 20:59:36.150183 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46627f9b-9655-4aa9-9f77-d387bbaddb80-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-zvpft\" (UID: \"46627f9b-9655-4aa9-9f77-d387bbaddb80\") " pod="openstack/dnsmasq-dns-8554648995-zvpft"
Oct 08 20:59:36 crc kubenswrapper[4669]: I1008 20:59:36.150266 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46627f9b-9655-4aa9-9f77-d387bbaddb80-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-zvpft\" (UID: \"46627f9b-9655-4aa9-9f77-d387bbaddb80\") " pod="openstack/dnsmasq-dns-8554648995-zvpft"
Oct 08 20:59:36 crc kubenswrapper[4669]: I1008 20:59:36.150310 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46627f9b-9655-4aa9-9f77-d387bbaddb80-dns-svc\") pod \"dnsmasq-dns-8554648995-zvpft\" (UID: \"46627f9b-9655-4aa9-9f77-d387bbaddb80\") " pod="openstack/dnsmasq-dns-8554648995-zvpft"
Oct 08 20:59:36 crc kubenswrapper[4669]: I1008 20:59:36.150501 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpkjg\" (UniqueName: \"kubernetes.io/projected/46627f9b-9655-4aa9-9f77-d387bbaddb80-kube-api-access-mpkjg\") pod \"dnsmasq-dns-8554648995-zvpft\" (UID: \"46627f9b-9655-4aa9-9f77-d387bbaddb80\") " pod="openstack/dnsmasq-dns-8554648995-zvpft"
Oct 08 20:59:36 crc kubenswrapper[4669]: I1008 20:59:36.150556 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46627f9b-9655-4aa9-9f77-d387bbaddb80-config\") pod \"dnsmasq-dns-8554648995-zvpft\" (UID: \"46627f9b-9655-4aa9-9f77-d387bbaddb80\") " pod="openstack/dnsmasq-dns-8554648995-zvpft"
Oct 08 20:59:36 crc kubenswrapper[4669]: I1008 20:59:36.251868 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46627f9b-9655-4aa9-9f77-d387bbaddb80-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-zvpft\" (UID: \"46627f9b-9655-4aa9-9f77-d387bbaddb80\") " pod="openstack/dnsmasq-dns-8554648995-zvpft"
Oct 08 20:59:36 crc kubenswrapper[4669]: I1008 20:59:36.252245 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46627f9b-9655-4aa9-9f77-d387bbaddb80-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-zvpft\" (UID: \"46627f9b-9655-4aa9-9f77-d387bbaddb80\") " pod="openstack/dnsmasq-dns-8554648995-zvpft"
Oct 08 20:59:36 crc kubenswrapper[4669]: I1008 20:59:36.252293 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46627f9b-9655-4aa9-9f77-d387bbaddb80-dns-svc\") pod \"dnsmasq-dns-8554648995-zvpft\" (UID: \"46627f9b-9655-4aa9-9f77-d387bbaddb80\") " pod="openstack/dnsmasq-dns-8554648995-zvpft"
Oct 08 20:59:36 crc kubenswrapper[4669]: I1008 20:59:36.252347 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpkjg\" (UniqueName: \"kubernetes.io/projected/46627f9b-9655-4aa9-9f77-d387bbaddb80-kube-api-access-mpkjg\") pod \"dnsmasq-dns-8554648995-zvpft\" (UID: \"46627f9b-9655-4aa9-9f77-d387bbaddb80\") " pod="openstack/dnsmasq-dns-8554648995-zvpft"
Oct 08 20:59:36 crc kubenswrapper[4669]: I1008 20:59:36.252371 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46627f9b-9655-4aa9-9f77-d387bbaddb80-config\") pod \"dnsmasq-dns-8554648995-zvpft\" (UID: \"46627f9b-9655-4aa9-9f77-d387bbaddb80\") " pod="openstack/dnsmasq-dns-8554648995-zvpft"
Oct 08 20:59:36 crc kubenswrapper[4669]: I1008 20:59:36.253206 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46627f9b-9655-4aa9-9f77-d387bbaddb80-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-zvpft\" (UID: \"46627f9b-9655-4aa9-9f77-d387bbaddb80\") " pod="openstack/dnsmasq-dns-8554648995-zvpft"
Oct 08 20:59:36 crc kubenswrapper[4669]: I1008 20:59:36.253208 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46627f9b-9655-4aa9-9f77-d387bbaddb80-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-zvpft\" (UID: \"46627f9b-9655-4aa9-9f77-d387bbaddb80\") " pod="openstack/dnsmasq-dns-8554648995-zvpft"
Oct 08 20:59:36 crc kubenswrapper[4669]: I1008 20:59:36.253310 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46627f9b-9655-4aa9-9f77-d387bbaddb80-dns-svc\") pod \"dnsmasq-dns-8554648995-zvpft\" (UID: \"46627f9b-9655-4aa9-9f77-d387bbaddb80\") " pod="openstack/dnsmasq-dns-8554648995-zvpft"
Oct 08 20:59:36 crc kubenswrapper[4669]: I1008 20:59:36.253391 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46627f9b-9655-4aa9-9f77-d387bbaddb80-config\") pod \"dnsmasq-dns-8554648995-zvpft\" (UID: \"46627f9b-9655-4aa9-9f77-d387bbaddb80\") " pod="openstack/dnsmasq-dns-8554648995-zvpft"
Oct 08 20:59:36 crc kubenswrapper[4669]: I1008 20:59:36.273175 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpkjg\" (UniqueName: \"kubernetes.io/projected/46627f9b-9655-4aa9-9f77-d387bbaddb80-kube-api-access-mpkjg\") pod \"dnsmasq-dns-8554648995-zvpft\" (UID: \"46627f9b-9655-4aa9-9f77-d387bbaddb80\") " pod="openstack/dnsmasq-dns-8554648995-zvpft"
Oct 08 20:59:36 crc kubenswrapper[4669]: I1008 20:59:36.417656 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-zvpft"
Oct 08 20:59:36 crc kubenswrapper[4669]: I1008 20:59:36.423114 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-wnkk4" event={"ID":"d1b7a290-4c6b-48ff-b23b-7fe2ba609dcc","Type":"ContainerStarted","Data":"f9243da7f40a4d5f1125246ffbe3ee74ddab53fa5701d190781d55e9b705d198"}
Oct 08 20:59:36 crc kubenswrapper[4669]: I1008 20:59:36.423149 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-wnkk4" event={"ID":"d1b7a290-4c6b-48ff-b23b-7fe2ba609dcc","Type":"ContainerStarted","Data":"b603f505f71141480061dce18b5aa974eb36a72148d7c285177be9a371e8ca8c"}
Oct 08 20:59:36 crc kubenswrapper[4669]: I1008 20:59:36.423321 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-wnkk4"
Oct 08 20:59:36 crc kubenswrapper[4669]: I1008 20:59:36.423402 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-wnkk4"
Oct 08 20:59:36 crc kubenswrapper[4669]: I1008 20:59:36.425973 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-5cgh5" event={"ID":"2032cb15-6b45-4eca-8ba6-ae485a95a244","Type":"ContainerStarted","Data":"7d8074bc0f9a97bc13e593a7e8b74a6b52399d7d4d7812866385710fa4922730"}
Oct 08 20:59:36 crc kubenswrapper[4669]: I1008 20:59:36.430347 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-nfnp7" podUID="f476231d-6e83-47b6-9fa4-4714af712b0c" containerName="dnsmasq-dns" containerID="cri-o://2a27b8d7ff338f3937d827fcec80d41c971e2d26a178085ea55139046fc14df1" gracePeriod=10
Oct 08 20:59:36 crc kubenswrapper[4669]: I1008 20:59:36.475824 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-wnkk4" podStartSLOduration=10.907645197 podStartE2EDuration="20.475805016s" podCreationTimestamp="2025-10-08 20:59:16 +0000 UTC" firstStartedPulling="2025-10-08 20:59:19.089630383 +0000 UTC m=+878.782441056" lastFinishedPulling="2025-10-08 20:59:28.657790212 +0000 UTC m=+888.350600875" observedRunningTime="2025-10-08 20:59:36.463967171 +0000 UTC m=+896.156777864" watchObservedRunningTime="2025-10-08 20:59:36.475805016 +0000 UTC m=+896.168615689"
Oct 08 20:59:36 crc kubenswrapper[4669]: I1008 20:59:36.494127 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bc7876d45-5cgh5" podStartSLOduration=16.494110189 podStartE2EDuration="16.494110189s" podCreationTimestamp="2025-10-08 20:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:59:36.489816321 +0000 UTC m=+896.182626984" watchObservedRunningTime="2025-10-08 20:59:36.494110189 +0000 UTC m=+896.186920862"
Oct 08 20:59:36 crc kubenswrapper[4669]: I1008 20:59:36.735492 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Oct 08 20:59:36 crc kubenswrapper[4669]: I1008 20:59:36.735869 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Oct 08 20:59:36 crc kubenswrapper[4669]: I1008 20:59:36.907031 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=12.643433252 podStartE2EDuration="16.907013451s" podCreationTimestamp="2025-10-08 20:59:20 +0000 UTC" firstStartedPulling="2025-10-08 20:59:29.526276777 +0000 UTC m=+889.219087480" lastFinishedPulling="2025-10-08 20:59:33.789857006 +0000 UTC m=+893.482667679" observedRunningTime="2025-10-08 20:59:36.512253568 +0000 UTC m=+896.205064251" watchObservedRunningTime="2025-10-08 20:59:36.907013451 +0000 UTC m=+896.599824114"
Oct 08 20:59:36 crc kubenswrapper[4669]: I1008 20:59:36.908362 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-zvpft"]
Oct 08 20:59:36 crc kubenswrapper[4669]: W1008 20:59:36.911881 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46627f9b_9655_4aa9_9f77_d387bbaddb80.slice/crio-591403def94c87c8e44871971154795b836acde7c1c01bdc0c3aaaebca1a68e7 WatchSource:0}: Error finding container 591403def94c87c8e44871971154795b836acde7c1c01bdc0c3aaaebca1a68e7: Status 404 returned error can't find the container with id 591403def94c87c8e44871971154795b836acde7c1c01bdc0c3aaaebca1a68e7
Oct 08 20:59:36 crc kubenswrapper[4669]: I1008 20:59:36.921365 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-nfnp7"
Oct 08 20:59:37 crc kubenswrapper[4669]: I1008 20:59:37.071224 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crrgz\" (UniqueName: \"kubernetes.io/projected/f476231d-6e83-47b6-9fa4-4714af712b0c-kube-api-access-crrgz\") pod \"f476231d-6e83-47b6-9fa4-4714af712b0c\" (UID: \"f476231d-6e83-47b6-9fa4-4714af712b0c\") "
Oct 08 20:59:37 crc kubenswrapper[4669]: I1008 20:59:37.071748 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f476231d-6e83-47b6-9fa4-4714af712b0c-dns-svc\") pod \"f476231d-6e83-47b6-9fa4-4714af712b0c\" (UID: \"f476231d-6e83-47b6-9fa4-4714af712b0c\") "
Oct 08 20:59:37 crc kubenswrapper[4669]: I1008 20:59:37.071852 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f476231d-6e83-47b6-9fa4-4714af712b0c-config\") pod \"f476231d-6e83-47b6-9fa4-4714af712b0c\" (UID: \"f476231d-6e83-47b6-9fa4-4714af712b0c\") "
Oct 08 20:59:37 crc kubenswrapper[4669]: I1008 20:59:37.076047 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f476231d-6e83-47b6-9fa4-4714af712b0c-kube-api-access-crrgz" (OuterVolumeSpecName: "kube-api-access-crrgz") pod "f476231d-6e83-47b6-9fa4-4714af712b0c" (UID: "f476231d-6e83-47b6-9fa4-4714af712b0c"). InnerVolumeSpecName "kube-api-access-crrgz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 20:59:37 crc kubenswrapper[4669]: I1008 20:59:37.102514 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Oct 08 20:59:37 crc kubenswrapper[4669]: I1008 20:59:37.114127 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f476231d-6e83-47b6-9fa4-4714af712b0c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f476231d-6e83-47b6-9fa4-4714af712b0c" (UID: "f476231d-6e83-47b6-9fa4-4714af712b0c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 20:59:37 crc kubenswrapper[4669]: I1008 20:59:37.117550 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f476231d-6e83-47b6-9fa4-4714af712b0c-config" (OuterVolumeSpecName: "config") pod "f476231d-6e83-47b6-9fa4-4714af712b0c" (UID: "f476231d-6e83-47b6-9fa4-4714af712b0c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 20:59:37 crc kubenswrapper[4669]: I1008 20:59:37.147908 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Oct 08 20:59:37 crc kubenswrapper[4669]: I1008 20:59:37.173982 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f476231d-6e83-47b6-9fa4-4714af712b0c-config\") on node \"crc\" DevicePath \"\""
Oct 08 20:59:37 crc kubenswrapper[4669]: I1008 20:59:37.174025 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crrgz\" (UniqueName: \"kubernetes.io/projected/f476231d-6e83-47b6-9fa4-4714af712b0c-kube-api-access-crrgz\") on node \"crc\" DevicePath \"\""
Oct 08 20:59:37 crc kubenswrapper[4669]: I1008 20:59:37.174040 4669 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f476231d-6e83-47b6-9fa4-4714af712b0c-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 08 20:59:37 crc kubenswrapper[4669]: I1008 20:59:37.435008 4669 generic.go:334] "Generic (PLEG): container finished" podID="46627f9b-9655-4aa9-9f77-d387bbaddb80" containerID="b25d0e977012df24b516d85b2957739340eed33492713333bf245fa241329eb4" exitCode=0
Oct 08 20:59:37 crc kubenswrapper[4669]: I1008 20:59:37.435094 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-zvpft" event={"ID":"46627f9b-9655-4aa9-9f77-d387bbaddb80","Type":"ContainerDied","Data":"b25d0e977012df24b516d85b2957739340eed33492713333bf245fa241329eb4"}
Oct 08 20:59:37 crc kubenswrapper[4669]: I1008 20:59:37.435513 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-zvpft" event={"ID":"46627f9b-9655-4aa9-9f77-d387bbaddb80","Type":"ContainerStarted","Data":"591403def94c87c8e44871971154795b836acde7c1c01bdc0c3aaaebca1a68e7"}
Oct 08 20:59:37 crc kubenswrapper[4669]: I1008 20:59:37.438457 4669 generic.go:334] "Generic (PLEG): container finished" podID="f476231d-6e83-47b6-9fa4-4714af712b0c" containerID="2a27b8d7ff338f3937d827fcec80d41c971e2d26a178085ea55139046fc14df1" exitCode=0
Oct 08 20:59:37 crc kubenswrapper[4669]: I1008 20:59:37.438517 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-nfnp7"
Oct 08 20:59:37 crc kubenswrapper[4669]: I1008 20:59:37.438616 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-nfnp7" event={"ID":"f476231d-6e83-47b6-9fa4-4714af712b0c","Type":"ContainerDied","Data":"2a27b8d7ff338f3937d827fcec80d41c971e2d26a178085ea55139046fc14df1"}
Oct 08 20:59:37 crc kubenswrapper[4669]: I1008 20:59:37.438654 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-nfnp7" event={"ID":"f476231d-6e83-47b6-9fa4-4714af712b0c","Type":"ContainerDied","Data":"ab7e4a1d6e3889140728c8cfb251d30127caf71b83c2427e3c2890b98d3f2970"}
Oct 08 20:59:37 crc kubenswrapper[4669]: I1008 20:59:37.438681 4669 scope.go:117] "RemoveContainer" containerID="2a27b8d7ff338f3937d827fcec80d41c971e2d26a178085ea55139046fc14df1"
Oct 08 20:59:37 crc kubenswrapper[4669]: I1008 20:59:37.439724 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Oct 08 20:59:37 crc kubenswrapper[4669]: I1008 20:59:37.439765 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bc7876d45-5cgh5"
Oct 08 20:59:37 crc kubenswrapper[4669]: I1008 20:59:37.571966 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-nfnp7"]
Oct 08 20:59:37 crc kubenswrapper[4669]: I1008 20:59:37.576325 4669 scope.go:117] "RemoveContainer" containerID="cdac45d16a2827604a6475332032ea2893e79804f8116aeee4188c2489bb56cf"
Oct 08 20:59:37 crc kubenswrapper[4669]: I1008 20:59:37.583726 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-nfnp7"]
Oct 08 20:59:37 crc kubenswrapper[4669]: I1008 20:59:37.595562 4669 scope.go:117] "RemoveContainer" containerID="2a27b8d7ff338f3937d827fcec80d41c971e2d26a178085ea55139046fc14df1"
Oct 08 20:59:37 crc kubenswrapper[4669]: E1008 20:59:37.596200 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a27b8d7ff338f3937d827fcec80d41c971e2d26a178085ea55139046fc14df1\": container with ID starting with 2a27b8d7ff338f3937d827fcec80d41c971e2d26a178085ea55139046fc14df1 not found: ID does not exist" containerID="2a27b8d7ff338f3937d827fcec80d41c971e2d26a178085ea55139046fc14df1"
Oct 08 20:59:37 crc kubenswrapper[4669]: I1008 20:59:37.596252 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a27b8d7ff338f3937d827fcec80d41c971e2d26a178085ea55139046fc14df1"} err="failed to get container status \"2a27b8d7ff338f3937d827fcec80d41c971e2d26a178085ea55139046fc14df1\": rpc error: code = NotFound desc = could not find container \"2a27b8d7ff338f3937d827fcec80d41c971e2d26a178085ea55139046fc14df1\": container with ID starting with 2a27b8d7ff338f3937d827fcec80d41c971e2d26a178085ea55139046fc14df1 not found: ID does not exist"
Oct 08 20:59:37 crc kubenswrapper[4669]: I1008 20:59:37.596290 4669 scope.go:117] "RemoveContainer" containerID="cdac45d16a2827604a6475332032ea2893e79804f8116aeee4188c2489bb56cf"
Oct 08 20:59:37 crc kubenswrapper[4669]: E1008 20:59:37.596689 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdac45d16a2827604a6475332032ea2893e79804f8116aeee4188c2489bb56cf\": container with ID starting with cdac45d16a2827604a6475332032ea2893e79804f8116aeee4188c2489bb56cf not found: ID does not exist" containerID="cdac45d16a2827604a6475332032ea2893e79804f8116aeee4188c2489bb56cf"
Oct 08 20:59:37 crc kubenswrapper[4669]: I1008 20:59:37.596717 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdac45d16a2827604a6475332032ea2893e79804f8116aeee4188c2489bb56cf"} err="failed to get container status \"cdac45d16a2827604a6475332032ea2893e79804f8116aeee4188c2489bb56cf\": rpc error: code = NotFound desc = could not find container \"cdac45d16a2827604a6475332032ea2893e79804f8116aeee4188c2489bb56cf\": container with ID starting with cdac45d16a2827604a6475332032ea2893e79804f8116aeee4188c2489bb56cf not found: ID does not exist"
Oct 08 20:59:38 crc kubenswrapper[4669]: I1008 20:59:38.454781 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-zvpft" event={"ID":"46627f9b-9655-4aa9-9f77-d387bbaddb80","Type":"ContainerStarted","Data":"0288cb4ce9464faa754b34ce962796b2c4180c3733249b18e498b0bc921919c3"}
Oct 08 20:59:38 crc kubenswrapper[4669]: I1008 20:59:38.455231 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-zvpft"
Oct 08 20:59:38 crc kubenswrapper[4669]: I1008 20:59:38.487120 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-zvpft" podStartSLOduration=2.487094708 podStartE2EDuration="2.487094708s" podCreationTimestamp="2025-10-08 20:59:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:59:38.482326237 +0000 UTC m=+898.175137010" watchObservedRunningTime="2025-10-08 20:59:38.487094708 +0000 UTC m=+898.179905421"
Oct 08 20:59:39 crc kubenswrapper[4669]: I1008 20:59:39.166450 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Oct 08 20:59:39 crc kubenswrapper[4669]: I1008 20:59:39.348505 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f476231d-6e83-47b6-9fa4-4714af712b0c" path="/var/lib/kubelet/pods/f476231d-6e83-47b6-9fa4-4714af712b0c/volumes"
Oct 08 20:59:39 crc kubenswrapper[4669]: I1008 20:59:39.474246 4669 generic.go:334] "Generic (PLEG): container finished" podID="33dac706-1170-45a3-8151-a6ee9bce8005" containerID="043ce444338f037471ba1459b3bb0bb2d1d59a5079caf9fc3b1ac81599c388bb" exitCode=0
Oct 08 20:59:39 crc kubenswrapper[4669]: I1008 20:59:39.474312 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"33dac706-1170-45a3-8151-a6ee9bce8005","Type":"ContainerDied","Data":"043ce444338f037471ba1459b3bb0bb2d1d59a5079caf9fc3b1ac81599c388bb"}
Oct 08 20:59:39 crc kubenswrapper[4669]: I1008 20:59:39.478413 4669 generic.go:334] "Generic (PLEG): container finished" podID="53fa562b-eb62-487e-8b82-3da0799fae19" containerID="6653c1d3072e09074b40aefb4ef4dfd89410cf77e3c6890e952576e17366bfeb" exitCode=0
Oct 08 20:59:39 crc kubenswrapper[4669]: I1008 20:59:39.478961 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"53fa562b-eb62-487e-8b82-3da0799fae19","Type":"ContainerDied","Data":"6653c1d3072e09074b40aefb4ef4dfd89410cf77e3c6890e952576e17366bfeb"}
Oct 08 20:59:39 crc kubenswrapper[4669]: I1008 20:59:39.770996 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Oct 08 20:59:40 crc kubenswrapper[4669]: I1008 20:59:40.487742 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"53fa562b-eb62-487e-8b82-3da0799fae19","Type":"ContainerStarted","Data":"784eabcb4f7eb0ef64dd701073c01f97891e66d4804608b5db42b7f229e362cf"}
Oct 08 20:59:40 crc kubenswrapper[4669]: I1008 20:59:40.489599 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"33dac706-1170-45a3-8151-a6ee9bce8005","Type":"ContainerStarted","Data":"063ee05b17efc5156a2b5728975ecbcf6c59ca1976ba00d7ed0fd9a17d8744ff"}
Oct 08 20:59:40 crc kubenswrapper[4669]: I1008 20:59:40.531190 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=20.890984304 podStartE2EDuration="30.531163122s" podCreationTimestamp="2025-10-08 20:59:10 +0000 UTC" firstStartedPulling="2025-10-08 20:59:19.100819881 +0000 UTC m=+878.793630554" lastFinishedPulling="2025-10-08 20:59:28.740998679 +0000 UTC m=+888.433809372" observedRunningTime="2025-10-08 20:59:40.520115778 +0000 UTC m=+900.212926491" watchObservedRunningTime="2025-10-08 20:59:40.531163122 +0000 UTC m=+900.223973835"
Oct 08 20:59:40 crc kubenswrapper[4669]: I1008 20:59:40.556699 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=20.845866071 podStartE2EDuration="31.556674083s" podCreationTimestamp="2025-10-08 20:59:09 +0000 UTC" firstStartedPulling="2025-10-08 20:59:19.089148839 +0000 UTC m=+878.781959512" lastFinishedPulling="2025-10-08 20:59:29.799956841 +0000 UTC m=+889.492767524" observedRunningTime="2025-10-08 20:59:40.553183136 +0000 UTC m=+900.245993879" watchObservedRunningTime="2025-10-08 20:59:40.556674083 +0000 UTC m=+900.249484766"
Oct 08 20:59:40 crc kubenswrapper[4669]: I1008 20:59:40.736754 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bc7876d45-5cgh5"
Oct 08 20:59:41 crc kubenswrapper[4669]: I1008 20:59:41.117025 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Oct 08 20:59:41 crc kubenswrapper[4669]: I1008 20:59:41.117070 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Oct 08 20:59:41 crc kubenswrapper[4669]: I1008 20:59:41.450384 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Oct 08 20:59:41 crc kubenswrapper[4669]: I1008 20:59:41.450451 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Oct 08 20:59:41 crc kubenswrapper[4669]: I1008 20:59:41.787933 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Oct 08 20:59:41 crc kubenswrapper[4669]: I1008 20:59:41.963191 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Oct 08 20:59:42 crc kubenswrapper[4669]: I1008 20:59:42.045155 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Oct 08 20:59:42 crc kubenswrapper[4669]: E1008 20:59:42.045555 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f476231d-6e83-47b6-9fa4-4714af712b0c" containerName="dnsmasq-dns"
Oct 08 20:59:42 crc kubenswrapper[4669]: I1008 20:59:42.045580 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="f476231d-6e83-47b6-9fa4-4714af712b0c" containerName="dnsmasq-dns"
Oct 08 20:59:42 crc kubenswrapper[4669]: E1008 20:59:42.045618 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f476231d-6e83-47b6-9fa4-4714af712b0c" containerName="init"
Oct 08 20:59:42 crc kubenswrapper[4669]: I1008 20:59:42.045627 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="f476231d-6e83-47b6-9fa4-4714af712b0c" containerName="init"
Oct 08 20:59:42 crc kubenswrapper[4669]: I1008 20:59:42.045826 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="f476231d-6e83-47b6-9fa4-4714af712b0c" containerName="dnsmasq-dns"
Oct 08 20:59:42 crc kubenswrapper[4669]: I1008 20:59:42.046909 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Oct 08 20:59:42 crc kubenswrapper[4669]: I1008 20:59:42.049136 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Oct 08 20:59:42 crc kubenswrapper[4669]: I1008 20:59:42.049148 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs"
Oct 08 20:59:42 crc kubenswrapper[4669]: I1008 20:59:42.049460 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Oct 08 20:59:42 crc kubenswrapper[4669]: I1008 20:59:42.049626 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-zjz4h"
Oct 08 20:59:42 crc kubenswrapper[4669]: I1008 20:59:42.057840 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Oct 08 20:59:42 crc kubenswrapper[4669]: I1008 20:59:42.170970 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/685f13e1-1e56-46fb-b0b4-d850050411d7-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"685f13e1-1e56-46fb-b0b4-d850050411d7\") " pod="openstack/ovn-northd-0"
Oct 08 20:59:42 crc kubenswrapper[4669]: I1008 20:59:42.171006 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/685f13e1-1e56-46fb-b0b4-d850050411d7-config\") pod \"ovn-northd-0\" (UID: \"685f13e1-1e56-46fb-b0b4-d850050411d7\") " pod="openstack/ovn-northd-0"
Oct 08 20:59:42 crc kubenswrapper[4669]: I1008 20:59:42.171031 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64qqs\" (UniqueName: \"kubernetes.io/projected/685f13e1-1e56-46fb-b0b4-d850050411d7-kube-api-access-64qqs\") pod \"ovn-northd-0\" (UID: \"685f13e1-1e56-46fb-b0b4-d850050411d7\") " pod="openstack/ovn-northd-0"
Oct 08 20:59:42 crc kubenswrapper[4669]: I1008 20:59:42.171051 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/685f13e1-1e56-46fb-b0b4-d850050411d7-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"685f13e1-1e56-46fb-b0b4-d850050411d7\") " pod="openstack/ovn-northd-0"
Oct 08 20:59:42 crc kubenswrapper[4669]: I1008 20:59:42.171914 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/685f13e1-1e56-46fb-b0b4-d850050411d7-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"685f13e1-1e56-46fb-b0b4-d850050411d7\") " pod="openstack/ovn-northd-0"
Oct 08 20:59:42 crc kubenswrapper[4669]: I1008 20:59:42.172053 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/685f13e1-1e56-46fb-b0b4-d850050411d7-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"685f13e1-1e56-46fb-b0b4-d850050411d7\") " pod="openstack/ovn-northd-0"
Oct 08 20:59:42 crc kubenswrapper[4669]: I1008 20:59:42.172256 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/685f13e1-1e56-46fb-b0b4-d850050411d7-scripts\") pod \"ovn-northd-0\" (UID: \"685f13e1-1e56-46fb-b0b4-d850050411d7\") " pod="openstack/ovn-northd-0"
Oct 08 20:59:42 crc kubenswrapper[4669]: I1008 20:59:42.274263 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/685f13e1-1e56-46fb-b0b4-d850050411d7-scripts\") pod \"ovn-northd-0\" (UID: \"685f13e1-1e56-46fb-b0b4-d850050411d7\") " pod="openstack/ovn-northd-0"
Oct 08 20:59:42 crc kubenswrapper[4669]: I1008 20:59:42.274353 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/685f13e1-1e56-46fb-b0b4-d850050411d7-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"685f13e1-1e56-46fb-b0b4-d850050411d7\") " pod="openstack/ovn-northd-0"
Oct 08 20:59:42 crc kubenswrapper[4669]: I1008 20:59:42.274377 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/685f13e1-1e56-46fb-b0b4-d850050411d7-config\") pod \"ovn-northd-0\" (UID: \"685f13e1-1e56-46fb-b0b4-d850050411d7\") " pod="openstack/ovn-northd-0"
Oct 08 20:59:42 crc kubenswrapper[4669]: I1008 20:59:42.274407 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64qqs\" (UniqueName: \"kubernetes.io/projected/685f13e1-1e56-46fb-b0b4-d850050411d7-kube-api-access-64qqs\") pod \"ovn-northd-0\" (UID: \"685f13e1-1e56-46fb-b0b4-d850050411d7\") " pod="openstack/ovn-northd-0"
Oct 08 20:59:42 crc kubenswrapper[4669]: I1008 20:59:42.274429 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/685f13e1-1e56-46fb-b0b4-d850050411d7-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"685f13e1-1e56-46fb-b0b4-d850050411d7\") " pod="openstack/ovn-northd-0"
Oct 08 20:59:42 crc kubenswrapper[4669]: I1008 20:59:42.274481 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/685f13e1-1e56-46fb-b0b4-d850050411d7-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"685f13e1-1e56-46fb-b0b4-d850050411d7\") " pod="openstack/ovn-northd-0"
Oct 08 20:59:42 crc kubenswrapper[4669]: I1008 20:59:42.274516 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/685f13e1-1e56-46fb-b0b4-d850050411d7-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"685f13e1-1e56-46fb-b0b4-d850050411d7\") " pod="openstack/ovn-northd-0"
Oct 08 20:59:42 crc kubenswrapper[4669]: I1008 20:59:42.275168 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/685f13e1-1e56-46fb-b0b4-d850050411d7-scripts\") pod \"ovn-northd-0\" (UID: \"685f13e1-1e56-46fb-b0b4-d850050411d7\") " pod="openstack/ovn-northd-0"
Oct 08 20:59:42 crc kubenswrapper[4669]: I1008 20:59:42.275269 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/685f13e1-1e56-46fb-b0b4-d850050411d7-config\") pod \"ovn-northd-0\" (UID: \"685f13e1-1e56-46fb-b0b4-d850050411d7\") " pod="openstack/ovn-northd-0"
Oct 08 20:59:42 crc kubenswrapper[4669]: I1008 20:59:42.275714 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/685f13e1-1e56-46fb-b0b4-d850050411d7-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"685f13e1-1e56-46fb-b0b4-d850050411d7\") " pod="openstack/ovn-northd-0"
Oct 08 20:59:42 crc kubenswrapper[4669]: I1008 20:59:42.280298 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/685f13e1-1e56-46fb-b0b4-d850050411d7-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"685f13e1-1e56-46fb-b0b4-d850050411d7\") " pod="openstack/ovn-northd-0"
Oct 08 20:59:42 crc kubenswrapper[4669]: I1008 20:59:42.280889 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/685f13e1-1e56-46fb-b0b4-d850050411d7-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"685f13e1-1e56-46fb-b0b4-d850050411d7\") " pod="openstack/ovn-northd-0"
Oct 08 20:59:42 crc kubenswrapper[4669]: I1008 20:59:42.286580 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/685f13e1-1e56-46fb-b0b4-d850050411d7-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"685f13e1-1e56-46fb-b0b4-d850050411d7\") " pod="openstack/ovn-northd-0"
Oct 08 20:59:42 crc kubenswrapper[4669]: I1008 20:59:42.294327 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64qqs\" (UniqueName: \"kubernetes.io/projected/685f13e1-1e56-46fb-b0b4-d850050411d7-kube-api-access-64qqs\") pod \"ovn-northd-0\" (UID: \"685f13e1-1e56-46fb-b0b4-d850050411d7\") " pod="openstack/ovn-northd-0"
Oct 08 20:59:42 crc kubenswrapper[4669]: I1008 20:59:42.375791 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Oct 08 20:59:42 crc kubenswrapper[4669]: I1008 20:59:42.855454 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Oct 08 20:59:43 crc kubenswrapper[4669]: I1008 20:59:43.185986 4669 patch_prober.go:28] interesting pod/machine-config-daemon-hw2kf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 08 20:59:43 crc kubenswrapper[4669]: I1008 20:59:43.186810 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 08 20:59:43 crc kubenswrapper[4669]: I1008 20:59:43.541328 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"685f13e1-1e56-46fb-b0b4-d850050411d7","Type":"ContainerStarted","Data":"97a496d47d551c42914b4382a298cafa2a08f73e5660168b8b8427994da9daa4"}
Oct 08 20:59:43 crc kubenswrapper[4669]: I1008 20:59:43.793935 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-zvpft"]
Oct 08 20:59:43 crc kubenswrapper[4669]: I1008 20:59:43.794743 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-zvpft" podUID="46627f9b-9655-4aa9-9f77-d387bbaddb80" containerName="dnsmasq-dns" containerID="cri-o://0288cb4ce9464faa754b34ce962796b2c4180c3733249b18e498b0bc921919c3" gracePeriod=10
Oct 08 20:59:43 crc kubenswrapper[4669]: I1008 20:59:43.796699 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8554648995-zvpft"
Oct 08 20:59:43 crc kubenswrapper[4669]: I1008 20:59:43.846355 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-shjxs"]
Oct 08 20:59:43 crc kubenswrapper[4669]: I1008 20:59:43.847666 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-shjxs"
Oct 08 20:59:43 crc kubenswrapper[4669]: I1008 20:59:43.887497 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-shjxs"]
Oct 08 20:59:43 crc kubenswrapper[4669]: I1008 20:59:43.903694 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c11ff661-d7fa-45fa-bec1-555472ca36e7-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-shjxs\" (UID: \"c11ff661-d7fa-45fa-bec1-555472ca36e7\") " pod="openstack/dnsmasq-dns-b8fbc5445-shjxs"
Oct 08 20:59:43 crc kubenswrapper[4669]: I1008 20:59:43.903742 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fs78p\" (UniqueName: \"kubernetes.io/projected/c11ff661-d7fa-45fa-bec1-555472ca36e7-kube-api-access-fs78p\") pod \"dnsmasq-dns-b8fbc5445-shjxs\" (UID: \"c11ff661-d7fa-45fa-bec1-555472ca36e7\") " pod="openstack/dnsmasq-dns-b8fbc5445-shjxs"
Oct 08 20:59:43 crc kubenswrapper[4669]: I1008
20:59:43.903782 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c11ff661-d7fa-45fa-bec1-555472ca36e7-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-shjxs\" (UID: \"c11ff661-d7fa-45fa-bec1-555472ca36e7\") " pod="openstack/dnsmasq-dns-b8fbc5445-shjxs" Oct 08 20:59:43 crc kubenswrapper[4669]: I1008 20:59:43.903826 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c11ff661-d7fa-45fa-bec1-555472ca36e7-config\") pod \"dnsmasq-dns-b8fbc5445-shjxs\" (UID: \"c11ff661-d7fa-45fa-bec1-555472ca36e7\") " pod="openstack/dnsmasq-dns-b8fbc5445-shjxs" Oct 08 20:59:43 crc kubenswrapper[4669]: I1008 20:59:43.903875 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c11ff661-d7fa-45fa-bec1-555472ca36e7-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-shjxs\" (UID: \"c11ff661-d7fa-45fa-bec1-555472ca36e7\") " pod="openstack/dnsmasq-dns-b8fbc5445-shjxs" Oct 08 20:59:44 crc kubenswrapper[4669]: I1008 20:59:44.005567 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c11ff661-d7fa-45fa-bec1-555472ca36e7-config\") pod \"dnsmasq-dns-b8fbc5445-shjxs\" (UID: \"c11ff661-d7fa-45fa-bec1-555472ca36e7\") " pod="openstack/dnsmasq-dns-b8fbc5445-shjxs" Oct 08 20:59:44 crc kubenswrapper[4669]: I1008 20:59:44.005663 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c11ff661-d7fa-45fa-bec1-555472ca36e7-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-shjxs\" (UID: \"c11ff661-d7fa-45fa-bec1-555472ca36e7\") " pod="openstack/dnsmasq-dns-b8fbc5445-shjxs" Oct 08 20:59:44 crc kubenswrapper[4669]: I1008 20:59:44.005730 4669 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c11ff661-d7fa-45fa-bec1-555472ca36e7-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-shjxs\" (UID: \"c11ff661-d7fa-45fa-bec1-555472ca36e7\") " pod="openstack/dnsmasq-dns-b8fbc5445-shjxs" Oct 08 20:59:44 crc kubenswrapper[4669]: I1008 20:59:44.005760 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fs78p\" (UniqueName: \"kubernetes.io/projected/c11ff661-d7fa-45fa-bec1-555472ca36e7-kube-api-access-fs78p\") pod \"dnsmasq-dns-b8fbc5445-shjxs\" (UID: \"c11ff661-d7fa-45fa-bec1-555472ca36e7\") " pod="openstack/dnsmasq-dns-b8fbc5445-shjxs" Oct 08 20:59:44 crc kubenswrapper[4669]: I1008 20:59:44.005809 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c11ff661-d7fa-45fa-bec1-555472ca36e7-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-shjxs\" (UID: \"c11ff661-d7fa-45fa-bec1-555472ca36e7\") " pod="openstack/dnsmasq-dns-b8fbc5445-shjxs" Oct 08 20:59:44 crc kubenswrapper[4669]: I1008 20:59:44.006500 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c11ff661-d7fa-45fa-bec1-555472ca36e7-config\") pod \"dnsmasq-dns-b8fbc5445-shjxs\" (UID: \"c11ff661-d7fa-45fa-bec1-555472ca36e7\") " pod="openstack/dnsmasq-dns-b8fbc5445-shjxs" Oct 08 20:59:44 crc kubenswrapper[4669]: I1008 20:59:44.006772 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c11ff661-d7fa-45fa-bec1-555472ca36e7-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-shjxs\" (UID: \"c11ff661-d7fa-45fa-bec1-555472ca36e7\") " pod="openstack/dnsmasq-dns-b8fbc5445-shjxs" Oct 08 20:59:44 crc kubenswrapper[4669]: I1008 20:59:44.007054 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/c11ff661-d7fa-45fa-bec1-555472ca36e7-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-shjxs\" (UID: \"c11ff661-d7fa-45fa-bec1-555472ca36e7\") " pod="openstack/dnsmasq-dns-b8fbc5445-shjxs" Oct 08 20:59:44 crc kubenswrapper[4669]: I1008 20:59:44.007551 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c11ff661-d7fa-45fa-bec1-555472ca36e7-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-shjxs\" (UID: \"c11ff661-d7fa-45fa-bec1-555472ca36e7\") " pod="openstack/dnsmasq-dns-b8fbc5445-shjxs" Oct 08 20:59:44 crc kubenswrapper[4669]: I1008 20:59:44.028718 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fs78p\" (UniqueName: \"kubernetes.io/projected/c11ff661-d7fa-45fa-bec1-555472ca36e7-kube-api-access-fs78p\") pod \"dnsmasq-dns-b8fbc5445-shjxs\" (UID: \"c11ff661-d7fa-45fa-bec1-555472ca36e7\") " pod="openstack/dnsmasq-dns-b8fbc5445-shjxs" Oct 08 20:59:44 crc kubenswrapper[4669]: I1008 20:59:44.259341 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-shjxs" Oct 08 20:59:44 crc kubenswrapper[4669]: I1008 20:59:44.467757 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-zvpft" Oct 08 20:59:44 crc kubenswrapper[4669]: I1008 20:59:44.514662 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpkjg\" (UniqueName: \"kubernetes.io/projected/46627f9b-9655-4aa9-9f77-d387bbaddb80-kube-api-access-mpkjg\") pod \"46627f9b-9655-4aa9-9f77-d387bbaddb80\" (UID: \"46627f9b-9655-4aa9-9f77-d387bbaddb80\") " Oct 08 20:59:44 crc kubenswrapper[4669]: I1008 20:59:44.514731 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46627f9b-9655-4aa9-9f77-d387bbaddb80-config\") pod \"46627f9b-9655-4aa9-9f77-d387bbaddb80\" (UID: \"46627f9b-9655-4aa9-9f77-d387bbaddb80\") " Oct 08 20:59:44 crc kubenswrapper[4669]: I1008 20:59:44.514779 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46627f9b-9655-4aa9-9f77-d387bbaddb80-ovsdbserver-sb\") pod \"46627f9b-9655-4aa9-9f77-d387bbaddb80\" (UID: \"46627f9b-9655-4aa9-9f77-d387bbaddb80\") " Oct 08 20:59:44 crc kubenswrapper[4669]: I1008 20:59:44.514813 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46627f9b-9655-4aa9-9f77-d387bbaddb80-dns-svc\") pod \"46627f9b-9655-4aa9-9f77-d387bbaddb80\" (UID: \"46627f9b-9655-4aa9-9f77-d387bbaddb80\") " Oct 08 20:59:44 crc kubenswrapper[4669]: I1008 20:59:44.514881 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46627f9b-9655-4aa9-9f77-d387bbaddb80-ovsdbserver-nb\") pod \"46627f9b-9655-4aa9-9f77-d387bbaddb80\" (UID: \"46627f9b-9655-4aa9-9f77-d387bbaddb80\") " Oct 08 20:59:44 crc kubenswrapper[4669]: I1008 20:59:44.520240 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/46627f9b-9655-4aa9-9f77-d387bbaddb80-kube-api-access-mpkjg" (OuterVolumeSpecName: "kube-api-access-mpkjg") pod "46627f9b-9655-4aa9-9f77-d387bbaddb80" (UID: "46627f9b-9655-4aa9-9f77-d387bbaddb80"). InnerVolumeSpecName "kube-api-access-mpkjg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:59:44 crc kubenswrapper[4669]: I1008 20:59:44.561927 4669 generic.go:334] "Generic (PLEG): container finished" podID="46627f9b-9655-4aa9-9f77-d387bbaddb80" containerID="0288cb4ce9464faa754b34ce962796b2c4180c3733249b18e498b0bc921919c3" exitCode=0 Oct 08 20:59:44 crc kubenswrapper[4669]: I1008 20:59:44.562016 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-zvpft" event={"ID":"46627f9b-9655-4aa9-9f77-d387bbaddb80","Type":"ContainerDied","Data":"0288cb4ce9464faa754b34ce962796b2c4180c3733249b18e498b0bc921919c3"} Oct 08 20:59:44 crc kubenswrapper[4669]: I1008 20:59:44.562084 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-zvpft" event={"ID":"46627f9b-9655-4aa9-9f77-d387bbaddb80","Type":"ContainerDied","Data":"591403def94c87c8e44871971154795b836acde7c1c01bdc0c3aaaebca1a68e7"} Oct 08 20:59:44 crc kubenswrapper[4669]: I1008 20:59:44.562145 4669 scope.go:117] "RemoveContainer" containerID="0288cb4ce9464faa754b34ce962796b2c4180c3733249b18e498b0bc921919c3" Oct 08 20:59:44 crc kubenswrapper[4669]: I1008 20:59:44.562375 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-zvpft" Oct 08 20:59:44 crc kubenswrapper[4669]: I1008 20:59:44.577667 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46627f9b-9655-4aa9-9f77-d387bbaddb80-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "46627f9b-9655-4aa9-9f77-d387bbaddb80" (UID: "46627f9b-9655-4aa9-9f77-d387bbaddb80"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:59:44 crc kubenswrapper[4669]: I1008 20:59:44.600328 4669 scope.go:117] "RemoveContainer" containerID="b25d0e977012df24b516d85b2957739340eed33492713333bf245fa241329eb4" Oct 08 20:59:44 crc kubenswrapper[4669]: I1008 20:59:44.604284 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46627f9b-9655-4aa9-9f77-d387bbaddb80-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "46627f9b-9655-4aa9-9f77-d387bbaddb80" (UID: "46627f9b-9655-4aa9-9f77-d387bbaddb80"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:59:44 crc kubenswrapper[4669]: I1008 20:59:44.615765 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46627f9b-9655-4aa9-9f77-d387bbaddb80-config" (OuterVolumeSpecName: "config") pod "46627f9b-9655-4aa9-9f77-d387bbaddb80" (UID: "46627f9b-9655-4aa9-9f77-d387bbaddb80"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:59:44 crc kubenswrapper[4669]: I1008 20:59:44.616899 4669 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46627f9b-9655-4aa9-9f77-d387bbaddb80-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 20:59:44 crc kubenswrapper[4669]: I1008 20:59:44.616915 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpkjg\" (UniqueName: \"kubernetes.io/projected/46627f9b-9655-4aa9-9f77-d387bbaddb80-kube-api-access-mpkjg\") on node \"crc\" DevicePath \"\"" Oct 08 20:59:44 crc kubenswrapper[4669]: I1008 20:59:44.616925 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46627f9b-9655-4aa9-9f77-d387bbaddb80-config\") on node \"crc\" DevicePath \"\"" Oct 08 20:59:44 crc kubenswrapper[4669]: I1008 20:59:44.616934 4669 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46627f9b-9655-4aa9-9f77-d387bbaddb80-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 20:59:44 crc kubenswrapper[4669]: I1008 20:59:44.620287 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46627f9b-9655-4aa9-9f77-d387bbaddb80-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "46627f9b-9655-4aa9-9f77-d387bbaddb80" (UID: "46627f9b-9655-4aa9-9f77-d387bbaddb80"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:59:44 crc kubenswrapper[4669]: I1008 20:59:44.627935 4669 scope.go:117] "RemoveContainer" containerID="0288cb4ce9464faa754b34ce962796b2c4180c3733249b18e498b0bc921919c3" Oct 08 20:59:44 crc kubenswrapper[4669]: E1008 20:59:44.628579 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0288cb4ce9464faa754b34ce962796b2c4180c3733249b18e498b0bc921919c3\": container with ID starting with 0288cb4ce9464faa754b34ce962796b2c4180c3733249b18e498b0bc921919c3 not found: ID does not exist" containerID="0288cb4ce9464faa754b34ce962796b2c4180c3733249b18e498b0bc921919c3" Oct 08 20:59:44 crc kubenswrapper[4669]: I1008 20:59:44.628624 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0288cb4ce9464faa754b34ce962796b2c4180c3733249b18e498b0bc921919c3"} err="failed to get container status \"0288cb4ce9464faa754b34ce962796b2c4180c3733249b18e498b0bc921919c3\": rpc error: code = NotFound desc = could not find container \"0288cb4ce9464faa754b34ce962796b2c4180c3733249b18e498b0bc921919c3\": container with ID starting with 0288cb4ce9464faa754b34ce962796b2c4180c3733249b18e498b0bc921919c3 not found: ID does not exist" Oct 08 20:59:44 crc kubenswrapper[4669]: I1008 20:59:44.628661 4669 scope.go:117] "RemoveContainer" containerID="b25d0e977012df24b516d85b2957739340eed33492713333bf245fa241329eb4" Oct 08 20:59:44 crc kubenswrapper[4669]: E1008 20:59:44.629546 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b25d0e977012df24b516d85b2957739340eed33492713333bf245fa241329eb4\": container with ID starting with b25d0e977012df24b516d85b2957739340eed33492713333bf245fa241329eb4 not found: ID does not exist" containerID="b25d0e977012df24b516d85b2957739340eed33492713333bf245fa241329eb4" Oct 08 20:59:44 crc kubenswrapper[4669]: I1008 20:59:44.629589 
4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b25d0e977012df24b516d85b2957739340eed33492713333bf245fa241329eb4"} err="failed to get container status \"b25d0e977012df24b516d85b2957739340eed33492713333bf245fa241329eb4\": rpc error: code = NotFound desc = could not find container \"b25d0e977012df24b516d85b2957739340eed33492713333bf245fa241329eb4\": container with ID starting with b25d0e977012df24b516d85b2957739340eed33492713333bf245fa241329eb4 not found: ID does not exist" Oct 08 20:59:44 crc kubenswrapper[4669]: I1008 20:59:44.718326 4669 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46627f9b-9655-4aa9-9f77-d387bbaddb80-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 20:59:44 crc kubenswrapper[4669]: I1008 20:59:44.867720 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-shjxs"] Oct 08 20:59:44 crc kubenswrapper[4669]: I1008 20:59:44.904169 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-zvpft"] Oct 08 20:59:44 crc kubenswrapper[4669]: I1008 20:59:44.919360 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-zvpft"] Oct 08 20:59:44 crc kubenswrapper[4669]: I1008 20:59:44.929829 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Oct 08 20:59:44 crc kubenswrapper[4669]: E1008 20:59:44.930243 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46627f9b-9655-4aa9-9f77-d387bbaddb80" containerName="dnsmasq-dns" Oct 08 20:59:44 crc kubenswrapper[4669]: I1008 20:59:44.930264 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="46627f9b-9655-4aa9-9f77-d387bbaddb80" containerName="dnsmasq-dns" Oct 08 20:59:44 crc kubenswrapper[4669]: E1008 20:59:44.930280 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46627f9b-9655-4aa9-9f77-d387bbaddb80" 
containerName="init" Oct 08 20:59:44 crc kubenswrapper[4669]: I1008 20:59:44.930290 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="46627f9b-9655-4aa9-9f77-d387bbaddb80" containerName="init" Oct 08 20:59:44 crc kubenswrapper[4669]: I1008 20:59:44.930540 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="46627f9b-9655-4aa9-9f77-d387bbaddb80" containerName="dnsmasq-dns" Oct 08 20:59:44 crc kubenswrapper[4669]: I1008 20:59:44.947128 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 08 20:59:44 crc kubenswrapper[4669]: I1008 20:59:44.947312 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 08 20:59:44 crc kubenswrapper[4669]: I1008 20:59:44.949057 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Oct 08 20:59:44 crc kubenswrapper[4669]: I1008 20:59:44.951126 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Oct 08 20:59:44 crc kubenswrapper[4669]: I1008 20:59:44.951182 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 08 20:59:44 crc kubenswrapper[4669]: I1008 20:59:44.953166 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-xvh6x" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.024261 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"efef408f-7f0a-4eb1-a9f8-288a9606bf84\") " pod="openstack/swift-storage-0" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.024312 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/efef408f-7f0a-4eb1-a9f8-288a9606bf84-etc-swift\") pod \"swift-storage-0\" (UID: \"efef408f-7f0a-4eb1-a9f8-288a9606bf84\") " pod="openstack/swift-storage-0" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.024417 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/efef408f-7f0a-4eb1-a9f8-288a9606bf84-cache\") pod \"swift-storage-0\" (UID: \"efef408f-7f0a-4eb1-a9f8-288a9606bf84\") " pod="openstack/swift-storage-0" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.025285 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xv757\" (UniqueName: \"kubernetes.io/projected/efef408f-7f0a-4eb1-a9f8-288a9606bf84-kube-api-access-xv757\") pod \"swift-storage-0\" (UID: \"efef408f-7f0a-4eb1-a9f8-288a9606bf84\") " pod="openstack/swift-storage-0" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.025326 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/efef408f-7f0a-4eb1-a9f8-288a9606bf84-lock\") pod \"swift-storage-0\" (UID: \"efef408f-7f0a-4eb1-a9f8-288a9606bf84\") " pod="openstack/swift-storage-0" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.127850 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"efef408f-7f0a-4eb1-a9f8-288a9606bf84\") " pod="openstack/swift-storage-0" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.128181 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/efef408f-7f0a-4eb1-a9f8-288a9606bf84-etc-swift\") pod \"swift-storage-0\" (UID: \"efef408f-7f0a-4eb1-a9f8-288a9606bf84\") " pod="openstack/swift-storage-0" Oct 
08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.128265 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/efef408f-7f0a-4eb1-a9f8-288a9606bf84-cache\") pod \"swift-storage-0\" (UID: \"efef408f-7f0a-4eb1-a9f8-288a9606bf84\") " pod="openstack/swift-storage-0" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.128300 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xv757\" (UniqueName: \"kubernetes.io/projected/efef408f-7f0a-4eb1-a9f8-288a9606bf84-kube-api-access-xv757\") pod \"swift-storage-0\" (UID: \"efef408f-7f0a-4eb1-a9f8-288a9606bf84\") " pod="openstack/swift-storage-0" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.128322 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/efef408f-7f0a-4eb1-a9f8-288a9606bf84-lock\") pod \"swift-storage-0\" (UID: \"efef408f-7f0a-4eb1-a9f8-288a9606bf84\") " pod="openstack/swift-storage-0" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.128814 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/efef408f-7f0a-4eb1-a9f8-288a9606bf84-lock\") pod \"swift-storage-0\" (UID: \"efef408f-7f0a-4eb1-a9f8-288a9606bf84\") " pod="openstack/swift-storage-0" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.129071 4669 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"efef408f-7f0a-4eb1-a9f8-288a9606bf84\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/swift-storage-0" Oct 08 20:59:45 crc kubenswrapper[4669]: E1008 20:59:45.130872 4669 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 08 20:59:45 crc kubenswrapper[4669]: E1008 
20:59:45.130909 4669 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.130947 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/efef408f-7f0a-4eb1-a9f8-288a9606bf84-cache\") pod \"swift-storage-0\" (UID: \"efef408f-7f0a-4eb1-a9f8-288a9606bf84\") " pod="openstack/swift-storage-0" Oct 08 20:59:45 crc kubenswrapper[4669]: E1008 20:59:45.130972 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/efef408f-7f0a-4eb1-a9f8-288a9606bf84-etc-swift podName:efef408f-7f0a-4eb1-a9f8-288a9606bf84 nodeName:}" failed. No retries permitted until 2025-10-08 20:59:45.630951224 +0000 UTC m=+905.323761897 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/efef408f-7f0a-4eb1-a9f8-288a9606bf84-etc-swift") pod "swift-storage-0" (UID: "efef408f-7f0a-4eb1-a9f8-288a9606bf84") : configmap "swift-ring-files" not found Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.151849 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xv757\" (UniqueName: \"kubernetes.io/projected/efef408f-7f0a-4eb1-a9f8-288a9606bf84-kube-api-access-xv757\") pod \"swift-storage-0\" (UID: \"efef408f-7f0a-4eb1-a9f8-288a9606bf84\") " pod="openstack/swift-storage-0" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.157509 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"efef408f-7f0a-4eb1-a9f8-288a9606bf84\") " pod="openstack/swift-storage-0" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.343359 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="46627f9b-9655-4aa9-9f77-d387bbaddb80" path="/var/lib/kubelet/pods/46627f9b-9655-4aa9-9f77-d387bbaddb80/volumes" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.435359 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-n8ggq"] Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.437379 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-n8ggq" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.440749 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.440787 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.440917 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.444819 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-n8ggq"] Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.480316 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-n8ggq"] Oct 08 20:59:45 crc kubenswrapper[4669]: E1008 20:59:45.480844 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-dzpns ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-dzpns ring-data-devices scripts swiftconf]: context canceled" pod="openstack/swift-ring-rebalance-n8ggq" podUID="d453643a-7a41-4a4b-9368-3f2a49cc6b91" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.485062 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-58zf2"] Oct 08 20:59:45 crc kubenswrapper[4669]: 
I1008 20:59:45.485979 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-58zf2" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.504296 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-58zf2"] Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.536908 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d453643a-7a41-4a4b-9368-3f2a49cc6b91-ring-data-devices\") pod \"swift-ring-rebalance-n8ggq\" (UID: \"d453643a-7a41-4a4b-9368-3f2a49cc6b91\") " pod="openstack/swift-ring-rebalance-n8ggq" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.536977 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f1096bd8-53d4-4403-9abc-dd7a2c91c1e6-dispersionconf\") pod \"swift-ring-rebalance-58zf2\" (UID: \"f1096bd8-53d4-4403-9abc-dd7a2c91c1e6\") " pod="openstack/swift-ring-rebalance-58zf2" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.537040 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1096bd8-53d4-4403-9abc-dd7a2c91c1e6-combined-ca-bundle\") pod \"swift-ring-rebalance-58zf2\" (UID: \"f1096bd8-53d4-4403-9abc-dd7a2c91c1e6\") " pod="openstack/swift-ring-rebalance-58zf2" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.537069 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flg27\" (UniqueName: \"kubernetes.io/projected/f1096bd8-53d4-4403-9abc-dd7a2c91c1e6-kube-api-access-flg27\") pod \"swift-ring-rebalance-58zf2\" (UID: \"f1096bd8-53d4-4403-9abc-dd7a2c91c1e6\") " pod="openstack/swift-ring-rebalance-58zf2" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 
20:59:45.537093 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f1096bd8-53d4-4403-9abc-dd7a2c91c1e6-swiftconf\") pod \"swift-ring-rebalance-58zf2\" (UID: \"f1096bd8-53d4-4403-9abc-dd7a2c91c1e6\") " pod="openstack/swift-ring-rebalance-58zf2" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.537220 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d453643a-7a41-4a4b-9368-3f2a49cc6b91-scripts\") pod \"swift-ring-rebalance-n8ggq\" (UID: \"d453643a-7a41-4a4b-9368-3f2a49cc6b91\") " pod="openstack/swift-ring-rebalance-n8ggq" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.537331 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f1096bd8-53d4-4403-9abc-dd7a2c91c1e6-etc-swift\") pod \"swift-ring-rebalance-58zf2\" (UID: \"f1096bd8-53d4-4403-9abc-dd7a2c91c1e6\") " pod="openstack/swift-ring-rebalance-58zf2" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.537368 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d453643a-7a41-4a4b-9368-3f2a49cc6b91-dispersionconf\") pod \"swift-ring-rebalance-n8ggq\" (UID: \"d453643a-7a41-4a4b-9368-3f2a49cc6b91\") " pod="openstack/swift-ring-rebalance-n8ggq" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.537392 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f1096bd8-53d4-4403-9abc-dd7a2c91c1e6-ring-data-devices\") pod \"swift-ring-rebalance-58zf2\" (UID: \"f1096bd8-53d4-4403-9abc-dd7a2c91c1e6\") " pod="openstack/swift-ring-rebalance-58zf2" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 
20:59:45.537575 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d453643a-7a41-4a4b-9368-3f2a49cc6b91-swiftconf\") pod \"swift-ring-rebalance-n8ggq\" (UID: \"d453643a-7a41-4a4b-9368-3f2a49cc6b91\") " pod="openstack/swift-ring-rebalance-n8ggq" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.537613 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f1096bd8-53d4-4403-9abc-dd7a2c91c1e6-scripts\") pod \"swift-ring-rebalance-58zf2\" (UID: \"f1096bd8-53d4-4403-9abc-dd7a2c91c1e6\") " pod="openstack/swift-ring-rebalance-58zf2" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.537662 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzpns\" (UniqueName: \"kubernetes.io/projected/d453643a-7a41-4a4b-9368-3f2a49cc6b91-kube-api-access-dzpns\") pod \"swift-ring-rebalance-n8ggq\" (UID: \"d453643a-7a41-4a4b-9368-3f2a49cc6b91\") " pod="openstack/swift-ring-rebalance-n8ggq" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.537735 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d453643a-7a41-4a4b-9368-3f2a49cc6b91-combined-ca-bundle\") pod \"swift-ring-rebalance-n8ggq\" (UID: \"d453643a-7a41-4a4b-9368-3f2a49cc6b91\") " pod="openstack/swift-ring-rebalance-n8ggq" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.537799 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d453643a-7a41-4a4b-9368-3f2a49cc6b91-etc-swift\") pod \"swift-ring-rebalance-n8ggq\" (UID: \"d453643a-7a41-4a4b-9368-3f2a49cc6b91\") " pod="openstack/swift-ring-rebalance-n8ggq" Oct 08 20:59:45 crc kubenswrapper[4669]: 
I1008 20:59:45.569511 4669 generic.go:334] "Generic (PLEG): container finished" podID="c11ff661-d7fa-45fa-bec1-555472ca36e7" containerID="e74fe67e0ca563639016760f372e79764f8fc135c9f412652bee412e1d2bed0d" exitCode=0 Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.569580 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-shjxs" event={"ID":"c11ff661-d7fa-45fa-bec1-555472ca36e7","Type":"ContainerDied","Data":"e74fe67e0ca563639016760f372e79764f8fc135c9f412652bee412e1d2bed0d"} Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.569634 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-shjxs" event={"ID":"c11ff661-d7fa-45fa-bec1-555472ca36e7","Type":"ContainerStarted","Data":"31c1aa8271f780af9d7c9f8c2e675119feefb6c20df453a2818d64aa59b45a91"} Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.573564 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"685f13e1-1e56-46fb-b0b4-d850050411d7","Type":"ContainerStarted","Data":"043e8fd699d83cbb34f2ae862d0b6879cdb9062ff627ab05cf87e6c0e895348a"} Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.573849 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"685f13e1-1e56-46fb-b0b4-d850050411d7","Type":"ContainerStarted","Data":"aa2b24911c97706e69f3c5bf9acbc489091b6b17e9ffa20d10ba35336c72ec4d"} Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.573998 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.575293 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-n8ggq" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.582713 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-n8ggq" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.610771 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.603695388 podStartE2EDuration="3.610749225s" podCreationTimestamp="2025-10-08 20:59:42 +0000 UTC" firstStartedPulling="2025-10-08 20:59:42.857812123 +0000 UTC m=+902.550622806" lastFinishedPulling="2025-10-08 20:59:44.86486596 +0000 UTC m=+904.557676643" observedRunningTime="2025-10-08 20:59:45.604980496 +0000 UTC m=+905.297791169" watchObservedRunningTime="2025-10-08 20:59:45.610749225 +0000 UTC m=+905.303559928" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.640504 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f1096bd8-53d4-4403-9abc-dd7a2c91c1e6-etc-swift\") pod \"swift-ring-rebalance-58zf2\" (UID: \"f1096bd8-53d4-4403-9abc-dd7a2c91c1e6\") " pod="openstack/swift-ring-rebalance-58zf2" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.640588 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d453643a-7a41-4a4b-9368-3f2a49cc6b91-dispersionconf\") pod \"swift-ring-rebalance-n8ggq\" (UID: \"d453643a-7a41-4a4b-9368-3f2a49cc6b91\") " pod="openstack/swift-ring-rebalance-n8ggq" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.640617 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f1096bd8-53d4-4403-9abc-dd7a2c91c1e6-ring-data-devices\") pod \"swift-ring-rebalance-58zf2\" (UID: \"f1096bd8-53d4-4403-9abc-dd7a2c91c1e6\") " pod="openstack/swift-ring-rebalance-58zf2" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.640688 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/d453643a-7a41-4a4b-9368-3f2a49cc6b91-swiftconf\") pod \"swift-ring-rebalance-n8ggq\" (UID: \"d453643a-7a41-4a4b-9368-3f2a49cc6b91\") " pod="openstack/swift-ring-rebalance-n8ggq" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.640713 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f1096bd8-53d4-4403-9abc-dd7a2c91c1e6-scripts\") pod \"swift-ring-rebalance-58zf2\" (UID: \"f1096bd8-53d4-4403-9abc-dd7a2c91c1e6\") " pod="openstack/swift-ring-rebalance-58zf2" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.640749 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzpns\" (UniqueName: \"kubernetes.io/projected/d453643a-7a41-4a4b-9368-3f2a49cc6b91-kube-api-access-dzpns\") pod \"swift-ring-rebalance-n8ggq\" (UID: \"d453643a-7a41-4a4b-9368-3f2a49cc6b91\") " pod="openstack/swift-ring-rebalance-n8ggq" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.640791 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d453643a-7a41-4a4b-9368-3f2a49cc6b91-combined-ca-bundle\") pod \"swift-ring-rebalance-n8ggq\" (UID: \"d453643a-7a41-4a4b-9368-3f2a49cc6b91\") " pod="openstack/swift-ring-rebalance-n8ggq" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.640816 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d453643a-7a41-4a4b-9368-3f2a49cc6b91-etc-swift\") pod \"swift-ring-rebalance-n8ggq\" (UID: \"d453643a-7a41-4a4b-9368-3f2a49cc6b91\") " pod="openstack/swift-ring-rebalance-n8ggq" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.640855 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d453643a-7a41-4a4b-9368-3f2a49cc6b91-ring-data-devices\") 
pod \"swift-ring-rebalance-n8ggq\" (UID: \"d453643a-7a41-4a4b-9368-3f2a49cc6b91\") " pod="openstack/swift-ring-rebalance-n8ggq" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.640880 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f1096bd8-53d4-4403-9abc-dd7a2c91c1e6-dispersionconf\") pod \"swift-ring-rebalance-58zf2\" (UID: \"f1096bd8-53d4-4403-9abc-dd7a2c91c1e6\") " pod="openstack/swift-ring-rebalance-58zf2" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.640911 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1096bd8-53d4-4403-9abc-dd7a2c91c1e6-combined-ca-bundle\") pod \"swift-ring-rebalance-58zf2\" (UID: \"f1096bd8-53d4-4403-9abc-dd7a2c91c1e6\") " pod="openstack/swift-ring-rebalance-58zf2" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.640933 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flg27\" (UniqueName: \"kubernetes.io/projected/f1096bd8-53d4-4403-9abc-dd7a2c91c1e6-kube-api-access-flg27\") pod \"swift-ring-rebalance-58zf2\" (UID: \"f1096bd8-53d4-4403-9abc-dd7a2c91c1e6\") " pod="openstack/swift-ring-rebalance-58zf2" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.640949 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f1096bd8-53d4-4403-9abc-dd7a2c91c1e6-swiftconf\") pod \"swift-ring-rebalance-58zf2\" (UID: \"f1096bd8-53d4-4403-9abc-dd7a2c91c1e6\") " pod="openstack/swift-ring-rebalance-58zf2" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.640976 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/efef408f-7f0a-4eb1-a9f8-288a9606bf84-etc-swift\") pod \"swift-storage-0\" (UID: \"efef408f-7f0a-4eb1-a9f8-288a9606bf84\") " 
pod="openstack/swift-storage-0" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.641008 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d453643a-7a41-4a4b-9368-3f2a49cc6b91-scripts\") pod \"swift-ring-rebalance-n8ggq\" (UID: \"d453643a-7a41-4a4b-9368-3f2a49cc6b91\") " pod="openstack/swift-ring-rebalance-n8ggq" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.641098 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f1096bd8-53d4-4403-9abc-dd7a2c91c1e6-etc-swift\") pod \"swift-ring-rebalance-58zf2\" (UID: \"f1096bd8-53d4-4403-9abc-dd7a2c91c1e6\") " pod="openstack/swift-ring-rebalance-58zf2" Oct 08 20:59:45 crc kubenswrapper[4669]: E1008 20:59:45.641755 4669 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 08 20:59:45 crc kubenswrapper[4669]: E1008 20:59:45.641774 4669 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.641795 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d453643a-7a41-4a4b-9368-3f2a49cc6b91-etc-swift\") pod \"swift-ring-rebalance-n8ggq\" (UID: \"d453643a-7a41-4a4b-9368-3f2a49cc6b91\") " pod="openstack/swift-ring-rebalance-n8ggq" Oct 08 20:59:45 crc kubenswrapper[4669]: E1008 20:59:45.641821 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/efef408f-7f0a-4eb1-a9f8-288a9606bf84-etc-swift podName:efef408f-7f0a-4eb1-a9f8-288a9606bf84 nodeName:}" failed. No retries permitted until 2025-10-08 20:59:46.641803158 +0000 UTC m=+906.334613841 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/efef408f-7f0a-4eb1-a9f8-288a9606bf84-etc-swift") pod "swift-storage-0" (UID: "efef408f-7f0a-4eb1-a9f8-288a9606bf84") : configmap "swift-ring-files" not found Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.642188 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d453643a-7a41-4a4b-9368-3f2a49cc6b91-ring-data-devices\") pod \"swift-ring-rebalance-n8ggq\" (UID: \"d453643a-7a41-4a4b-9368-3f2a49cc6b91\") " pod="openstack/swift-ring-rebalance-n8ggq" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.643250 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f1096bd8-53d4-4403-9abc-dd7a2c91c1e6-scripts\") pod \"swift-ring-rebalance-58zf2\" (UID: \"f1096bd8-53d4-4403-9abc-dd7a2c91c1e6\") " pod="openstack/swift-ring-rebalance-58zf2" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.643641 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d453643a-7a41-4a4b-9368-3f2a49cc6b91-scripts\") pod \"swift-ring-rebalance-n8ggq\" (UID: \"d453643a-7a41-4a4b-9368-3f2a49cc6b91\") " pod="openstack/swift-ring-rebalance-n8ggq" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.645897 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f1096bd8-53d4-4403-9abc-dd7a2c91c1e6-ring-data-devices\") pod \"swift-ring-rebalance-58zf2\" (UID: \"f1096bd8-53d4-4403-9abc-dd7a2c91c1e6\") " pod="openstack/swift-ring-rebalance-58zf2" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.645900 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d453643a-7a41-4a4b-9368-3f2a49cc6b91-swiftconf\") pod 
\"swift-ring-rebalance-n8ggq\" (UID: \"d453643a-7a41-4a4b-9368-3f2a49cc6b91\") " pod="openstack/swift-ring-rebalance-n8ggq" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.646293 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f1096bd8-53d4-4403-9abc-dd7a2c91c1e6-dispersionconf\") pod \"swift-ring-rebalance-58zf2\" (UID: \"f1096bd8-53d4-4403-9abc-dd7a2c91c1e6\") " pod="openstack/swift-ring-rebalance-58zf2" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.646742 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d453643a-7a41-4a4b-9368-3f2a49cc6b91-combined-ca-bundle\") pod \"swift-ring-rebalance-n8ggq\" (UID: \"d453643a-7a41-4a4b-9368-3f2a49cc6b91\") " pod="openstack/swift-ring-rebalance-n8ggq" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.647512 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d453643a-7a41-4a4b-9368-3f2a49cc6b91-dispersionconf\") pod \"swift-ring-rebalance-n8ggq\" (UID: \"d453643a-7a41-4a4b-9368-3f2a49cc6b91\") " pod="openstack/swift-ring-rebalance-n8ggq" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.647784 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f1096bd8-53d4-4403-9abc-dd7a2c91c1e6-swiftconf\") pod \"swift-ring-rebalance-58zf2\" (UID: \"f1096bd8-53d4-4403-9abc-dd7a2c91c1e6\") " pod="openstack/swift-ring-rebalance-58zf2" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.649414 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1096bd8-53d4-4403-9abc-dd7a2c91c1e6-combined-ca-bundle\") pod \"swift-ring-rebalance-58zf2\" (UID: \"f1096bd8-53d4-4403-9abc-dd7a2c91c1e6\") " pod="openstack/swift-ring-rebalance-58zf2" 
Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.655751 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzpns\" (UniqueName: \"kubernetes.io/projected/d453643a-7a41-4a4b-9368-3f2a49cc6b91-kube-api-access-dzpns\") pod \"swift-ring-rebalance-n8ggq\" (UID: \"d453643a-7a41-4a4b-9368-3f2a49cc6b91\") " pod="openstack/swift-ring-rebalance-n8ggq" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.661228 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flg27\" (UniqueName: \"kubernetes.io/projected/f1096bd8-53d4-4403-9abc-dd7a2c91c1e6-kube-api-access-flg27\") pod \"swift-ring-rebalance-58zf2\" (UID: \"f1096bd8-53d4-4403-9abc-dd7a2c91c1e6\") " pod="openstack/swift-ring-rebalance-58zf2" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.741730 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d453643a-7a41-4a4b-9368-3f2a49cc6b91-swiftconf\") pod \"d453643a-7a41-4a4b-9368-3f2a49cc6b91\" (UID: \"d453643a-7a41-4a4b-9368-3f2a49cc6b91\") " Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.741827 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d453643a-7a41-4a4b-9368-3f2a49cc6b91-combined-ca-bundle\") pod \"d453643a-7a41-4a4b-9368-3f2a49cc6b91\" (UID: \"d453643a-7a41-4a4b-9368-3f2a49cc6b91\") " Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.741888 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d453643a-7a41-4a4b-9368-3f2a49cc6b91-dispersionconf\") pod \"d453643a-7a41-4a4b-9368-3f2a49cc6b91\" (UID: \"d453643a-7a41-4a4b-9368-3f2a49cc6b91\") " Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.741925 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" 
(UniqueName: \"kubernetes.io/empty-dir/d453643a-7a41-4a4b-9368-3f2a49cc6b91-etc-swift\") pod \"d453643a-7a41-4a4b-9368-3f2a49cc6b91\" (UID: \"d453643a-7a41-4a4b-9368-3f2a49cc6b91\") " Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.742011 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d453643a-7a41-4a4b-9368-3f2a49cc6b91-scripts\") pod \"d453643a-7a41-4a4b-9368-3f2a49cc6b91\" (UID: \"d453643a-7a41-4a4b-9368-3f2a49cc6b91\") " Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.742041 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d453643a-7a41-4a4b-9368-3f2a49cc6b91-ring-data-devices\") pod \"d453643a-7a41-4a4b-9368-3f2a49cc6b91\" (UID: \"d453643a-7a41-4a4b-9368-3f2a49cc6b91\") " Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.742062 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzpns\" (UniqueName: \"kubernetes.io/projected/d453643a-7a41-4a4b-9368-3f2a49cc6b91-kube-api-access-dzpns\") pod \"d453643a-7a41-4a4b-9368-3f2a49cc6b91\" (UID: \"d453643a-7a41-4a4b-9368-3f2a49cc6b91\") " Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.742524 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d453643a-7a41-4a4b-9368-3f2a49cc6b91-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "d453643a-7a41-4a4b-9368-3f2a49cc6b91" (UID: "d453643a-7a41-4a4b-9368-3f2a49cc6b91"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.743327 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d453643a-7a41-4a4b-9368-3f2a49cc6b91-scripts" (OuterVolumeSpecName: "scripts") pod "d453643a-7a41-4a4b-9368-3f2a49cc6b91" (UID: "d453643a-7a41-4a4b-9368-3f2a49cc6b91"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.743933 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d453643a-7a41-4a4b-9368-3f2a49cc6b91-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "d453643a-7a41-4a4b-9368-3f2a49cc6b91" (UID: "d453643a-7a41-4a4b-9368-3f2a49cc6b91"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.747202 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d453643a-7a41-4a4b-9368-3f2a49cc6b91-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "d453643a-7a41-4a4b-9368-3f2a49cc6b91" (UID: "d453643a-7a41-4a4b-9368-3f2a49cc6b91"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.748860 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d453643a-7a41-4a4b-9368-3f2a49cc6b91-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d453643a-7a41-4a4b-9368-3f2a49cc6b91" (UID: "d453643a-7a41-4a4b-9368-3f2a49cc6b91"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.749652 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d453643a-7a41-4a4b-9368-3f2a49cc6b91-kube-api-access-dzpns" (OuterVolumeSpecName: "kube-api-access-dzpns") pod "d453643a-7a41-4a4b-9368-3f2a49cc6b91" (UID: "d453643a-7a41-4a4b-9368-3f2a49cc6b91"). InnerVolumeSpecName "kube-api-access-dzpns". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.751068 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d453643a-7a41-4a4b-9368-3f2a49cc6b91-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "d453643a-7a41-4a4b-9368-3f2a49cc6b91" (UID: "d453643a-7a41-4a4b-9368-3f2a49cc6b91"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.799795 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-58zf2" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.843513 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d453643a-7a41-4a4b-9368-3f2a49cc6b91-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.843769 4669 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d453643a-7a41-4a4b-9368-3f2a49cc6b91-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.843861 4669 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d453643a-7a41-4a4b-9368-3f2a49cc6b91-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.843938 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d453643a-7a41-4a4b-9368-3f2a49cc6b91-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.844010 4669 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d453643a-7a41-4a4b-9368-3f2a49cc6b91-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.844087 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzpns\" (UniqueName: \"kubernetes.io/projected/d453643a-7a41-4a4b-9368-3f2a49cc6b91-kube-api-access-dzpns\") on node \"crc\" DevicePath \"\"" Oct 08 20:59:45 crc kubenswrapper[4669]: I1008 20:59:45.844164 4669 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d453643a-7a41-4a4b-9368-3f2a49cc6b91-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 08 20:59:46 crc kubenswrapper[4669]: I1008 20:59:46.300496 4669 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-58zf2"] Oct 08 20:59:46 crc kubenswrapper[4669]: W1008 20:59:46.308195 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1096bd8_53d4_4403_9abc_dd7a2c91c1e6.slice/crio-f71ac93b89ce6b7661b180ea0042ade0b4f5408bcb25ba58fe62215f8d62265b WatchSource:0}: Error finding container f71ac93b89ce6b7661b180ea0042ade0b4f5408bcb25ba58fe62215f8d62265b: Status 404 returned error can't find the container with id f71ac93b89ce6b7661b180ea0042ade0b4f5408bcb25ba58fe62215f8d62265b Oct 08 20:59:46 crc kubenswrapper[4669]: I1008 20:59:46.585864 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-58zf2" event={"ID":"f1096bd8-53d4-4403-9abc-dd7a2c91c1e6","Type":"ContainerStarted","Data":"f71ac93b89ce6b7661b180ea0042ade0b4f5408bcb25ba58fe62215f8d62265b"} Oct 08 20:59:46 crc kubenswrapper[4669]: I1008 20:59:46.587847 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1e47bce4-587b-4864-86ea-a1e2a7987779","Type":"ContainerStarted","Data":"ec32e3d6107b78f665d8249a8578cac9540f7164bd8fde5403055ff27163ecc0"} Oct 08 20:59:46 crc kubenswrapper[4669]: I1008 20:59:46.587998 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 08 20:59:46 crc kubenswrapper[4669]: I1008 20:59:46.590360 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-n8ggq" Oct 08 20:59:46 crc kubenswrapper[4669]: I1008 20:59:46.590370 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-shjxs" event={"ID":"c11ff661-d7fa-45fa-bec1-555472ca36e7","Type":"ContainerStarted","Data":"0f2271a3e9828669f523eca195b51ab01eb47be63e9d949f65007052c943e48c"} Oct 08 20:59:46 crc kubenswrapper[4669]: I1008 20:59:46.602759 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=6.9653643689999996 podStartE2EDuration="33.602736865s" podCreationTimestamp="2025-10-08 20:59:13 +0000 UTC" firstStartedPulling="2025-10-08 20:59:19.089598433 +0000 UTC m=+878.782409106" lastFinishedPulling="2025-10-08 20:59:45.726970929 +0000 UTC m=+905.419781602" observedRunningTime="2025-10-08 20:59:46.599672571 +0000 UTC m=+906.292483264" watchObservedRunningTime="2025-10-08 20:59:46.602736865 +0000 UTC m=+906.295547548" Oct 08 20:59:46 crc kubenswrapper[4669]: I1008 20:59:46.625116 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-shjxs" podStartSLOduration=3.62509384 podStartE2EDuration="3.62509384s" podCreationTimestamp="2025-10-08 20:59:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:59:46.623992559 +0000 UTC m=+906.316803232" watchObservedRunningTime="2025-10-08 20:59:46.62509384 +0000 UTC m=+906.317904543" Oct 08 20:59:46 crc kubenswrapper[4669]: I1008 20:59:46.660086 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/efef408f-7f0a-4eb1-a9f8-288a9606bf84-etc-swift\") pod \"swift-storage-0\" (UID: \"efef408f-7f0a-4eb1-a9f8-288a9606bf84\") " pod="openstack/swift-storage-0" Oct 08 20:59:46 crc kubenswrapper[4669]: I1008 20:59:46.663611 4669 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-n8ggq"] Oct 08 20:59:46 crc kubenswrapper[4669]: E1008 20:59:46.664071 4669 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 08 20:59:46 crc kubenswrapper[4669]: E1008 20:59:46.664095 4669 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 08 20:59:46 crc kubenswrapper[4669]: E1008 20:59:46.664140 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/efef408f-7f0a-4eb1-a9f8-288a9606bf84-etc-swift podName:efef408f-7f0a-4eb1-a9f8-288a9606bf84 nodeName:}" failed. No retries permitted until 2025-10-08 20:59:48.664122253 +0000 UTC m=+908.356932986 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/efef408f-7f0a-4eb1-a9f8-288a9606bf84-etc-swift") pod "swift-storage-0" (UID: "efef408f-7f0a-4eb1-a9f8-288a9606bf84") : configmap "swift-ring-files" not found Oct 08 20:59:46 crc kubenswrapper[4669]: I1008 20:59:46.669225 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-n8ggq"] Oct 08 20:59:47 crc kubenswrapper[4669]: I1008 20:59:47.203299 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 08 20:59:47 crc kubenswrapper[4669]: I1008 20:59:47.264246 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 08 20:59:47 crc kubenswrapper[4669]: I1008 20:59:47.346827 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d453643a-7a41-4a4b-9368-3f2a49cc6b91" path="/var/lib/kubelet/pods/d453643a-7a41-4a4b-9368-3f2a49cc6b91/volumes" Oct 08 20:59:47 crc kubenswrapper[4669]: I1008 20:59:47.598923 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-b8fbc5445-shjxs" Oct 08 20:59:48 crc kubenswrapper[4669]: I1008 20:59:48.692420 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/efef408f-7f0a-4eb1-a9f8-288a9606bf84-etc-swift\") pod \"swift-storage-0\" (UID: \"efef408f-7f0a-4eb1-a9f8-288a9606bf84\") " pod="openstack/swift-storage-0" Oct 08 20:59:48 crc kubenswrapper[4669]: E1008 20:59:48.693386 4669 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 08 20:59:48 crc kubenswrapper[4669]: E1008 20:59:48.693417 4669 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 08 20:59:48 crc kubenswrapper[4669]: E1008 20:59:48.693490 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/efef408f-7f0a-4eb1-a9f8-288a9606bf84-etc-swift podName:efef408f-7f0a-4eb1-a9f8-288a9606bf84 nodeName:}" failed. No retries permitted until 2025-10-08 20:59:52.693465141 +0000 UTC m=+912.386275854 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/efef408f-7f0a-4eb1-a9f8-288a9606bf84-etc-swift") pod "swift-storage-0" (UID: "efef408f-7f0a-4eb1-a9f8-288a9606bf84") : configmap "swift-ring-files" not found Oct 08 20:59:49 crc kubenswrapper[4669]: I1008 20:59:49.542936 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 08 20:59:49 crc kubenswrapper[4669]: I1008 20:59:49.612178 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 08 20:59:50 crc kubenswrapper[4669]: I1008 20:59:50.627341 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-58zf2" event={"ID":"f1096bd8-53d4-4403-9abc-dd7a2c91c1e6","Type":"ContainerStarted","Data":"a11bd2552c63eb4a1b9ae1ba5db660d2bb6118ca4d3d4f43c62e72d80d78881e"} Oct 08 20:59:50 crc kubenswrapper[4669]: I1008 20:59:50.648303 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-58zf2" podStartSLOduration=2.518165402 podStartE2EDuration="5.648281859s" podCreationTimestamp="2025-10-08 20:59:45 +0000 UTC" firstStartedPulling="2025-10-08 20:59:46.312342492 +0000 UTC m=+906.005153195" lastFinishedPulling="2025-10-08 20:59:49.442458959 +0000 UTC m=+909.135269652" observedRunningTime="2025-10-08 20:59:50.643522337 +0000 UTC m=+910.336333010" watchObservedRunningTime="2025-10-08 20:59:50.648281859 +0000 UTC m=+910.341092532" Oct 08 20:59:51 crc kubenswrapper[4669]: I1008 20:59:51.667870 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-t2lt9"] Oct 08 20:59:51 crc kubenswrapper[4669]: I1008 20:59:51.669111 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-t2lt9" Oct 08 20:59:51 crc kubenswrapper[4669]: I1008 20:59:51.708367 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-t2lt9"] Oct 08 20:59:51 crc kubenswrapper[4669]: I1008 20:59:51.849745 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9tsn\" (UniqueName: \"kubernetes.io/projected/39e5e134-19bb-471d-b4d2-c3344d70fdd1-kube-api-access-n9tsn\") pod \"keystone-db-create-t2lt9\" (UID: \"39e5e134-19bb-471d-b4d2-c3344d70fdd1\") " pod="openstack/keystone-db-create-t2lt9" Oct 08 20:59:51 crc kubenswrapper[4669]: I1008 20:59:51.876117 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-wl4fg"] Oct 08 20:59:51 crc kubenswrapper[4669]: I1008 20:59:51.877902 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-wl4fg" Oct 08 20:59:51 crc kubenswrapper[4669]: I1008 20:59:51.884317 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-wl4fg"] Oct 08 20:59:51 crc kubenswrapper[4669]: I1008 20:59:51.951999 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9tsn\" (UniqueName: \"kubernetes.io/projected/39e5e134-19bb-471d-b4d2-c3344d70fdd1-kube-api-access-n9tsn\") pod \"keystone-db-create-t2lt9\" (UID: \"39e5e134-19bb-471d-b4d2-c3344d70fdd1\") " pod="openstack/keystone-db-create-t2lt9" Oct 08 20:59:51 crc kubenswrapper[4669]: I1008 20:59:51.971108 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9tsn\" (UniqueName: \"kubernetes.io/projected/39e5e134-19bb-471d-b4d2-c3344d70fdd1-kube-api-access-n9tsn\") pod \"keystone-db-create-t2lt9\" (UID: \"39e5e134-19bb-471d-b4d2-c3344d70fdd1\") " pod="openstack/keystone-db-create-t2lt9" Oct 08 20:59:52 crc kubenswrapper[4669]: I1008 20:59:52.041282 4669 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-t2lt9" Oct 08 20:59:52 crc kubenswrapper[4669]: I1008 20:59:52.063314 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49xk5\" (UniqueName: \"kubernetes.io/projected/0a191427-574e-4eb1-bb13-c6e494b4ca5b-kube-api-access-49xk5\") pod \"placement-db-create-wl4fg\" (UID: \"0a191427-574e-4eb1-bb13-c6e494b4ca5b\") " pod="openstack/placement-db-create-wl4fg" Oct 08 20:59:52 crc kubenswrapper[4669]: I1008 20:59:52.139417 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-zgpxd"] Oct 08 20:59:52 crc kubenswrapper[4669]: I1008 20:59:52.141134 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-zgpxd" Oct 08 20:59:52 crc kubenswrapper[4669]: I1008 20:59:52.150145 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-zgpxd"] Oct 08 20:59:52 crc kubenswrapper[4669]: I1008 20:59:52.165252 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49xk5\" (UniqueName: \"kubernetes.io/projected/0a191427-574e-4eb1-bb13-c6e494b4ca5b-kube-api-access-49xk5\") pod \"placement-db-create-wl4fg\" (UID: \"0a191427-574e-4eb1-bb13-c6e494b4ca5b\") " pod="openstack/placement-db-create-wl4fg" Oct 08 20:59:52 crc kubenswrapper[4669]: I1008 20:59:52.182748 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49xk5\" (UniqueName: \"kubernetes.io/projected/0a191427-574e-4eb1-bb13-c6e494b4ca5b-kube-api-access-49xk5\") pod \"placement-db-create-wl4fg\" (UID: \"0a191427-574e-4eb1-bb13-c6e494b4ca5b\") " pod="openstack/placement-db-create-wl4fg" Oct 08 20:59:52 crc kubenswrapper[4669]: I1008 20:59:52.193379 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-wl4fg" Oct 08 20:59:52 crc kubenswrapper[4669]: I1008 20:59:52.266770 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44qvz\" (UniqueName: \"kubernetes.io/projected/ed80c2fd-2488-49d3-a30a-137a44370e04-kube-api-access-44qvz\") pod \"glance-db-create-zgpxd\" (UID: \"ed80c2fd-2488-49d3-a30a-137a44370e04\") " pod="openstack/glance-db-create-zgpxd" Oct 08 20:59:52 crc kubenswrapper[4669]: I1008 20:59:52.368945 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44qvz\" (UniqueName: \"kubernetes.io/projected/ed80c2fd-2488-49d3-a30a-137a44370e04-kube-api-access-44qvz\") pod \"glance-db-create-zgpxd\" (UID: \"ed80c2fd-2488-49d3-a30a-137a44370e04\") " pod="openstack/glance-db-create-zgpxd" Oct 08 20:59:52 crc kubenswrapper[4669]: I1008 20:59:52.388494 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44qvz\" (UniqueName: \"kubernetes.io/projected/ed80c2fd-2488-49d3-a30a-137a44370e04-kube-api-access-44qvz\") pod \"glance-db-create-zgpxd\" (UID: \"ed80c2fd-2488-49d3-a30a-137a44370e04\") " pod="openstack/glance-db-create-zgpxd" Oct 08 20:59:52 crc kubenswrapper[4669]: I1008 20:59:52.462710 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-zgpxd" Oct 08 20:59:52 crc kubenswrapper[4669]: I1008 20:59:52.569764 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-t2lt9"] Oct 08 20:59:52 crc kubenswrapper[4669]: W1008 20:59:52.575986 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39e5e134_19bb_471d_b4d2_c3344d70fdd1.slice/crio-7b798196af340ab7de1096db9a9224cf016151fdd77740d41b6cf4ba90e69328 WatchSource:0}: Error finding container 7b798196af340ab7de1096db9a9224cf016151fdd77740d41b6cf4ba90e69328: Status 404 returned error can't find the container with id 7b798196af340ab7de1096db9a9224cf016151fdd77740d41b6cf4ba90e69328 Oct 08 20:59:52 crc kubenswrapper[4669]: I1008 20:59:52.651956 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-t2lt9" event={"ID":"39e5e134-19bb-471d-b4d2-c3344d70fdd1","Type":"ContainerStarted","Data":"7b798196af340ab7de1096db9a9224cf016151fdd77740d41b6cf4ba90e69328"} Oct 08 20:59:52 crc kubenswrapper[4669]: I1008 20:59:52.694916 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-wl4fg"] Oct 08 20:59:52 crc kubenswrapper[4669]: W1008 20:59:52.695141 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a191427_574e_4eb1_bb13_c6e494b4ca5b.slice/crio-15263dae4e271302d3950ad67b67b5d4f88f184a422e342dd417d8bd80874eb0 WatchSource:0}: Error finding container 15263dae4e271302d3950ad67b67b5d4f88f184a422e342dd417d8bd80874eb0: Status 404 returned error can't find the container with id 15263dae4e271302d3950ad67b67b5d4f88f184a422e342dd417d8bd80874eb0 Oct 08 20:59:52 crc kubenswrapper[4669]: I1008 20:59:52.779299 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/efef408f-7f0a-4eb1-a9f8-288a9606bf84-etc-swift\") pod \"swift-storage-0\" (UID: \"efef408f-7f0a-4eb1-a9f8-288a9606bf84\") " pod="openstack/swift-storage-0" Oct 08 20:59:52 crc kubenswrapper[4669]: E1008 20:59:52.779500 4669 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 08 20:59:52 crc kubenswrapper[4669]: E1008 20:59:52.779545 4669 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 08 20:59:52 crc kubenswrapper[4669]: E1008 20:59:52.779603 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/efef408f-7f0a-4eb1-a9f8-288a9606bf84-etc-swift podName:efef408f-7f0a-4eb1-a9f8-288a9606bf84 nodeName:}" failed. No retries permitted until 2025-10-08 21:00:00.779583601 +0000 UTC m=+920.472394274 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/efef408f-7f0a-4eb1-a9f8-288a9606bf84-etc-swift") pod "swift-storage-0" (UID: "efef408f-7f0a-4eb1-a9f8-288a9606bf84") : configmap "swift-ring-files" not found Oct 08 20:59:52 crc kubenswrapper[4669]: W1008 20:59:52.920640 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded80c2fd_2488_49d3_a30a_137a44370e04.slice/crio-1bfde4ae909d15ef137c93de28d2f33a37c6e1a4c1c782fc24828d060b4329d9 WatchSource:0}: Error finding container 1bfde4ae909d15ef137c93de28d2f33a37c6e1a4c1c782fc24828d060b4329d9: Status 404 returned error can't find the container with id 1bfde4ae909d15ef137c93de28d2f33a37c6e1a4c1c782fc24828d060b4329d9 Oct 08 20:59:52 crc kubenswrapper[4669]: I1008 20:59:52.929677 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-zgpxd"] Oct 08 20:59:53 crc kubenswrapper[4669]: I1008 20:59:53.664171 4669 generic.go:334] "Generic (PLEG): 
container finished" podID="39e5e134-19bb-471d-b4d2-c3344d70fdd1" containerID="6320feee73a5c5a8642b1c4b37fe47aaa8a70d18b00db9af83d8fc321a039cc8" exitCode=0 Oct 08 20:59:53 crc kubenswrapper[4669]: I1008 20:59:53.664233 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-t2lt9" event={"ID":"39e5e134-19bb-471d-b4d2-c3344d70fdd1","Type":"ContainerDied","Data":"6320feee73a5c5a8642b1c4b37fe47aaa8a70d18b00db9af83d8fc321a039cc8"} Oct 08 20:59:53 crc kubenswrapper[4669]: I1008 20:59:53.667917 4669 generic.go:334] "Generic (PLEG): container finished" podID="ed80c2fd-2488-49d3-a30a-137a44370e04" containerID="6dd241c048d987bcfc3be789765a5bdfd8191a5c58c43fc41847a94e0f72a11c" exitCode=0 Oct 08 20:59:53 crc kubenswrapper[4669]: I1008 20:59:53.668037 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-zgpxd" event={"ID":"ed80c2fd-2488-49d3-a30a-137a44370e04","Type":"ContainerDied","Data":"6dd241c048d987bcfc3be789765a5bdfd8191a5c58c43fc41847a94e0f72a11c"} Oct 08 20:59:53 crc kubenswrapper[4669]: I1008 20:59:53.668071 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-zgpxd" event={"ID":"ed80c2fd-2488-49d3-a30a-137a44370e04","Type":"ContainerStarted","Data":"1bfde4ae909d15ef137c93de28d2f33a37c6e1a4c1c782fc24828d060b4329d9"} Oct 08 20:59:53 crc kubenswrapper[4669]: I1008 20:59:53.680093 4669 generic.go:334] "Generic (PLEG): container finished" podID="0a191427-574e-4eb1-bb13-c6e494b4ca5b" containerID="329440f2339893b6701e370e47f52b16dbbf6b0bccaa1c6b33e1e2c63745799c" exitCode=0 Oct 08 20:59:53 crc kubenswrapper[4669]: I1008 20:59:53.680135 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-wl4fg" event={"ID":"0a191427-574e-4eb1-bb13-c6e494b4ca5b","Type":"ContainerDied","Data":"329440f2339893b6701e370e47f52b16dbbf6b0bccaa1c6b33e1e2c63745799c"} Oct 08 20:59:53 crc kubenswrapper[4669]: I1008 20:59:53.680160 4669 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/placement-db-create-wl4fg" event={"ID":"0a191427-574e-4eb1-bb13-c6e494b4ca5b","Type":"ContainerStarted","Data":"15263dae4e271302d3950ad67b67b5d4f88f184a422e342dd417d8bd80874eb0"} Oct 08 20:59:53 crc kubenswrapper[4669]: I1008 20:59:53.730404 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 08 20:59:54 crc kubenswrapper[4669]: I1008 20:59:54.260917 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-shjxs" Oct 08 20:59:54 crc kubenswrapper[4669]: I1008 20:59:54.335647 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-5cgh5"] Oct 08 20:59:54 crc kubenswrapper[4669]: I1008 20:59:54.335834 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bc7876d45-5cgh5" podUID="2032cb15-6b45-4eca-8ba6-ae485a95a244" containerName="dnsmasq-dns" containerID="cri-o://7d8074bc0f9a97bc13e593a7e8b74a6b52399d7d4d7812866385710fa4922730" gracePeriod=10 Oct 08 20:59:54 crc kubenswrapper[4669]: I1008 20:59:54.703079 4669 generic.go:334] "Generic (PLEG): container finished" podID="2032cb15-6b45-4eca-8ba6-ae485a95a244" containerID="7d8074bc0f9a97bc13e593a7e8b74a6b52399d7d4d7812866385710fa4922730" exitCode=0 Oct 08 20:59:54 crc kubenswrapper[4669]: I1008 20:59:54.703176 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-5cgh5" event={"ID":"2032cb15-6b45-4eca-8ba6-ae485a95a244","Type":"ContainerDied","Data":"7d8074bc0f9a97bc13e593a7e8b74a6b52399d7d4d7812866385710fa4922730"} Oct 08 20:59:54 crc kubenswrapper[4669]: I1008 20:59:54.799979 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-5cgh5" Oct 08 20:59:54 crc kubenswrapper[4669]: I1008 20:59:54.920615 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2032cb15-6b45-4eca-8ba6-ae485a95a244-dns-svc\") pod \"2032cb15-6b45-4eca-8ba6-ae485a95a244\" (UID: \"2032cb15-6b45-4eca-8ba6-ae485a95a244\") " Oct 08 20:59:54 crc kubenswrapper[4669]: I1008 20:59:54.920667 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2032cb15-6b45-4eca-8ba6-ae485a95a244-config\") pod \"2032cb15-6b45-4eca-8ba6-ae485a95a244\" (UID: \"2032cb15-6b45-4eca-8ba6-ae485a95a244\") " Oct 08 20:59:54 crc kubenswrapper[4669]: I1008 20:59:54.920761 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2032cb15-6b45-4eca-8ba6-ae485a95a244-ovsdbserver-sb\") pod \"2032cb15-6b45-4eca-8ba6-ae485a95a244\" (UID: \"2032cb15-6b45-4eca-8ba6-ae485a95a244\") " Oct 08 20:59:54 crc kubenswrapper[4669]: I1008 20:59:54.920790 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmbkg\" (UniqueName: \"kubernetes.io/projected/2032cb15-6b45-4eca-8ba6-ae485a95a244-kube-api-access-nmbkg\") pod \"2032cb15-6b45-4eca-8ba6-ae485a95a244\" (UID: \"2032cb15-6b45-4eca-8ba6-ae485a95a244\") " Oct 08 20:59:54 crc kubenswrapper[4669]: I1008 20:59:54.926915 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2032cb15-6b45-4eca-8ba6-ae485a95a244-kube-api-access-nmbkg" (OuterVolumeSpecName: "kube-api-access-nmbkg") pod "2032cb15-6b45-4eca-8ba6-ae485a95a244" (UID: "2032cb15-6b45-4eca-8ba6-ae485a95a244"). InnerVolumeSpecName "kube-api-access-nmbkg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:59:54 crc kubenswrapper[4669]: I1008 20:59:54.962318 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2032cb15-6b45-4eca-8ba6-ae485a95a244-config" (OuterVolumeSpecName: "config") pod "2032cb15-6b45-4eca-8ba6-ae485a95a244" (UID: "2032cb15-6b45-4eca-8ba6-ae485a95a244"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:59:54 crc kubenswrapper[4669]: I1008 20:59:54.984786 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2032cb15-6b45-4eca-8ba6-ae485a95a244-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2032cb15-6b45-4eca-8ba6-ae485a95a244" (UID: "2032cb15-6b45-4eca-8ba6-ae485a95a244"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:59:54 crc kubenswrapper[4669]: I1008 20:59:54.985522 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2032cb15-6b45-4eca-8ba6-ae485a95a244-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2032cb15-6b45-4eca-8ba6-ae485a95a244" (UID: "2032cb15-6b45-4eca-8ba6-ae485a95a244"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:59:55 crc kubenswrapper[4669]: I1008 20:59:55.024686 4669 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2032cb15-6b45-4eca-8ba6-ae485a95a244-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 20:59:55 crc kubenswrapper[4669]: I1008 20:59:55.024732 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2032cb15-6b45-4eca-8ba6-ae485a95a244-config\") on node \"crc\" DevicePath \"\"" Oct 08 20:59:55 crc kubenswrapper[4669]: I1008 20:59:55.024742 4669 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2032cb15-6b45-4eca-8ba6-ae485a95a244-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 20:59:55 crc kubenswrapper[4669]: I1008 20:59:55.024757 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmbkg\" (UniqueName: \"kubernetes.io/projected/2032cb15-6b45-4eca-8ba6-ae485a95a244-kube-api-access-nmbkg\") on node \"crc\" DevicePath \"\"" Oct 08 20:59:55 crc kubenswrapper[4669]: I1008 20:59:55.040682 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-t2lt9" Oct 08 20:59:55 crc kubenswrapper[4669]: I1008 20:59:55.099702 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-wl4fg" Oct 08 20:59:55 crc kubenswrapper[4669]: I1008 20:59:55.108520 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-zgpxd" Oct 08 20:59:55 crc kubenswrapper[4669]: I1008 20:59:55.126193 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9tsn\" (UniqueName: \"kubernetes.io/projected/39e5e134-19bb-471d-b4d2-c3344d70fdd1-kube-api-access-n9tsn\") pod \"39e5e134-19bb-471d-b4d2-c3344d70fdd1\" (UID: \"39e5e134-19bb-471d-b4d2-c3344d70fdd1\") " Oct 08 20:59:55 crc kubenswrapper[4669]: I1008 20:59:55.130571 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39e5e134-19bb-471d-b4d2-c3344d70fdd1-kube-api-access-n9tsn" (OuterVolumeSpecName: "kube-api-access-n9tsn") pod "39e5e134-19bb-471d-b4d2-c3344d70fdd1" (UID: "39e5e134-19bb-471d-b4d2-c3344d70fdd1"). InnerVolumeSpecName "kube-api-access-n9tsn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:59:55 crc kubenswrapper[4669]: I1008 20:59:55.227418 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49xk5\" (UniqueName: \"kubernetes.io/projected/0a191427-574e-4eb1-bb13-c6e494b4ca5b-kube-api-access-49xk5\") pod \"0a191427-574e-4eb1-bb13-c6e494b4ca5b\" (UID: \"0a191427-574e-4eb1-bb13-c6e494b4ca5b\") " Oct 08 20:59:55 crc kubenswrapper[4669]: I1008 20:59:55.227483 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44qvz\" (UniqueName: \"kubernetes.io/projected/ed80c2fd-2488-49d3-a30a-137a44370e04-kube-api-access-44qvz\") pod \"ed80c2fd-2488-49d3-a30a-137a44370e04\" (UID: \"ed80c2fd-2488-49d3-a30a-137a44370e04\") " Oct 08 20:59:55 crc kubenswrapper[4669]: I1008 20:59:55.228237 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9tsn\" (UniqueName: \"kubernetes.io/projected/39e5e134-19bb-471d-b4d2-c3344d70fdd1-kube-api-access-n9tsn\") on node \"crc\" DevicePath \"\"" Oct 08 20:59:55 crc kubenswrapper[4669]: I1008 20:59:55.229850 4669 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed80c2fd-2488-49d3-a30a-137a44370e04-kube-api-access-44qvz" (OuterVolumeSpecName: "kube-api-access-44qvz") pod "ed80c2fd-2488-49d3-a30a-137a44370e04" (UID: "ed80c2fd-2488-49d3-a30a-137a44370e04"). InnerVolumeSpecName "kube-api-access-44qvz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:59:55 crc kubenswrapper[4669]: I1008 20:59:55.231244 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a191427-574e-4eb1-bb13-c6e494b4ca5b-kube-api-access-49xk5" (OuterVolumeSpecName: "kube-api-access-49xk5") pod "0a191427-574e-4eb1-bb13-c6e494b4ca5b" (UID: "0a191427-574e-4eb1-bb13-c6e494b4ca5b"). InnerVolumeSpecName "kube-api-access-49xk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:59:55 crc kubenswrapper[4669]: I1008 20:59:55.329941 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49xk5\" (UniqueName: \"kubernetes.io/projected/0a191427-574e-4eb1-bb13-c6e494b4ca5b-kube-api-access-49xk5\") on node \"crc\" DevicePath \"\"" Oct 08 20:59:55 crc kubenswrapper[4669]: I1008 20:59:55.329978 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44qvz\" (UniqueName: \"kubernetes.io/projected/ed80c2fd-2488-49d3-a30a-137a44370e04-kube-api-access-44qvz\") on node \"crc\" DevicePath \"\"" Oct 08 20:59:55 crc kubenswrapper[4669]: I1008 20:59:55.713619 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-zgpxd" event={"ID":"ed80c2fd-2488-49d3-a30a-137a44370e04","Type":"ContainerDied","Data":"1bfde4ae909d15ef137c93de28d2f33a37c6e1a4c1c782fc24828d060b4329d9"} Oct 08 20:59:55 crc kubenswrapper[4669]: I1008 20:59:55.713654 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-zgpxd" Oct 08 20:59:55 crc kubenswrapper[4669]: I1008 20:59:55.713662 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1bfde4ae909d15ef137c93de28d2f33a37c6e1a4c1c782fc24828d060b4329d9" Oct 08 20:59:55 crc kubenswrapper[4669]: I1008 20:59:55.716885 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-wl4fg" Oct 08 20:59:55 crc kubenswrapper[4669]: I1008 20:59:55.716923 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-wl4fg" event={"ID":"0a191427-574e-4eb1-bb13-c6e494b4ca5b","Type":"ContainerDied","Data":"15263dae4e271302d3950ad67b67b5d4f88f184a422e342dd417d8bd80874eb0"} Oct 08 20:59:55 crc kubenswrapper[4669]: I1008 20:59:55.716980 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15263dae4e271302d3950ad67b67b5d4f88f184a422e342dd417d8bd80874eb0" Oct 08 20:59:55 crc kubenswrapper[4669]: I1008 20:59:55.718975 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-5cgh5" event={"ID":"2032cb15-6b45-4eca-8ba6-ae485a95a244","Type":"ContainerDied","Data":"76a507a5084cf8a057fe259b78ebc89f648c87c64fafa22ebdef93beba4f21cc"} Oct 08 20:59:55 crc kubenswrapper[4669]: I1008 20:59:55.719006 4669 scope.go:117] "RemoveContainer" containerID="7d8074bc0f9a97bc13e593a7e8b74a6b52399d7d4d7812866385710fa4922730" Oct 08 20:59:55 crc kubenswrapper[4669]: I1008 20:59:55.719337 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-5cgh5" Oct 08 20:59:55 crc kubenswrapper[4669]: I1008 20:59:55.720770 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-t2lt9" event={"ID":"39e5e134-19bb-471d-b4d2-c3344d70fdd1","Type":"ContainerDied","Data":"7b798196af340ab7de1096db9a9224cf016151fdd77740d41b6cf4ba90e69328"} Oct 08 20:59:55 crc kubenswrapper[4669]: I1008 20:59:55.720798 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b798196af340ab7de1096db9a9224cf016151fdd77740d41b6cf4ba90e69328" Oct 08 20:59:55 crc kubenswrapper[4669]: I1008 20:59:55.720801 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-t2lt9" Oct 08 20:59:55 crc kubenswrapper[4669]: I1008 20:59:55.742588 4669 scope.go:117] "RemoveContainer" containerID="746cffce80af09098020721cf59b702be337803c96c40fd6019c7160356728d1" Oct 08 20:59:55 crc kubenswrapper[4669]: I1008 20:59:55.762431 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-5cgh5"] Oct 08 20:59:55 crc kubenswrapper[4669]: I1008 20:59:55.772070 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-5cgh5"] Oct 08 20:59:56 crc kubenswrapper[4669]: I1008 20:59:56.732347 4669 generic.go:334] "Generic (PLEG): container finished" podID="f1096bd8-53d4-4403-9abc-dd7a2c91c1e6" containerID="a11bd2552c63eb4a1b9ae1ba5db660d2bb6118ca4d3d4f43c62e72d80d78881e" exitCode=0 Oct 08 20:59:56 crc kubenswrapper[4669]: I1008 20:59:56.732423 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-58zf2" event={"ID":"f1096bd8-53d4-4403-9abc-dd7a2c91c1e6","Type":"ContainerDied","Data":"a11bd2552c63eb4a1b9ae1ba5db660d2bb6118ca4d3d4f43c62e72d80d78881e"} Oct 08 20:59:57 crc kubenswrapper[4669]: I1008 20:59:57.342364 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="2032cb15-6b45-4eca-8ba6-ae485a95a244" path="/var/lib/kubelet/pods/2032cb15-6b45-4eca-8ba6-ae485a95a244/volumes" Oct 08 20:59:57 crc kubenswrapper[4669]: I1008 20:59:57.432397 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 08 20:59:58 crc kubenswrapper[4669]: I1008 20:59:58.100714 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-58zf2" Oct 08 20:59:58 crc kubenswrapper[4669]: I1008 20:59:58.180627 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1096bd8-53d4-4403-9abc-dd7a2c91c1e6-combined-ca-bundle\") pod \"f1096bd8-53d4-4403-9abc-dd7a2c91c1e6\" (UID: \"f1096bd8-53d4-4403-9abc-dd7a2c91c1e6\") " Oct 08 20:59:58 crc kubenswrapper[4669]: I1008 20:59:58.180724 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f1096bd8-53d4-4403-9abc-dd7a2c91c1e6-swiftconf\") pod \"f1096bd8-53d4-4403-9abc-dd7a2c91c1e6\" (UID: \"f1096bd8-53d4-4403-9abc-dd7a2c91c1e6\") " Oct 08 20:59:58 crc kubenswrapper[4669]: I1008 20:59:58.180779 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f1096bd8-53d4-4403-9abc-dd7a2c91c1e6-dispersionconf\") pod \"f1096bd8-53d4-4403-9abc-dd7a2c91c1e6\" (UID: \"f1096bd8-53d4-4403-9abc-dd7a2c91c1e6\") " Oct 08 20:59:58 crc kubenswrapper[4669]: I1008 20:59:58.180804 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f1096bd8-53d4-4403-9abc-dd7a2c91c1e6-etc-swift\") pod \"f1096bd8-53d4-4403-9abc-dd7a2c91c1e6\" (UID: \"f1096bd8-53d4-4403-9abc-dd7a2c91c1e6\") " Oct 08 20:59:58 crc kubenswrapper[4669]: I1008 20:59:58.180892 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-flg27\" (UniqueName: \"kubernetes.io/projected/f1096bd8-53d4-4403-9abc-dd7a2c91c1e6-kube-api-access-flg27\") pod \"f1096bd8-53d4-4403-9abc-dd7a2c91c1e6\" (UID: \"f1096bd8-53d4-4403-9abc-dd7a2c91c1e6\") " Oct 08 20:59:58 crc kubenswrapper[4669]: I1008 20:59:58.180956 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f1096bd8-53d4-4403-9abc-dd7a2c91c1e6-ring-data-devices\") pod \"f1096bd8-53d4-4403-9abc-dd7a2c91c1e6\" (UID: \"f1096bd8-53d4-4403-9abc-dd7a2c91c1e6\") " Oct 08 20:59:58 crc kubenswrapper[4669]: I1008 20:59:58.180988 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f1096bd8-53d4-4403-9abc-dd7a2c91c1e6-scripts\") pod \"f1096bd8-53d4-4403-9abc-dd7a2c91c1e6\" (UID: \"f1096bd8-53d4-4403-9abc-dd7a2c91c1e6\") " Oct 08 20:59:58 crc kubenswrapper[4669]: I1008 20:59:58.181771 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1096bd8-53d4-4403-9abc-dd7a2c91c1e6-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "f1096bd8-53d4-4403-9abc-dd7a2c91c1e6" (UID: "f1096bd8-53d4-4403-9abc-dd7a2c91c1e6"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:59:58 crc kubenswrapper[4669]: I1008 20:59:58.182051 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1096bd8-53d4-4403-9abc-dd7a2c91c1e6-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f1096bd8-53d4-4403-9abc-dd7a2c91c1e6" (UID: "f1096bd8-53d4-4403-9abc-dd7a2c91c1e6"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 20:59:58 crc kubenswrapper[4669]: I1008 20:59:58.188694 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1096bd8-53d4-4403-9abc-dd7a2c91c1e6-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "f1096bd8-53d4-4403-9abc-dd7a2c91c1e6" (UID: "f1096bd8-53d4-4403-9abc-dd7a2c91c1e6"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:59:58 crc kubenswrapper[4669]: I1008 20:59:58.188715 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1096bd8-53d4-4403-9abc-dd7a2c91c1e6-kube-api-access-flg27" (OuterVolumeSpecName: "kube-api-access-flg27") pod "f1096bd8-53d4-4403-9abc-dd7a2c91c1e6" (UID: "f1096bd8-53d4-4403-9abc-dd7a2c91c1e6"). InnerVolumeSpecName "kube-api-access-flg27". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:59:58 crc kubenswrapper[4669]: I1008 20:59:58.201626 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1096bd8-53d4-4403-9abc-dd7a2c91c1e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f1096bd8-53d4-4403-9abc-dd7a2c91c1e6" (UID: "f1096bd8-53d4-4403-9abc-dd7a2c91c1e6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:59:58 crc kubenswrapper[4669]: I1008 20:59:58.203293 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1096bd8-53d4-4403-9abc-dd7a2c91c1e6-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "f1096bd8-53d4-4403-9abc-dd7a2c91c1e6" (UID: "f1096bd8-53d4-4403-9abc-dd7a2c91c1e6"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:59:58 crc kubenswrapper[4669]: I1008 20:59:58.218591 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1096bd8-53d4-4403-9abc-dd7a2c91c1e6-scripts" (OuterVolumeSpecName: "scripts") pod "f1096bd8-53d4-4403-9abc-dd7a2c91c1e6" (UID: "f1096bd8-53d4-4403-9abc-dd7a2c91c1e6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:59:58 crc kubenswrapper[4669]: I1008 20:59:58.283313 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flg27\" (UniqueName: \"kubernetes.io/projected/f1096bd8-53d4-4403-9abc-dd7a2c91c1e6-kube-api-access-flg27\") on node \"crc\" DevicePath \"\"" Oct 08 20:59:58 crc kubenswrapper[4669]: I1008 20:59:58.283347 4669 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f1096bd8-53d4-4403-9abc-dd7a2c91c1e6-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 08 20:59:58 crc kubenswrapper[4669]: I1008 20:59:58.283355 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f1096bd8-53d4-4403-9abc-dd7a2c91c1e6-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 20:59:58 crc kubenswrapper[4669]: I1008 20:59:58.283363 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1096bd8-53d4-4403-9abc-dd7a2c91c1e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 20:59:58 crc kubenswrapper[4669]: I1008 20:59:58.283372 4669 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f1096bd8-53d4-4403-9abc-dd7a2c91c1e6-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 08 20:59:58 crc kubenswrapper[4669]: I1008 20:59:58.283380 4669 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/f1096bd8-53d4-4403-9abc-dd7a2c91c1e6-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 08 20:59:58 crc kubenswrapper[4669]: I1008 20:59:58.283390 4669 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f1096bd8-53d4-4403-9abc-dd7a2c91c1e6-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 08 20:59:58 crc kubenswrapper[4669]: I1008 20:59:58.757445 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-58zf2" event={"ID":"f1096bd8-53d4-4403-9abc-dd7a2c91c1e6","Type":"ContainerDied","Data":"f71ac93b89ce6b7661b180ea0042ade0b4f5408bcb25ba58fe62215f8d62265b"} Oct 08 20:59:58 crc kubenswrapper[4669]: I1008 20:59:58.757484 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f71ac93b89ce6b7661b180ea0042ade0b4f5408bcb25ba58fe62215f8d62265b" Oct 08 20:59:58 crc kubenswrapper[4669]: I1008 20:59:58.757608 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-58zf2" Oct 08 21:00:00 crc kubenswrapper[4669]: I1008 21:00:00.146959 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332620-8mh8h"] Oct 08 21:00:00 crc kubenswrapper[4669]: E1008 21:00:00.147422 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2032cb15-6b45-4eca-8ba6-ae485a95a244" containerName="init" Oct 08 21:00:00 crc kubenswrapper[4669]: I1008 21:00:00.147445 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="2032cb15-6b45-4eca-8ba6-ae485a95a244" containerName="init" Oct 08 21:00:00 crc kubenswrapper[4669]: E1008 21:00:00.147465 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2032cb15-6b45-4eca-8ba6-ae485a95a244" containerName="dnsmasq-dns" Oct 08 21:00:00 crc kubenswrapper[4669]: I1008 21:00:00.147477 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="2032cb15-6b45-4eca-8ba6-ae485a95a244" containerName="dnsmasq-dns" Oct 08 21:00:00 crc kubenswrapper[4669]: E1008 21:00:00.147500 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39e5e134-19bb-471d-b4d2-c3344d70fdd1" containerName="mariadb-database-create" Oct 08 21:00:00 crc kubenswrapper[4669]: I1008 21:00:00.147511 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="39e5e134-19bb-471d-b4d2-c3344d70fdd1" containerName="mariadb-database-create" Oct 08 21:00:00 crc kubenswrapper[4669]: E1008 21:00:00.147554 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a191427-574e-4eb1-bb13-c6e494b4ca5b" containerName="mariadb-database-create" Oct 08 21:00:00 crc kubenswrapper[4669]: I1008 21:00:00.147567 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a191427-574e-4eb1-bb13-c6e494b4ca5b" containerName="mariadb-database-create" Oct 08 21:00:00 crc kubenswrapper[4669]: E1008 21:00:00.147588 4669 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ed80c2fd-2488-49d3-a30a-137a44370e04" containerName="mariadb-database-create" Oct 08 21:00:00 crc kubenswrapper[4669]: I1008 21:00:00.147598 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed80c2fd-2488-49d3-a30a-137a44370e04" containerName="mariadb-database-create" Oct 08 21:00:00 crc kubenswrapper[4669]: E1008 21:00:00.147619 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1096bd8-53d4-4403-9abc-dd7a2c91c1e6" containerName="swift-ring-rebalance" Oct 08 21:00:00 crc kubenswrapper[4669]: I1008 21:00:00.147628 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1096bd8-53d4-4403-9abc-dd7a2c91c1e6" containerName="swift-ring-rebalance" Oct 08 21:00:00 crc kubenswrapper[4669]: I1008 21:00:00.147895 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed80c2fd-2488-49d3-a30a-137a44370e04" containerName="mariadb-database-create" Oct 08 21:00:00 crc kubenswrapper[4669]: I1008 21:00:00.147928 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="39e5e134-19bb-471d-b4d2-c3344d70fdd1" containerName="mariadb-database-create" Oct 08 21:00:00 crc kubenswrapper[4669]: I1008 21:00:00.147947 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1096bd8-53d4-4403-9abc-dd7a2c91c1e6" containerName="swift-ring-rebalance" Oct 08 21:00:00 crc kubenswrapper[4669]: I1008 21:00:00.147965 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a191427-574e-4eb1-bb13-c6e494b4ca5b" containerName="mariadb-database-create" Oct 08 21:00:00 crc kubenswrapper[4669]: I1008 21:00:00.147987 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="2032cb15-6b45-4eca-8ba6-ae485a95a244" containerName="dnsmasq-dns" Oct 08 21:00:00 crc kubenswrapper[4669]: I1008 21:00:00.148823 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332620-8mh8h" Oct 08 21:00:00 crc kubenswrapper[4669]: I1008 21:00:00.150936 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 08 21:00:00 crc kubenswrapper[4669]: I1008 21:00:00.151241 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 08 21:00:00 crc kubenswrapper[4669]: I1008 21:00:00.155428 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332620-8mh8h"] Oct 08 21:00:00 crc kubenswrapper[4669]: I1008 21:00:00.319486 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7067c4a6-69cd-456c-aa78-81f45c8cdf7e-secret-volume\") pod \"collect-profiles-29332620-8mh8h\" (UID: \"7067c4a6-69cd-456c-aa78-81f45c8cdf7e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332620-8mh8h" Oct 08 21:00:00 crc kubenswrapper[4669]: I1008 21:00:00.319577 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfb87\" (UniqueName: \"kubernetes.io/projected/7067c4a6-69cd-456c-aa78-81f45c8cdf7e-kube-api-access-vfb87\") pod \"collect-profiles-29332620-8mh8h\" (UID: \"7067c4a6-69cd-456c-aa78-81f45c8cdf7e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332620-8mh8h" Oct 08 21:00:00 crc kubenswrapper[4669]: I1008 21:00:00.319626 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7067c4a6-69cd-456c-aa78-81f45c8cdf7e-config-volume\") pod \"collect-profiles-29332620-8mh8h\" (UID: \"7067c4a6-69cd-456c-aa78-81f45c8cdf7e\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29332620-8mh8h" Oct 08 21:00:00 crc kubenswrapper[4669]: I1008 21:00:00.421380 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7067c4a6-69cd-456c-aa78-81f45c8cdf7e-secret-volume\") pod \"collect-profiles-29332620-8mh8h\" (UID: \"7067c4a6-69cd-456c-aa78-81f45c8cdf7e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332620-8mh8h" Oct 08 21:00:00 crc kubenswrapper[4669]: I1008 21:00:00.421461 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfb87\" (UniqueName: \"kubernetes.io/projected/7067c4a6-69cd-456c-aa78-81f45c8cdf7e-kube-api-access-vfb87\") pod \"collect-profiles-29332620-8mh8h\" (UID: \"7067c4a6-69cd-456c-aa78-81f45c8cdf7e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332620-8mh8h" Oct 08 21:00:00 crc kubenswrapper[4669]: I1008 21:00:00.421502 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7067c4a6-69cd-456c-aa78-81f45c8cdf7e-config-volume\") pod \"collect-profiles-29332620-8mh8h\" (UID: \"7067c4a6-69cd-456c-aa78-81f45c8cdf7e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332620-8mh8h" Oct 08 21:00:00 crc kubenswrapper[4669]: I1008 21:00:00.423707 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7067c4a6-69cd-456c-aa78-81f45c8cdf7e-config-volume\") pod \"collect-profiles-29332620-8mh8h\" (UID: \"7067c4a6-69cd-456c-aa78-81f45c8cdf7e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332620-8mh8h" Oct 08 21:00:00 crc kubenswrapper[4669]: I1008 21:00:00.426514 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/7067c4a6-69cd-456c-aa78-81f45c8cdf7e-secret-volume\") pod \"collect-profiles-29332620-8mh8h\" (UID: \"7067c4a6-69cd-456c-aa78-81f45c8cdf7e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332620-8mh8h" Oct 08 21:00:00 crc kubenswrapper[4669]: I1008 21:00:00.441727 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfb87\" (UniqueName: \"kubernetes.io/projected/7067c4a6-69cd-456c-aa78-81f45c8cdf7e-kube-api-access-vfb87\") pod \"collect-profiles-29332620-8mh8h\" (UID: \"7067c4a6-69cd-456c-aa78-81f45c8cdf7e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332620-8mh8h" Oct 08 21:00:00 crc kubenswrapper[4669]: I1008 21:00:00.468057 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332620-8mh8h" Oct 08 21:00:00 crc kubenswrapper[4669]: I1008 21:00:00.829522 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/efef408f-7f0a-4eb1-a9f8-288a9606bf84-etc-swift\") pod \"swift-storage-0\" (UID: \"efef408f-7f0a-4eb1-a9f8-288a9606bf84\") " pod="openstack/swift-storage-0" Oct 08 21:00:00 crc kubenswrapper[4669]: I1008 21:00:00.836205 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/efef408f-7f0a-4eb1-a9f8-288a9606bf84-etc-swift\") pod \"swift-storage-0\" (UID: \"efef408f-7f0a-4eb1-a9f8-288a9606bf84\") " pod="openstack/swift-storage-0" Oct 08 21:00:00 crc kubenswrapper[4669]: I1008 21:00:00.912699 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332620-8mh8h"] Oct 08 21:00:00 crc kubenswrapper[4669]: W1008 21:00:00.924166 4669 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7067c4a6_69cd_456c_aa78_81f45c8cdf7e.slice/crio-26943086b3da3b25a2e33a79b3f15ed135825d577a26e1810693e4b7ef206af1 WatchSource:0}: Error finding container 26943086b3da3b25a2e33a79b3f15ed135825d577a26e1810693e4b7ef206af1: Status 404 returned error can't find the container with id 26943086b3da3b25a2e33a79b3f15ed135825d577a26e1810693e4b7ef206af1 Oct 08 21:00:00 crc kubenswrapper[4669]: I1008 21:00:00.951791 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 08 21:00:01 crc kubenswrapper[4669]: I1008 21:00:01.485308 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 08 21:00:01 crc kubenswrapper[4669]: I1008 21:00:01.706858 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-0587-account-create-qbln8"] Oct 08 21:00:01 crc kubenswrapper[4669]: I1008 21:00:01.708459 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-0587-account-create-qbln8" Oct 08 21:00:01 crc kubenswrapper[4669]: I1008 21:00:01.711438 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 08 21:00:01 crc kubenswrapper[4669]: I1008 21:00:01.719036 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-0587-account-create-qbln8"] Oct 08 21:00:01 crc kubenswrapper[4669]: I1008 21:00:01.783225 4669 generic.go:334] "Generic (PLEG): container finished" podID="7067c4a6-69cd-456c-aa78-81f45c8cdf7e" containerID="dbc338d1f8d0b38e3f5c5b7f90457ffa425bdeddacac637f15042e29928ab373" exitCode=0 Oct 08 21:00:01 crc kubenswrapper[4669]: I1008 21:00:01.783274 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332620-8mh8h" event={"ID":"7067c4a6-69cd-456c-aa78-81f45c8cdf7e","Type":"ContainerDied","Data":"dbc338d1f8d0b38e3f5c5b7f90457ffa425bdeddacac637f15042e29928ab373"} Oct 08 21:00:01 crc kubenswrapper[4669]: I1008 21:00:01.784777 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332620-8mh8h" event={"ID":"7067c4a6-69cd-456c-aa78-81f45c8cdf7e","Type":"ContainerStarted","Data":"26943086b3da3b25a2e33a79b3f15ed135825d577a26e1810693e4b7ef206af1"} Oct 08 21:00:01 crc kubenswrapper[4669]: I1008 21:00:01.785676 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"efef408f-7f0a-4eb1-a9f8-288a9606bf84","Type":"ContainerStarted","Data":"ec85088788ec577df5b28322b03400508d392f9a44eeb1f28ac2f6b50f71aa0a"} Oct 08 21:00:01 crc kubenswrapper[4669]: I1008 21:00:01.847719 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cx6gl\" (UniqueName: \"kubernetes.io/projected/52156717-724a-4a76-aee8-73e4029ea3a4-kube-api-access-cx6gl\") pod \"keystone-0587-account-create-qbln8\" (UID: 
\"52156717-724a-4a76-aee8-73e4029ea3a4\") " pod="openstack/keystone-0587-account-create-qbln8" Oct 08 21:00:01 crc kubenswrapper[4669]: I1008 21:00:01.949086 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cx6gl\" (UniqueName: \"kubernetes.io/projected/52156717-724a-4a76-aee8-73e4029ea3a4-kube-api-access-cx6gl\") pod \"keystone-0587-account-create-qbln8\" (UID: \"52156717-724a-4a76-aee8-73e4029ea3a4\") " pod="openstack/keystone-0587-account-create-qbln8" Oct 08 21:00:01 crc kubenswrapper[4669]: I1008 21:00:01.967224 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cx6gl\" (UniqueName: \"kubernetes.io/projected/52156717-724a-4a76-aee8-73e4029ea3a4-kube-api-access-cx6gl\") pod \"keystone-0587-account-create-qbln8\" (UID: \"52156717-724a-4a76-aee8-73e4029ea3a4\") " pod="openstack/keystone-0587-account-create-qbln8" Oct 08 21:00:02 crc kubenswrapper[4669]: I1008 21:00:02.006326 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-efd5-account-create-vtsr8"] Oct 08 21:00:02 crc kubenswrapper[4669]: I1008 21:00:02.008195 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-efd5-account-create-vtsr8" Oct 08 21:00:02 crc kubenswrapper[4669]: I1008 21:00:02.010588 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 08 21:00:02 crc kubenswrapper[4669]: I1008 21:00:02.022488 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-efd5-account-create-vtsr8"] Oct 08 21:00:02 crc kubenswrapper[4669]: I1008 21:00:02.029166 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-0587-account-create-qbln8" Oct 08 21:00:02 crc kubenswrapper[4669]: I1008 21:00:02.152765 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2g47\" (UniqueName: \"kubernetes.io/projected/8b9f456f-98d4-4073-a9e5-78c4426cdc20-kube-api-access-b2g47\") pod \"placement-efd5-account-create-vtsr8\" (UID: \"8b9f456f-98d4-4073-a9e5-78c4426cdc20\") " pod="openstack/placement-efd5-account-create-vtsr8" Oct 08 21:00:02 crc kubenswrapper[4669]: I1008 21:00:02.254831 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2g47\" (UniqueName: \"kubernetes.io/projected/8b9f456f-98d4-4073-a9e5-78c4426cdc20-kube-api-access-b2g47\") pod \"placement-efd5-account-create-vtsr8\" (UID: \"8b9f456f-98d4-4073-a9e5-78c4426cdc20\") " pod="openstack/placement-efd5-account-create-vtsr8" Oct 08 21:00:02 crc kubenswrapper[4669]: I1008 21:00:02.271456 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2g47\" (UniqueName: \"kubernetes.io/projected/8b9f456f-98d4-4073-a9e5-78c4426cdc20-kube-api-access-b2g47\") pod \"placement-efd5-account-create-vtsr8\" (UID: \"8b9f456f-98d4-4073-a9e5-78c4426cdc20\") " pod="openstack/placement-efd5-account-create-vtsr8" Oct 08 21:00:02 crc kubenswrapper[4669]: I1008 21:00:02.301969 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-4aa4-account-create-g5gbd"] Oct 08 21:00:02 crc kubenswrapper[4669]: I1008 21:00:02.303282 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-4aa4-account-create-g5gbd" Oct 08 21:00:02 crc kubenswrapper[4669]: I1008 21:00:02.305244 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 08 21:00:02 crc kubenswrapper[4669]: I1008 21:00:02.312286 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-4aa4-account-create-g5gbd"] Oct 08 21:00:02 crc kubenswrapper[4669]: I1008 21:00:02.335959 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-efd5-account-create-vtsr8" Oct 08 21:00:02 crc kubenswrapper[4669]: I1008 21:00:02.457816 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btvnm\" (UniqueName: \"kubernetes.io/projected/ecfd360c-b8d6-4e66-bc00-96b2dd0252f5-kube-api-access-btvnm\") pod \"glance-4aa4-account-create-g5gbd\" (UID: \"ecfd360c-b8d6-4e66-bc00-96b2dd0252f5\") " pod="openstack/glance-4aa4-account-create-g5gbd" Oct 08 21:00:02 crc kubenswrapper[4669]: I1008 21:00:02.459469 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-0587-account-create-qbln8"] Oct 08 21:00:02 crc kubenswrapper[4669]: W1008 21:00:02.469831 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52156717_724a_4a76_aee8_73e4029ea3a4.slice/crio-241b02e149f1bc5080c495acc50ff1f8e7c6246d2a09e0f299032bfedd5a65ed WatchSource:0}: Error finding container 241b02e149f1bc5080c495acc50ff1f8e7c6246d2a09e0f299032bfedd5a65ed: Status 404 returned error can't find the container with id 241b02e149f1bc5080c495acc50ff1f8e7c6246d2a09e0f299032bfedd5a65ed Oct 08 21:00:02 crc kubenswrapper[4669]: I1008 21:00:02.560203 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btvnm\" (UniqueName: 
\"kubernetes.io/projected/ecfd360c-b8d6-4e66-bc00-96b2dd0252f5-kube-api-access-btvnm\") pod \"glance-4aa4-account-create-g5gbd\" (UID: \"ecfd360c-b8d6-4e66-bc00-96b2dd0252f5\") " pod="openstack/glance-4aa4-account-create-g5gbd" Oct 08 21:00:02 crc kubenswrapper[4669]: I1008 21:00:02.580523 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btvnm\" (UniqueName: \"kubernetes.io/projected/ecfd360c-b8d6-4e66-bc00-96b2dd0252f5-kube-api-access-btvnm\") pod \"glance-4aa4-account-create-g5gbd\" (UID: \"ecfd360c-b8d6-4e66-bc00-96b2dd0252f5\") " pod="openstack/glance-4aa4-account-create-g5gbd" Oct 08 21:00:02 crc kubenswrapper[4669]: I1008 21:00:02.624019 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-4aa4-account-create-g5gbd" Oct 08 21:00:02 crc kubenswrapper[4669]: I1008 21:00:02.784207 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-efd5-account-create-vtsr8"] Oct 08 21:00:02 crc kubenswrapper[4669]: I1008 21:00:02.795536 4669 generic.go:334] "Generic (PLEG): container finished" podID="52156717-724a-4a76-aee8-73e4029ea3a4" containerID="33f0e82a3a750fcc4619d98960ce857a29fb7e4077c703ea4c10af050bef930f" exitCode=0 Oct 08 21:00:02 crc kubenswrapper[4669]: I1008 21:00:02.796022 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0587-account-create-qbln8" event={"ID":"52156717-724a-4a76-aee8-73e4029ea3a4","Type":"ContainerDied","Data":"33f0e82a3a750fcc4619d98960ce857a29fb7e4077c703ea4c10af050bef930f"} Oct 08 21:00:02 crc kubenswrapper[4669]: I1008 21:00:02.796054 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0587-account-create-qbln8" event={"ID":"52156717-724a-4a76-aee8-73e4029ea3a4","Type":"ContainerStarted","Data":"241b02e149f1bc5080c495acc50ff1f8e7c6246d2a09e0f299032bfedd5a65ed"} Oct 08 21:00:03 crc kubenswrapper[4669]: I1008 21:00:03.058175 4669 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/glance-4aa4-account-create-g5gbd"] Oct 08 21:00:03 crc kubenswrapper[4669]: W1008 21:00:03.063672 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podecfd360c_b8d6_4e66_bc00_96b2dd0252f5.slice/crio-c95428c680988ba15fe68e2f0e17ff1281cdd4560e894635060b4eadbce5b1e4 WatchSource:0}: Error finding container c95428c680988ba15fe68e2f0e17ff1281cdd4560e894635060b4eadbce5b1e4: Status 404 returned error can't find the container with id c95428c680988ba15fe68e2f0e17ff1281cdd4560e894635060b4eadbce5b1e4 Oct 08 21:00:03 crc kubenswrapper[4669]: I1008 21:00:03.171932 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332620-8mh8h" Oct 08 21:00:03 crc kubenswrapper[4669]: I1008 21:00:03.268960 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7067c4a6-69cd-456c-aa78-81f45c8cdf7e-config-volume\") pod \"7067c4a6-69cd-456c-aa78-81f45c8cdf7e\" (UID: \"7067c4a6-69cd-456c-aa78-81f45c8cdf7e\") " Oct 08 21:00:03 crc kubenswrapper[4669]: I1008 21:00:03.269148 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7067c4a6-69cd-456c-aa78-81f45c8cdf7e-secret-volume\") pod \"7067c4a6-69cd-456c-aa78-81f45c8cdf7e\" (UID: \"7067c4a6-69cd-456c-aa78-81f45c8cdf7e\") " Oct 08 21:00:03 crc kubenswrapper[4669]: I1008 21:00:03.269215 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfb87\" (UniqueName: \"kubernetes.io/projected/7067c4a6-69cd-456c-aa78-81f45c8cdf7e-kube-api-access-vfb87\") pod \"7067c4a6-69cd-456c-aa78-81f45c8cdf7e\" (UID: \"7067c4a6-69cd-456c-aa78-81f45c8cdf7e\") " Oct 08 21:00:03 crc kubenswrapper[4669]: I1008 21:00:03.270128 4669 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7067c4a6-69cd-456c-aa78-81f45c8cdf7e-config-volume" (OuterVolumeSpecName: "config-volume") pod "7067c4a6-69cd-456c-aa78-81f45c8cdf7e" (UID: "7067c4a6-69cd-456c-aa78-81f45c8cdf7e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:00:03 crc kubenswrapper[4669]: I1008 21:00:03.274045 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7067c4a6-69cd-456c-aa78-81f45c8cdf7e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7067c4a6-69cd-456c-aa78-81f45c8cdf7e" (UID: "7067c4a6-69cd-456c-aa78-81f45c8cdf7e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:00:03 crc kubenswrapper[4669]: I1008 21:00:03.274161 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7067c4a6-69cd-456c-aa78-81f45c8cdf7e-kube-api-access-vfb87" (OuterVolumeSpecName: "kube-api-access-vfb87") pod "7067c4a6-69cd-456c-aa78-81f45c8cdf7e" (UID: "7067c4a6-69cd-456c-aa78-81f45c8cdf7e"). InnerVolumeSpecName "kube-api-access-vfb87". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:00:03 crc kubenswrapper[4669]: I1008 21:00:03.370431 4669 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7067c4a6-69cd-456c-aa78-81f45c8cdf7e-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:03 crc kubenswrapper[4669]: I1008 21:00:03.370460 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfb87\" (UniqueName: \"kubernetes.io/projected/7067c4a6-69cd-456c-aa78-81f45c8cdf7e-kube-api-access-vfb87\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:03 crc kubenswrapper[4669]: I1008 21:00:03.370470 4669 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7067c4a6-69cd-456c-aa78-81f45c8cdf7e-config-volume\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:03 crc kubenswrapper[4669]: I1008 21:00:03.804061 4669 generic.go:334] "Generic (PLEG): container finished" podID="8b9f456f-98d4-4073-a9e5-78c4426cdc20" containerID="651fe547f86a83c9a670d0ee6fa2f51d1abe5840616e9b900217192bffcb0a94" exitCode=0 Oct 08 21:00:03 crc kubenswrapper[4669]: I1008 21:00:03.804100 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-efd5-account-create-vtsr8" event={"ID":"8b9f456f-98d4-4073-a9e5-78c4426cdc20","Type":"ContainerDied","Data":"651fe547f86a83c9a670d0ee6fa2f51d1abe5840616e9b900217192bffcb0a94"} Oct 08 21:00:03 crc kubenswrapper[4669]: I1008 21:00:03.804470 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-efd5-account-create-vtsr8" event={"ID":"8b9f456f-98d4-4073-a9e5-78c4426cdc20","Type":"ContainerStarted","Data":"962f65db9e2b5d29182eddda0bf9afba38d2dae056f26f18f2acc668428acbac"} Oct 08 21:00:03 crc kubenswrapper[4669]: I1008 21:00:03.806269 4669 generic.go:334] "Generic (PLEG): container finished" podID="ecfd360c-b8d6-4e66-bc00-96b2dd0252f5" 
containerID="1c760e09b40b028156b7d84f66b605d10c2b25150422362d3cca468eb4d61bc0" exitCode=0 Oct 08 21:00:03 crc kubenswrapper[4669]: I1008 21:00:03.806316 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4aa4-account-create-g5gbd" event={"ID":"ecfd360c-b8d6-4e66-bc00-96b2dd0252f5","Type":"ContainerDied","Data":"1c760e09b40b028156b7d84f66b605d10c2b25150422362d3cca468eb4d61bc0"} Oct 08 21:00:03 crc kubenswrapper[4669]: I1008 21:00:03.806335 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4aa4-account-create-g5gbd" event={"ID":"ecfd360c-b8d6-4e66-bc00-96b2dd0252f5","Type":"ContainerStarted","Data":"c95428c680988ba15fe68e2f0e17ff1281cdd4560e894635060b4eadbce5b1e4"} Oct 08 21:00:03 crc kubenswrapper[4669]: I1008 21:00:03.810050 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332620-8mh8h" Oct 08 21:00:03 crc kubenswrapper[4669]: I1008 21:00:03.810436 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332620-8mh8h" event={"ID":"7067c4a6-69cd-456c-aa78-81f45c8cdf7e","Type":"ContainerDied","Data":"26943086b3da3b25a2e33a79b3f15ed135825d577a26e1810693e4b7ef206af1"} Oct 08 21:00:03 crc kubenswrapper[4669]: I1008 21:00:03.810497 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26943086b3da3b25a2e33a79b3f15ed135825d577a26e1810693e4b7ef206af1" Oct 08 21:00:03 crc kubenswrapper[4669]: I1008 21:00:03.812360 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"efef408f-7f0a-4eb1-a9f8-288a9606bf84","Type":"ContainerStarted","Data":"7ac00dd5eb6fbd69459e8d7bb7913a3a6fdc481a169e7c641d79287f461ae15f"} Oct 08 21:00:04 crc kubenswrapper[4669]: I1008 21:00:04.206480 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-0587-account-create-qbln8" Oct 08 21:00:04 crc kubenswrapper[4669]: I1008 21:00:04.383932 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cx6gl\" (UniqueName: \"kubernetes.io/projected/52156717-724a-4a76-aee8-73e4029ea3a4-kube-api-access-cx6gl\") pod \"52156717-724a-4a76-aee8-73e4029ea3a4\" (UID: \"52156717-724a-4a76-aee8-73e4029ea3a4\") " Oct 08 21:00:04 crc kubenswrapper[4669]: I1008 21:00:04.388830 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52156717-724a-4a76-aee8-73e4029ea3a4-kube-api-access-cx6gl" (OuterVolumeSpecName: "kube-api-access-cx6gl") pod "52156717-724a-4a76-aee8-73e4029ea3a4" (UID: "52156717-724a-4a76-aee8-73e4029ea3a4"). InnerVolumeSpecName "kube-api-access-cx6gl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:00:04 crc kubenswrapper[4669]: I1008 21:00:04.485664 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cx6gl\" (UniqueName: \"kubernetes.io/projected/52156717-724a-4a76-aee8-73e4029ea3a4-kube-api-access-cx6gl\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:04 crc kubenswrapper[4669]: I1008 21:00:04.824819 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"efef408f-7f0a-4eb1-a9f8-288a9606bf84","Type":"ContainerStarted","Data":"6dfac10ce159aa27c3b5273ec082e0b42d36a664759c227739f3663a0de858a6"} Oct 08 21:00:04 crc kubenswrapper[4669]: I1008 21:00:04.824857 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"efef408f-7f0a-4eb1-a9f8-288a9606bf84","Type":"ContainerStarted","Data":"9b50ca8e824c18292b20bd4f52667dce268694b5cf1a25320d7ee5b8196f9510"} Oct 08 21:00:04 crc kubenswrapper[4669]: I1008 21:00:04.824871 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"efef408f-7f0a-4eb1-a9f8-288a9606bf84","Type":"ContainerStarted","Data":"29b6159dde20eae36d142037366fd3fe1b6a314e2f0a0ed3b60bbbdca8ad5781"} Oct 08 21:00:04 crc kubenswrapper[4669]: I1008 21:00:04.826171 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0587-account-create-qbln8" event={"ID":"52156717-724a-4a76-aee8-73e4029ea3a4","Type":"ContainerDied","Data":"241b02e149f1bc5080c495acc50ff1f8e7c6246d2a09e0f299032bfedd5a65ed"} Oct 08 21:00:04 crc kubenswrapper[4669]: I1008 21:00:04.826220 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-0587-account-create-qbln8" Oct 08 21:00:04 crc kubenswrapper[4669]: I1008 21:00:04.826226 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="241b02e149f1bc5080c495acc50ff1f8e7c6246d2a09e0f299032bfedd5a65ed" Oct 08 21:00:05 crc kubenswrapper[4669]: I1008 21:00:05.200928 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-4aa4-account-create-g5gbd" Oct 08 21:00:05 crc kubenswrapper[4669]: I1008 21:00:05.206062 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-efd5-account-create-vtsr8" Oct 08 21:00:05 crc kubenswrapper[4669]: I1008 21:00:05.298995 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btvnm\" (UniqueName: \"kubernetes.io/projected/ecfd360c-b8d6-4e66-bc00-96b2dd0252f5-kube-api-access-btvnm\") pod \"ecfd360c-b8d6-4e66-bc00-96b2dd0252f5\" (UID: \"ecfd360c-b8d6-4e66-bc00-96b2dd0252f5\") " Oct 08 21:00:05 crc kubenswrapper[4669]: I1008 21:00:05.299270 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2g47\" (UniqueName: \"kubernetes.io/projected/8b9f456f-98d4-4073-a9e5-78c4426cdc20-kube-api-access-b2g47\") pod \"8b9f456f-98d4-4073-a9e5-78c4426cdc20\" (UID: \"8b9f456f-98d4-4073-a9e5-78c4426cdc20\") " Oct 08 21:00:05 crc kubenswrapper[4669]: I1008 21:00:05.305807 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecfd360c-b8d6-4e66-bc00-96b2dd0252f5-kube-api-access-btvnm" (OuterVolumeSpecName: "kube-api-access-btvnm") pod "ecfd360c-b8d6-4e66-bc00-96b2dd0252f5" (UID: "ecfd360c-b8d6-4e66-bc00-96b2dd0252f5"). InnerVolumeSpecName "kube-api-access-btvnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:00:05 crc kubenswrapper[4669]: I1008 21:00:05.305918 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b9f456f-98d4-4073-a9e5-78c4426cdc20-kube-api-access-b2g47" (OuterVolumeSpecName: "kube-api-access-b2g47") pod "8b9f456f-98d4-4073-a9e5-78c4426cdc20" (UID: "8b9f456f-98d4-4073-a9e5-78c4426cdc20"). InnerVolumeSpecName "kube-api-access-b2g47". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:00:05 crc kubenswrapper[4669]: I1008 21:00:05.400909 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btvnm\" (UniqueName: \"kubernetes.io/projected/ecfd360c-b8d6-4e66-bc00-96b2dd0252f5-kube-api-access-btvnm\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:05 crc kubenswrapper[4669]: I1008 21:00:05.400951 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2g47\" (UniqueName: \"kubernetes.io/projected/8b9f456f-98d4-4073-a9e5-78c4426cdc20-kube-api-access-b2g47\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:05 crc kubenswrapper[4669]: I1008 21:00:05.838020 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4aa4-account-create-g5gbd" event={"ID":"ecfd360c-b8d6-4e66-bc00-96b2dd0252f5","Type":"ContainerDied","Data":"c95428c680988ba15fe68e2f0e17ff1281cdd4560e894635060b4eadbce5b1e4"} Oct 08 21:00:05 crc kubenswrapper[4669]: I1008 21:00:05.838059 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c95428c680988ba15fe68e2f0e17ff1281cdd4560e894635060b4eadbce5b1e4" Oct 08 21:00:05 crc kubenswrapper[4669]: I1008 21:00:05.838076 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-4aa4-account-create-g5gbd" Oct 08 21:00:05 crc kubenswrapper[4669]: I1008 21:00:05.840333 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"efef408f-7f0a-4eb1-a9f8-288a9606bf84","Type":"ContainerStarted","Data":"14e3400c66a14b8de8960de0b68a9be5e2dcfc06357a5c1799329f5a0f96b5f6"} Oct 08 21:00:05 crc kubenswrapper[4669]: I1008 21:00:05.842032 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-efd5-account-create-vtsr8" event={"ID":"8b9f456f-98d4-4073-a9e5-78c4426cdc20","Type":"ContainerDied","Data":"962f65db9e2b5d29182eddda0bf9afba38d2dae056f26f18f2acc668428acbac"} Oct 08 21:00:05 crc kubenswrapper[4669]: I1008 21:00:05.842068 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="962f65db9e2b5d29182eddda0bf9afba38d2dae056f26f18f2acc668428acbac" Oct 08 21:00:05 crc kubenswrapper[4669]: I1008 21:00:05.842067 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-efd5-account-create-vtsr8" Oct 08 21:00:06 crc kubenswrapper[4669]: I1008 21:00:06.853070 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"efef408f-7f0a-4eb1-a9f8-288a9606bf84","Type":"ContainerStarted","Data":"dd9243c18be1482bce1eda9f620b2362e7213dc42c547881098804772ca6bfd8"} Oct 08 21:00:06 crc kubenswrapper[4669]: I1008 21:00:06.853116 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"efef408f-7f0a-4eb1-a9f8-288a9606bf84","Type":"ContainerStarted","Data":"5e9b4e4678add77f83ada53e901eeb6c39f94ef8bd0d9c290ae4876ab040ae15"} Oct 08 21:00:06 crc kubenswrapper[4669]: I1008 21:00:06.853133 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"efef408f-7f0a-4eb1-a9f8-288a9606bf84","Type":"ContainerStarted","Data":"aca62a19efac88a4b379125421be67ca910e5eb72e305323da6967a916136640"} Oct 08 21:00:07 crc kubenswrapper[4669]: I1008 21:00:07.439766 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-8b8fw"] Oct 08 21:00:07 crc kubenswrapper[4669]: E1008 21:00:07.440335 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b9f456f-98d4-4073-a9e5-78c4426cdc20" containerName="mariadb-account-create" Oct 08 21:00:07 crc kubenswrapper[4669]: I1008 21:00:07.440406 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b9f456f-98d4-4073-a9e5-78c4426cdc20" containerName="mariadb-account-create" Oct 08 21:00:07 crc kubenswrapper[4669]: E1008 21:00:07.440475 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecfd360c-b8d6-4e66-bc00-96b2dd0252f5" containerName="mariadb-account-create" Oct 08 21:00:07 crc kubenswrapper[4669]: I1008 21:00:07.440531 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecfd360c-b8d6-4e66-bc00-96b2dd0252f5" containerName="mariadb-account-create" Oct 08 21:00:07 crc kubenswrapper[4669]: 
E1008 21:00:07.440622 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7067c4a6-69cd-456c-aa78-81f45c8cdf7e" containerName="collect-profiles" Oct 08 21:00:07 crc kubenswrapper[4669]: I1008 21:00:07.440674 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="7067c4a6-69cd-456c-aa78-81f45c8cdf7e" containerName="collect-profiles" Oct 08 21:00:07 crc kubenswrapper[4669]: E1008 21:00:07.440741 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52156717-724a-4a76-aee8-73e4029ea3a4" containerName="mariadb-account-create" Oct 08 21:00:07 crc kubenswrapper[4669]: I1008 21:00:07.440791 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="52156717-724a-4a76-aee8-73e4029ea3a4" containerName="mariadb-account-create" Oct 08 21:00:07 crc kubenswrapper[4669]: I1008 21:00:07.440997 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="52156717-724a-4a76-aee8-73e4029ea3a4" containerName="mariadb-account-create" Oct 08 21:00:07 crc kubenswrapper[4669]: I1008 21:00:07.441092 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="7067c4a6-69cd-456c-aa78-81f45c8cdf7e" containerName="collect-profiles" Oct 08 21:00:07 crc kubenswrapper[4669]: I1008 21:00:07.441160 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecfd360c-b8d6-4e66-bc00-96b2dd0252f5" containerName="mariadb-account-create" Oct 08 21:00:07 crc kubenswrapper[4669]: I1008 21:00:07.441237 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b9f456f-98d4-4073-a9e5-78c4426cdc20" containerName="mariadb-account-create" Oct 08 21:00:07 crc kubenswrapper[4669]: I1008 21:00:07.441834 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-8b8fw" Oct 08 21:00:07 crc kubenswrapper[4669]: I1008 21:00:07.444879 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-2v8q5" Oct 08 21:00:07 crc kubenswrapper[4669]: I1008 21:00:07.445100 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 08 21:00:07 crc kubenswrapper[4669]: I1008 21:00:07.480362 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-wnkk4" Oct 08 21:00:07 crc kubenswrapper[4669]: I1008 21:00:07.498980 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-2mvd2" podUID="2e96f6f2-b5f3-49e7-8d84-15d5535963a2" containerName="ovn-controller" probeResult="failure" output=< Oct 08 21:00:07 crc kubenswrapper[4669]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 08 21:00:07 crc kubenswrapper[4669]: > Oct 08 21:00:07 crc kubenswrapper[4669]: I1008 21:00:07.500294 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-wnkk4" Oct 08 21:00:07 crc kubenswrapper[4669]: I1008 21:00:07.525273 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-8b8fw"] Oct 08 21:00:07 crc kubenswrapper[4669]: I1008 21:00:07.532727 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cbf7e56-6c38-4ee3-8096-875162b3576f-config-data\") pod \"glance-db-sync-8b8fw\" (UID: \"9cbf7e56-6c38-4ee3-8096-875162b3576f\") " pod="openstack/glance-db-sync-8b8fw" Oct 08 21:00:07 crc kubenswrapper[4669]: I1008 21:00:07.532799 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/9cbf7e56-6c38-4ee3-8096-875162b3576f-db-sync-config-data\") pod \"glance-db-sync-8b8fw\" (UID: \"9cbf7e56-6c38-4ee3-8096-875162b3576f\") " pod="openstack/glance-db-sync-8b8fw" Oct 08 21:00:07 crc kubenswrapper[4669]: I1008 21:00:07.532834 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cbf7e56-6c38-4ee3-8096-875162b3576f-combined-ca-bundle\") pod \"glance-db-sync-8b8fw\" (UID: \"9cbf7e56-6c38-4ee3-8096-875162b3576f\") " pod="openstack/glance-db-sync-8b8fw" Oct 08 21:00:07 crc kubenswrapper[4669]: I1008 21:00:07.532875 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bhf7\" (UniqueName: \"kubernetes.io/projected/9cbf7e56-6c38-4ee3-8096-875162b3576f-kube-api-access-6bhf7\") pod \"glance-db-sync-8b8fw\" (UID: \"9cbf7e56-6c38-4ee3-8096-875162b3576f\") " pod="openstack/glance-db-sync-8b8fw" Oct 08 21:00:07 crc kubenswrapper[4669]: I1008 21:00:07.635371 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cbf7e56-6c38-4ee3-8096-875162b3576f-config-data\") pod \"glance-db-sync-8b8fw\" (UID: \"9cbf7e56-6c38-4ee3-8096-875162b3576f\") " pod="openstack/glance-db-sync-8b8fw" Oct 08 21:00:07 crc kubenswrapper[4669]: I1008 21:00:07.635441 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9cbf7e56-6c38-4ee3-8096-875162b3576f-db-sync-config-data\") pod \"glance-db-sync-8b8fw\" (UID: \"9cbf7e56-6c38-4ee3-8096-875162b3576f\") " pod="openstack/glance-db-sync-8b8fw" Oct 08 21:00:07 crc kubenswrapper[4669]: I1008 21:00:07.635482 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9cbf7e56-6c38-4ee3-8096-875162b3576f-combined-ca-bundle\") pod \"glance-db-sync-8b8fw\" (UID: \"9cbf7e56-6c38-4ee3-8096-875162b3576f\") " pod="openstack/glance-db-sync-8b8fw" Oct 08 21:00:07 crc kubenswrapper[4669]: I1008 21:00:07.635525 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bhf7\" (UniqueName: \"kubernetes.io/projected/9cbf7e56-6c38-4ee3-8096-875162b3576f-kube-api-access-6bhf7\") pod \"glance-db-sync-8b8fw\" (UID: \"9cbf7e56-6c38-4ee3-8096-875162b3576f\") " pod="openstack/glance-db-sync-8b8fw" Oct 08 21:00:07 crc kubenswrapper[4669]: I1008 21:00:07.641787 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9cbf7e56-6c38-4ee3-8096-875162b3576f-db-sync-config-data\") pod \"glance-db-sync-8b8fw\" (UID: \"9cbf7e56-6c38-4ee3-8096-875162b3576f\") " pod="openstack/glance-db-sync-8b8fw" Oct 08 21:00:07 crc kubenswrapper[4669]: I1008 21:00:07.644334 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cbf7e56-6c38-4ee3-8096-875162b3576f-config-data\") pod \"glance-db-sync-8b8fw\" (UID: \"9cbf7e56-6c38-4ee3-8096-875162b3576f\") " pod="openstack/glance-db-sync-8b8fw" Oct 08 21:00:07 crc kubenswrapper[4669]: I1008 21:00:07.645055 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cbf7e56-6c38-4ee3-8096-875162b3576f-combined-ca-bundle\") pod \"glance-db-sync-8b8fw\" (UID: \"9cbf7e56-6c38-4ee3-8096-875162b3576f\") " pod="openstack/glance-db-sync-8b8fw" Oct 08 21:00:07 crc kubenswrapper[4669]: I1008 21:00:07.655766 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bhf7\" (UniqueName: \"kubernetes.io/projected/9cbf7e56-6c38-4ee3-8096-875162b3576f-kube-api-access-6bhf7\") pod \"glance-db-sync-8b8fw\" (UID: 
\"9cbf7e56-6c38-4ee3-8096-875162b3576f\") " pod="openstack/glance-db-sync-8b8fw" Oct 08 21:00:07 crc kubenswrapper[4669]: I1008 21:00:07.703795 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-2mvd2-config-m978c"] Oct 08 21:00:07 crc kubenswrapper[4669]: I1008 21:00:07.705040 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-2mvd2-config-m978c" Oct 08 21:00:07 crc kubenswrapper[4669]: W1008 21:00:07.707116 4669 reflector.go:561] object-"openstack"/"ovncontroller-extra-scripts": failed to list *v1.ConfigMap: configmaps "ovncontroller-extra-scripts" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Oct 08 21:00:07 crc kubenswrapper[4669]: E1008 21:00:07.707164 4669 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"ovncontroller-extra-scripts\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"ovncontroller-extra-scripts\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 08 21:00:07 crc kubenswrapper[4669]: I1008 21:00:07.718178 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-2mvd2-config-m978c"] Oct 08 21:00:07 crc kubenswrapper[4669]: I1008 21:00:07.779435 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-8b8fw" Oct 08 21:00:07 crc kubenswrapper[4669]: I1008 21:00:07.838675 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jv5f9\" (UniqueName: \"kubernetes.io/projected/bd881e8b-bc7c-4432-bb48-53294b45da11-kube-api-access-jv5f9\") pod \"ovn-controller-2mvd2-config-m978c\" (UID: \"bd881e8b-bc7c-4432-bb48-53294b45da11\") " pod="openstack/ovn-controller-2mvd2-config-m978c" Oct 08 21:00:07 crc kubenswrapper[4669]: I1008 21:00:07.838814 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/bd881e8b-bc7c-4432-bb48-53294b45da11-var-run-ovn\") pod \"ovn-controller-2mvd2-config-m978c\" (UID: \"bd881e8b-bc7c-4432-bb48-53294b45da11\") " pod="openstack/ovn-controller-2mvd2-config-m978c" Oct 08 21:00:07 crc kubenswrapper[4669]: I1008 21:00:07.838903 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd881e8b-bc7c-4432-bb48-53294b45da11-scripts\") pod \"ovn-controller-2mvd2-config-m978c\" (UID: \"bd881e8b-bc7c-4432-bb48-53294b45da11\") " pod="openstack/ovn-controller-2mvd2-config-m978c" Oct 08 21:00:07 crc kubenswrapper[4669]: I1008 21:00:07.839000 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/bd881e8b-bc7c-4432-bb48-53294b45da11-var-run\") pod \"ovn-controller-2mvd2-config-m978c\" (UID: \"bd881e8b-bc7c-4432-bb48-53294b45da11\") " pod="openstack/ovn-controller-2mvd2-config-m978c" Oct 08 21:00:07 crc kubenswrapper[4669]: I1008 21:00:07.839044 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/bd881e8b-bc7c-4432-bb48-53294b45da11-additional-scripts\") 
pod \"ovn-controller-2mvd2-config-m978c\" (UID: \"bd881e8b-bc7c-4432-bb48-53294b45da11\") " pod="openstack/ovn-controller-2mvd2-config-m978c" Oct 08 21:00:07 crc kubenswrapper[4669]: I1008 21:00:07.839104 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/bd881e8b-bc7c-4432-bb48-53294b45da11-var-log-ovn\") pod \"ovn-controller-2mvd2-config-m978c\" (UID: \"bd881e8b-bc7c-4432-bb48-53294b45da11\") " pod="openstack/ovn-controller-2mvd2-config-m978c" Oct 08 21:00:07 crc kubenswrapper[4669]: I1008 21:00:07.895795 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"efef408f-7f0a-4eb1-a9f8-288a9606bf84","Type":"ContainerStarted","Data":"cfd2ea86a2755ed2fbc950cd299f6f8dbfb3954929be906c2f6d813f8575227c"} Oct 08 21:00:07 crc kubenswrapper[4669]: I1008 21:00:07.902023 4669 generic.go:334] "Generic (PLEG): container finished" podID="b4f648df-77f7-4480-8b46-3f776880db17" containerID="cd3fd70fc2d5d3707c6cba8aa9868955a85e808c3f7e26a6168a2bbcffde2d37" exitCode=0 Oct 08 21:00:07 crc kubenswrapper[4669]: I1008 21:00:07.902133 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b4f648df-77f7-4480-8b46-3f776880db17","Type":"ContainerDied","Data":"cd3fd70fc2d5d3707c6cba8aa9868955a85e808c3f7e26a6168a2bbcffde2d37"} Oct 08 21:00:07 crc kubenswrapper[4669]: I1008 21:00:07.905534 4669 generic.go:334] "Generic (PLEG): container finished" podID="a63d3545-a64d-4c9a-9198-bf11fc782cc6" containerID="75c93ee37543d1d800805efc4adf4932d8c7190981cfcea370d27111c6ead9ab" exitCode=0 Oct 08 21:00:07 crc kubenswrapper[4669]: I1008 21:00:07.906687 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a63d3545-a64d-4c9a-9198-bf11fc782cc6","Type":"ContainerDied","Data":"75c93ee37543d1d800805efc4adf4932d8c7190981cfcea370d27111c6ead9ab"} Oct 08 21:00:07 crc 
kubenswrapper[4669]: I1008 21:00:07.940360 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jv5f9\" (UniqueName: \"kubernetes.io/projected/bd881e8b-bc7c-4432-bb48-53294b45da11-kube-api-access-jv5f9\") pod \"ovn-controller-2mvd2-config-m978c\" (UID: \"bd881e8b-bc7c-4432-bb48-53294b45da11\") " pod="openstack/ovn-controller-2mvd2-config-m978c" Oct 08 21:00:07 crc kubenswrapper[4669]: I1008 21:00:07.940465 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/bd881e8b-bc7c-4432-bb48-53294b45da11-var-run-ovn\") pod \"ovn-controller-2mvd2-config-m978c\" (UID: \"bd881e8b-bc7c-4432-bb48-53294b45da11\") " pod="openstack/ovn-controller-2mvd2-config-m978c" Oct 08 21:00:07 crc kubenswrapper[4669]: I1008 21:00:07.940503 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd881e8b-bc7c-4432-bb48-53294b45da11-scripts\") pod \"ovn-controller-2mvd2-config-m978c\" (UID: \"bd881e8b-bc7c-4432-bb48-53294b45da11\") " pod="openstack/ovn-controller-2mvd2-config-m978c" Oct 08 21:00:07 crc kubenswrapper[4669]: I1008 21:00:07.940523 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/bd881e8b-bc7c-4432-bb48-53294b45da11-var-run\") pod \"ovn-controller-2mvd2-config-m978c\" (UID: \"bd881e8b-bc7c-4432-bb48-53294b45da11\") " pod="openstack/ovn-controller-2mvd2-config-m978c" Oct 08 21:00:07 crc kubenswrapper[4669]: I1008 21:00:07.940585 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/bd881e8b-bc7c-4432-bb48-53294b45da11-additional-scripts\") pod \"ovn-controller-2mvd2-config-m978c\" (UID: \"bd881e8b-bc7c-4432-bb48-53294b45da11\") " pod="openstack/ovn-controller-2mvd2-config-m978c" Oct 08 21:00:07 crc 
kubenswrapper[4669]: I1008 21:00:07.940625 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/bd881e8b-bc7c-4432-bb48-53294b45da11-var-log-ovn\") pod \"ovn-controller-2mvd2-config-m978c\" (UID: \"bd881e8b-bc7c-4432-bb48-53294b45da11\") " pod="openstack/ovn-controller-2mvd2-config-m978c" Oct 08 21:00:07 crc kubenswrapper[4669]: I1008 21:00:07.940759 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/bd881e8b-bc7c-4432-bb48-53294b45da11-var-run-ovn\") pod \"ovn-controller-2mvd2-config-m978c\" (UID: \"bd881e8b-bc7c-4432-bb48-53294b45da11\") " pod="openstack/ovn-controller-2mvd2-config-m978c" Oct 08 21:00:07 crc kubenswrapper[4669]: I1008 21:00:07.941157 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/bd881e8b-bc7c-4432-bb48-53294b45da11-var-run\") pod \"ovn-controller-2mvd2-config-m978c\" (UID: \"bd881e8b-bc7c-4432-bb48-53294b45da11\") " pod="openstack/ovn-controller-2mvd2-config-m978c" Oct 08 21:00:07 crc kubenswrapper[4669]: I1008 21:00:07.943106 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd881e8b-bc7c-4432-bb48-53294b45da11-scripts\") pod \"ovn-controller-2mvd2-config-m978c\" (UID: \"bd881e8b-bc7c-4432-bb48-53294b45da11\") " pod="openstack/ovn-controller-2mvd2-config-m978c" Oct 08 21:00:07 crc kubenswrapper[4669]: I1008 21:00:07.944420 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/bd881e8b-bc7c-4432-bb48-53294b45da11-var-log-ovn\") pod \"ovn-controller-2mvd2-config-m978c\" (UID: \"bd881e8b-bc7c-4432-bb48-53294b45da11\") " pod="openstack/ovn-controller-2mvd2-config-m978c" Oct 08 21:00:07 crc kubenswrapper[4669]: I1008 21:00:07.960299 4669 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-jv5f9\" (UniqueName: \"kubernetes.io/projected/bd881e8b-bc7c-4432-bb48-53294b45da11-kube-api-access-jv5f9\") pod \"ovn-controller-2mvd2-config-m978c\" (UID: \"bd881e8b-bc7c-4432-bb48-53294b45da11\") " pod="openstack/ovn-controller-2mvd2-config-m978c" Oct 08 21:00:08 crc kubenswrapper[4669]: I1008 21:00:08.385970 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-8b8fw"] Oct 08 21:00:08 crc kubenswrapper[4669]: I1008 21:00:08.919084 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"efef408f-7f0a-4eb1-a9f8-288a9606bf84","Type":"ContainerStarted","Data":"46b6198c0051dbf64715de9f6f4cc057e3700f86169425fb0d30292d042a4dd3"} Oct 08 21:00:08 crc kubenswrapper[4669]: I1008 21:00:08.919129 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"efef408f-7f0a-4eb1-a9f8-288a9606bf84","Type":"ContainerStarted","Data":"264fb4252019e04e185e8aea3418e8659843a5635180cd0f94218580fcd15f58"} Oct 08 21:00:08 crc kubenswrapper[4669]: I1008 21:00:08.919140 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"efef408f-7f0a-4eb1-a9f8-288a9606bf84","Type":"ContainerStarted","Data":"019b8e493ba2b7cde6ec880194025853a688ef760bae467c9e619585b9689b15"} Oct 08 21:00:08 crc kubenswrapper[4669]: I1008 21:00:08.919149 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"efef408f-7f0a-4eb1-a9f8-288a9606bf84","Type":"ContainerStarted","Data":"d8fb453423661fc3a19cb48ad6c029ed59d60e1bd37a694bc122d9710a562806"} Oct 08 21:00:08 crc kubenswrapper[4669]: I1008 21:00:08.919161 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"efef408f-7f0a-4eb1-a9f8-288a9606bf84","Type":"ContainerStarted","Data":"ad22e3d83a483db795a1f07305de3ec75ef605442d7ca5356e25f6c2ec809bbf"} Oct 08 21:00:08 crc 
kubenswrapper[4669]: I1008 21:00:08.919169 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"efef408f-7f0a-4eb1-a9f8-288a9606bf84","Type":"ContainerStarted","Data":"6501062320cf23f0f52f99c46334c9b17bff5e92fc240aa59535f51aee857cb3"} Oct 08 21:00:08 crc kubenswrapper[4669]: I1008 21:00:08.923144 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b4f648df-77f7-4480-8b46-3f776880db17","Type":"ContainerStarted","Data":"9fd8fb3e04505e8ce2e2875bca7667bbfaaef9a8494247534bd6d63df3691a48"} Oct 08 21:00:08 crc kubenswrapper[4669]: I1008 21:00:08.923686 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 08 21:00:08 crc kubenswrapper[4669]: I1008 21:00:08.928885 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a63d3545-a64d-4c9a-9198-bf11fc782cc6","Type":"ContainerStarted","Data":"24c4c8a6872491732e4bea46f87649abfc6c7c74749c4b05fd8c0d0abecff3fb"} Oct 08 21:00:08 crc kubenswrapper[4669]: I1008 21:00:08.929128 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 08 21:00:08 crc kubenswrapper[4669]: I1008 21:00:08.931447 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-8b8fw" event={"ID":"9cbf7e56-6c38-4ee3-8096-875162b3576f","Type":"ContainerStarted","Data":"b34632dc7a1639cc31e644409617f45bc30c084aca4ce7cfd8e03f3f71494d7d"} Oct 08 21:00:08 crc kubenswrapper[4669]: E1008 21:00:08.941888 4669 configmap.go:193] Couldn't get configMap openstack/ovncontroller-extra-scripts: failed to sync configmap cache: timed out waiting for the condition Oct 08 21:00:08 crc kubenswrapper[4669]: E1008 21:00:08.942294 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bd881e8b-bc7c-4432-bb48-53294b45da11-additional-scripts 
podName:bd881e8b-bc7c-4432-bb48-53294b45da11 nodeName:}" failed. No retries permitted until 2025-10-08 21:00:09.442270568 +0000 UTC m=+929.135081261 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "additional-scripts" (UniqueName: "kubernetes.io/configmap/bd881e8b-bc7c-4432-bb48-53294b45da11-additional-scripts") pod "ovn-controller-2mvd2-config-m978c" (UID: "bd881e8b-bc7c-4432-bb48-53294b45da11") : failed to sync configmap cache: timed out waiting for the condition Oct 08 21:00:08 crc kubenswrapper[4669]: I1008 21:00:08.961738 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=19.842699464 podStartE2EDuration="25.961718294s" podCreationTimestamp="2025-10-08 20:59:43 +0000 UTC" firstStartedPulling="2025-10-08 21:00:01.510105219 +0000 UTC m=+921.202915882" lastFinishedPulling="2025-10-08 21:00:07.629124039 +0000 UTC m=+927.321934712" observedRunningTime="2025-10-08 21:00:08.956311695 +0000 UTC m=+928.649122378" watchObservedRunningTime="2025-10-08 21:00:08.961718294 +0000 UTC m=+928.654528967" Oct 08 21:00:08 crc kubenswrapper[4669]: I1008 21:00:08.997271 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=50.810829936 podStartE2EDuration="1m1.997245943s" podCreationTimestamp="2025-10-08 20:59:07 +0000 UTC" firstStartedPulling="2025-10-08 20:59:17.469773491 +0000 UTC m=+877.162584164" lastFinishedPulling="2025-10-08 20:59:28.656189478 +0000 UTC m=+888.349000171" observedRunningTime="2025-10-08 21:00:08.993244724 +0000 UTC m=+928.686055417" watchObservedRunningTime="2025-10-08 21:00:08.997245943 +0000 UTC m=+928.690056616" Oct 08 21:00:09 crc kubenswrapper[4669]: I1008 21:00:09.018059 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=52.764111954 podStartE2EDuration="1m3.018040186s" podCreationTimestamp="2025-10-08 
20:59:06 +0000 UTC" firstStartedPulling="2025-10-08 20:59:18.858730965 +0000 UTC m=+878.551541638" lastFinishedPulling="2025-10-08 20:59:29.112659157 +0000 UTC m=+888.805469870" observedRunningTime="2025-10-08 21:00:09.014059677 +0000 UTC m=+928.706870350" watchObservedRunningTime="2025-10-08 21:00:09.018040186 +0000 UTC m=+928.710850869" Oct 08 21:00:09 crc kubenswrapper[4669]: I1008 21:00:09.230145 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-dzpfw"] Oct 08 21:00:09 crc kubenswrapper[4669]: I1008 21:00:09.231796 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-dzpfw" Oct 08 21:00:09 crc kubenswrapper[4669]: I1008 21:00:09.234273 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 08 21:00:09 crc kubenswrapper[4669]: I1008 21:00:09.245930 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-dzpfw"] Oct 08 21:00:09 crc kubenswrapper[4669]: I1008 21:00:09.260722 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-dzpfw\" (UID: \"9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-dzpfw" Oct 08 21:00:09 crc kubenswrapper[4669]: I1008 21:00:09.260796 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd-config\") pod \"dnsmasq-dns-6d5b6d6b67-dzpfw\" (UID: \"9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-dzpfw" Oct 08 21:00:09 crc kubenswrapper[4669]: I1008 21:00:09.260816 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-dzpfw\" (UID: \"9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-dzpfw" Oct 08 21:00:09 crc kubenswrapper[4669]: I1008 21:00:09.260861 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-dzpfw\" (UID: \"9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-dzpfw" Oct 08 21:00:09 crc kubenswrapper[4669]: I1008 21:00:09.260892 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-dzpfw\" (UID: \"9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-dzpfw" Oct 08 21:00:09 crc kubenswrapper[4669]: I1008 21:00:09.260921 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79fmx\" (UniqueName: \"kubernetes.io/projected/9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd-kube-api-access-79fmx\") pod \"dnsmasq-dns-6d5b6d6b67-dzpfw\" (UID: \"9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-dzpfw" Oct 08 21:00:09 crc kubenswrapper[4669]: I1008 21:00:09.262744 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 08 21:00:09 crc kubenswrapper[4669]: I1008 21:00:09.362392 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd-config\") pod \"dnsmasq-dns-6d5b6d6b67-dzpfw\" (UID: \"9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd\") " 
pod="openstack/dnsmasq-dns-6d5b6d6b67-dzpfw" Oct 08 21:00:09 crc kubenswrapper[4669]: I1008 21:00:09.362444 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-dzpfw\" (UID: \"9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-dzpfw" Oct 08 21:00:09 crc kubenswrapper[4669]: I1008 21:00:09.362512 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-dzpfw\" (UID: \"9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-dzpfw" Oct 08 21:00:09 crc kubenswrapper[4669]: I1008 21:00:09.362638 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-dzpfw\" (UID: \"9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-dzpfw" Oct 08 21:00:09 crc kubenswrapper[4669]: I1008 21:00:09.362679 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79fmx\" (UniqueName: \"kubernetes.io/projected/9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd-kube-api-access-79fmx\") pod \"dnsmasq-dns-6d5b6d6b67-dzpfw\" (UID: \"9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-dzpfw" Oct 08 21:00:09 crc kubenswrapper[4669]: I1008 21:00:09.362736 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-dzpfw\" (UID: \"9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-dzpfw" 
Oct 08 21:00:09 crc kubenswrapper[4669]: I1008 21:00:09.363734 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-dzpfw\" (UID: \"9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-dzpfw" Oct 08 21:00:09 crc kubenswrapper[4669]: I1008 21:00:09.363837 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-dzpfw\" (UID: \"9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-dzpfw" Oct 08 21:00:09 crc kubenswrapper[4669]: I1008 21:00:09.363970 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-dzpfw\" (UID: \"9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-dzpfw" Oct 08 21:00:09 crc kubenswrapper[4669]: I1008 21:00:09.363995 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd-config\") pod \"dnsmasq-dns-6d5b6d6b67-dzpfw\" (UID: \"9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-dzpfw" Oct 08 21:00:09 crc kubenswrapper[4669]: I1008 21:00:09.364604 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-dzpfw\" (UID: \"9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-dzpfw" Oct 08 21:00:09 crc kubenswrapper[4669]: I1008 21:00:09.405457 4669 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-79fmx\" (UniqueName: \"kubernetes.io/projected/9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd-kube-api-access-79fmx\") pod \"dnsmasq-dns-6d5b6d6b67-dzpfw\" (UID: \"9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-dzpfw" Oct 08 21:00:09 crc kubenswrapper[4669]: I1008 21:00:09.464271 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/bd881e8b-bc7c-4432-bb48-53294b45da11-additional-scripts\") pod \"ovn-controller-2mvd2-config-m978c\" (UID: \"bd881e8b-bc7c-4432-bb48-53294b45da11\") " pod="openstack/ovn-controller-2mvd2-config-m978c" Oct 08 21:00:09 crc kubenswrapper[4669]: I1008 21:00:09.464994 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/bd881e8b-bc7c-4432-bb48-53294b45da11-additional-scripts\") pod \"ovn-controller-2mvd2-config-m978c\" (UID: \"bd881e8b-bc7c-4432-bb48-53294b45da11\") " pod="openstack/ovn-controller-2mvd2-config-m978c" Oct 08 21:00:09 crc kubenswrapper[4669]: I1008 21:00:09.537294 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-2mvd2-config-m978c" Oct 08 21:00:09 crc kubenswrapper[4669]: I1008 21:00:09.549149 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-dzpfw" Oct 08 21:00:10 crc kubenswrapper[4669]: I1008 21:00:10.040376 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-2mvd2-config-m978c"] Oct 08 21:00:10 crc kubenswrapper[4669]: W1008 21:00:10.050605 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd881e8b_bc7c_4432_bb48_53294b45da11.slice/crio-e1588a1bbd1429c3727166aa0c3ed2dd0653a2f5b75ca74471a5f84794aaa554 WatchSource:0}: Error finding container e1588a1bbd1429c3727166aa0c3ed2dd0653a2f5b75ca74471a5f84794aaa554: Status 404 returned error can't find the container with id e1588a1bbd1429c3727166aa0c3ed2dd0653a2f5b75ca74471a5f84794aaa554 Oct 08 21:00:10 crc kubenswrapper[4669]: I1008 21:00:10.152583 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-dzpfw"] Oct 08 21:00:10 crc kubenswrapper[4669]: W1008 21:00:10.166311 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b8a0c57_b2ab_40f6_b1d3_2f69446ad0cd.slice/crio-97f560f4b3c8f7de8efac111ae52a97440ecba29a29581272d7fdfe3885b2836 WatchSource:0}: Error finding container 97f560f4b3c8f7de8efac111ae52a97440ecba29a29581272d7fdfe3885b2836: Status 404 returned error can't find the container with id 97f560f4b3c8f7de8efac111ae52a97440ecba29a29581272d7fdfe3885b2836 Oct 08 21:00:10 crc kubenswrapper[4669]: I1008 21:00:10.947060 4669 generic.go:334] "Generic (PLEG): container finished" podID="9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd" containerID="d18124ed67ebe9f4815fec07af3a833b66fabfd7ad40ea4dbb14920b53fc8dbf" exitCode=0 Oct 08 21:00:10 crc kubenswrapper[4669]: I1008 21:00:10.947142 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-dzpfw" 
event={"ID":"9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd","Type":"ContainerDied","Data":"d18124ed67ebe9f4815fec07af3a833b66fabfd7ad40ea4dbb14920b53fc8dbf"} Oct 08 21:00:10 crc kubenswrapper[4669]: I1008 21:00:10.947742 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-dzpfw" event={"ID":"9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd","Type":"ContainerStarted","Data":"97f560f4b3c8f7de8efac111ae52a97440ecba29a29581272d7fdfe3885b2836"} Oct 08 21:00:10 crc kubenswrapper[4669]: I1008 21:00:10.950240 4669 generic.go:334] "Generic (PLEG): container finished" podID="bd881e8b-bc7c-4432-bb48-53294b45da11" containerID="c87ef135ae1c63d7bce5bcc12b59351fee15690078ed663152d548cfed0c0be0" exitCode=0 Oct 08 21:00:10 crc kubenswrapper[4669]: I1008 21:00:10.950275 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2mvd2-config-m978c" event={"ID":"bd881e8b-bc7c-4432-bb48-53294b45da11","Type":"ContainerDied","Data":"c87ef135ae1c63d7bce5bcc12b59351fee15690078ed663152d548cfed0c0be0"} Oct 08 21:00:10 crc kubenswrapper[4669]: I1008 21:00:10.950291 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2mvd2-config-m978c" event={"ID":"bd881e8b-bc7c-4432-bb48-53294b45da11","Type":"ContainerStarted","Data":"e1588a1bbd1429c3727166aa0c3ed2dd0653a2f5b75ca74471a5f84794aaa554"} Oct 08 21:00:11 crc kubenswrapper[4669]: I1008 21:00:11.962232 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-dzpfw" event={"ID":"9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd","Type":"ContainerStarted","Data":"53d08d1ee28cd37c456c3a05a6d5b23f416740de7750be9af47c44141d09a776"} Oct 08 21:00:11 crc kubenswrapper[4669]: I1008 21:00:11.962597 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d5b6d6b67-dzpfw" Oct 08 21:00:11 crc kubenswrapper[4669]: I1008 21:00:11.983742 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-6d5b6d6b67-dzpfw" podStartSLOduration=2.98371829 podStartE2EDuration="2.98371829s" podCreationTimestamp="2025-10-08 21:00:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:00:11.980612304 +0000 UTC m=+931.673422977" watchObservedRunningTime="2025-10-08 21:00:11.98371829 +0000 UTC m=+931.676528963" Oct 08 21:00:12 crc kubenswrapper[4669]: I1008 21:00:12.254740 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-2mvd2-config-m978c" Oct 08 21:00:12 crc kubenswrapper[4669]: I1008 21:00:12.358214 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-2mvd2" Oct 08 21:00:12 crc kubenswrapper[4669]: I1008 21:00:12.451527 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd881e8b-bc7c-4432-bb48-53294b45da11-scripts\") pod \"bd881e8b-bc7c-4432-bb48-53294b45da11\" (UID: \"bd881e8b-bc7c-4432-bb48-53294b45da11\") " Oct 08 21:00:12 crc kubenswrapper[4669]: I1008 21:00:12.451603 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/bd881e8b-bc7c-4432-bb48-53294b45da11-var-run\") pod \"bd881e8b-bc7c-4432-bb48-53294b45da11\" (UID: \"bd881e8b-bc7c-4432-bb48-53294b45da11\") " Oct 08 21:00:12 crc kubenswrapper[4669]: I1008 21:00:12.451710 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jv5f9\" (UniqueName: \"kubernetes.io/projected/bd881e8b-bc7c-4432-bb48-53294b45da11-kube-api-access-jv5f9\") pod \"bd881e8b-bc7c-4432-bb48-53294b45da11\" (UID: \"bd881e8b-bc7c-4432-bb48-53294b45da11\") " Oct 08 21:00:12 crc kubenswrapper[4669]: I1008 21:00:12.451906 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/bd881e8b-bc7c-4432-bb48-53294b45da11-var-run" (OuterVolumeSpecName: "var-run") pod "bd881e8b-bc7c-4432-bb48-53294b45da11" (UID: "bd881e8b-bc7c-4432-bb48-53294b45da11"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 21:00:12 crc kubenswrapper[4669]: I1008 21:00:12.452064 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/bd881e8b-bc7c-4432-bb48-53294b45da11-var-log-ovn\") pod \"bd881e8b-bc7c-4432-bb48-53294b45da11\" (UID: \"bd881e8b-bc7c-4432-bb48-53294b45da11\") " Oct 08 21:00:12 crc kubenswrapper[4669]: I1008 21:00:12.452091 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/bd881e8b-bc7c-4432-bb48-53294b45da11-var-run-ovn\") pod \"bd881e8b-bc7c-4432-bb48-53294b45da11\" (UID: \"bd881e8b-bc7c-4432-bb48-53294b45da11\") " Oct 08 21:00:12 crc kubenswrapper[4669]: I1008 21:00:12.452115 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/bd881e8b-bc7c-4432-bb48-53294b45da11-additional-scripts\") pod \"bd881e8b-bc7c-4432-bb48-53294b45da11\" (UID: \"bd881e8b-bc7c-4432-bb48-53294b45da11\") " Oct 08 21:00:12 crc kubenswrapper[4669]: I1008 21:00:12.452177 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bd881e8b-bc7c-4432-bb48-53294b45da11-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "bd881e8b-bc7c-4432-bb48-53294b45da11" (UID: "bd881e8b-bc7c-4432-bb48-53294b45da11"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 21:00:12 crc kubenswrapper[4669]: I1008 21:00:12.452151 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bd881e8b-bc7c-4432-bb48-53294b45da11-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "bd881e8b-bc7c-4432-bb48-53294b45da11" (UID: "bd881e8b-bc7c-4432-bb48-53294b45da11"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 21:00:12 crc kubenswrapper[4669]: I1008 21:00:12.452470 4669 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/bd881e8b-bc7c-4432-bb48-53294b45da11-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:12 crc kubenswrapper[4669]: I1008 21:00:12.452483 4669 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/bd881e8b-bc7c-4432-bb48-53294b45da11-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:12 crc kubenswrapper[4669]: I1008 21:00:12.452497 4669 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/bd881e8b-bc7c-4432-bb48-53294b45da11-var-run\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:12 crc kubenswrapper[4669]: I1008 21:00:12.452827 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd881e8b-bc7c-4432-bb48-53294b45da11-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "bd881e8b-bc7c-4432-bb48-53294b45da11" (UID: "bd881e8b-bc7c-4432-bb48-53294b45da11"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:00:12 crc kubenswrapper[4669]: I1008 21:00:12.453767 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd881e8b-bc7c-4432-bb48-53294b45da11-scripts" (OuterVolumeSpecName: "scripts") pod "bd881e8b-bc7c-4432-bb48-53294b45da11" (UID: "bd881e8b-bc7c-4432-bb48-53294b45da11"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:00:12 crc kubenswrapper[4669]: I1008 21:00:12.465662 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd881e8b-bc7c-4432-bb48-53294b45da11-kube-api-access-jv5f9" (OuterVolumeSpecName: "kube-api-access-jv5f9") pod "bd881e8b-bc7c-4432-bb48-53294b45da11" (UID: "bd881e8b-bc7c-4432-bb48-53294b45da11"). InnerVolumeSpecName "kube-api-access-jv5f9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:00:12 crc kubenswrapper[4669]: I1008 21:00:12.554198 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd881e8b-bc7c-4432-bb48-53294b45da11-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:12 crc kubenswrapper[4669]: I1008 21:00:12.554237 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jv5f9\" (UniqueName: \"kubernetes.io/projected/bd881e8b-bc7c-4432-bb48-53294b45da11-kube-api-access-jv5f9\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:12 crc kubenswrapper[4669]: I1008 21:00:12.554248 4669 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/bd881e8b-bc7c-4432-bb48-53294b45da11-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:12 crc kubenswrapper[4669]: I1008 21:00:12.975040 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2mvd2-config-m978c" 
event={"ID":"bd881e8b-bc7c-4432-bb48-53294b45da11","Type":"ContainerDied","Data":"e1588a1bbd1429c3727166aa0c3ed2dd0653a2f5b75ca74471a5f84794aaa554"} Oct 08 21:00:12 crc kubenswrapper[4669]: I1008 21:00:12.975076 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-2mvd2-config-m978c" Oct 08 21:00:12 crc kubenswrapper[4669]: I1008 21:00:12.975093 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1588a1bbd1429c3727166aa0c3ed2dd0653a2f5b75ca74471a5f84794aaa554" Oct 08 21:00:13 crc kubenswrapper[4669]: I1008 21:00:13.185166 4669 patch_prober.go:28] interesting pod/machine-config-daemon-hw2kf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 21:00:13 crc kubenswrapper[4669]: I1008 21:00:13.185230 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 21:00:13 crc kubenswrapper[4669]: I1008 21:00:13.185272 4669 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" Oct 08 21:00:13 crc kubenswrapper[4669]: I1008 21:00:13.186023 4669 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e212469b959f799f6dd101756cbc798d4bd5c61d90207df29dcf3db6ccbd05d1"} pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 21:00:13 crc kubenswrapper[4669]: I1008 
21:00:13.186108 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" containerID="cri-o://e212469b959f799f6dd101756cbc798d4bd5c61d90207df29dcf3db6ccbd05d1" gracePeriod=600 Oct 08 21:00:13 crc kubenswrapper[4669]: I1008 21:00:13.372491 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-2mvd2-config-m978c"] Oct 08 21:00:13 crc kubenswrapper[4669]: I1008 21:00:13.379517 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-2mvd2-config-m978c"] Oct 08 21:00:13 crc kubenswrapper[4669]: I1008 21:00:13.459696 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-2mvd2-config-s8sv9"] Oct 08 21:00:13 crc kubenswrapper[4669]: E1008 21:00:13.462425 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd881e8b-bc7c-4432-bb48-53294b45da11" containerName="ovn-config" Oct 08 21:00:13 crc kubenswrapper[4669]: I1008 21:00:13.462500 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd881e8b-bc7c-4432-bb48-53294b45da11" containerName="ovn-config" Oct 08 21:00:13 crc kubenswrapper[4669]: I1008 21:00:13.462858 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd881e8b-bc7c-4432-bb48-53294b45da11" containerName="ovn-config" Oct 08 21:00:13 crc kubenswrapper[4669]: I1008 21:00:13.463586 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-2mvd2-config-s8sv9" Oct 08 21:00:13 crc kubenswrapper[4669]: I1008 21:00:13.466783 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 08 21:00:13 crc kubenswrapper[4669]: I1008 21:00:13.472651 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/bd19aea5-8b8a-4550-8f64-2e3780f7525e-var-run-ovn\") pod \"ovn-controller-2mvd2-config-s8sv9\" (UID: \"bd19aea5-8b8a-4550-8f64-2e3780f7525e\") " pod="openstack/ovn-controller-2mvd2-config-s8sv9" Oct 08 21:00:13 crc kubenswrapper[4669]: I1008 21:00:13.472831 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/bd19aea5-8b8a-4550-8f64-2e3780f7525e-var-log-ovn\") pod \"ovn-controller-2mvd2-config-s8sv9\" (UID: \"bd19aea5-8b8a-4550-8f64-2e3780f7525e\") " pod="openstack/ovn-controller-2mvd2-config-s8sv9" Oct 08 21:00:13 crc kubenswrapper[4669]: I1008 21:00:13.472951 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/bd19aea5-8b8a-4550-8f64-2e3780f7525e-additional-scripts\") pod \"ovn-controller-2mvd2-config-s8sv9\" (UID: \"bd19aea5-8b8a-4550-8f64-2e3780f7525e\") " pod="openstack/ovn-controller-2mvd2-config-s8sv9" Oct 08 21:00:13 crc kubenswrapper[4669]: I1008 21:00:13.473041 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfmm2\" (UniqueName: \"kubernetes.io/projected/bd19aea5-8b8a-4550-8f64-2e3780f7525e-kube-api-access-mfmm2\") pod \"ovn-controller-2mvd2-config-s8sv9\" (UID: \"bd19aea5-8b8a-4550-8f64-2e3780f7525e\") " pod="openstack/ovn-controller-2mvd2-config-s8sv9" Oct 08 21:00:13 crc kubenswrapper[4669]: I1008 21:00:13.473195 4669 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/bd19aea5-8b8a-4550-8f64-2e3780f7525e-var-run\") pod \"ovn-controller-2mvd2-config-s8sv9\" (UID: \"bd19aea5-8b8a-4550-8f64-2e3780f7525e\") " pod="openstack/ovn-controller-2mvd2-config-s8sv9" Oct 08 21:00:13 crc kubenswrapper[4669]: I1008 21:00:13.473285 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd19aea5-8b8a-4550-8f64-2e3780f7525e-scripts\") pod \"ovn-controller-2mvd2-config-s8sv9\" (UID: \"bd19aea5-8b8a-4550-8f64-2e3780f7525e\") " pod="openstack/ovn-controller-2mvd2-config-s8sv9" Oct 08 21:00:13 crc kubenswrapper[4669]: I1008 21:00:13.473735 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-2mvd2-config-s8sv9"] Oct 08 21:00:13 crc kubenswrapper[4669]: I1008 21:00:13.575208 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/bd19aea5-8b8a-4550-8f64-2e3780f7525e-var-run-ovn\") pod \"ovn-controller-2mvd2-config-s8sv9\" (UID: \"bd19aea5-8b8a-4550-8f64-2e3780f7525e\") " pod="openstack/ovn-controller-2mvd2-config-s8sv9" Oct 08 21:00:13 crc kubenswrapper[4669]: I1008 21:00:13.575263 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/bd19aea5-8b8a-4550-8f64-2e3780f7525e-var-log-ovn\") pod \"ovn-controller-2mvd2-config-s8sv9\" (UID: \"bd19aea5-8b8a-4550-8f64-2e3780f7525e\") " pod="openstack/ovn-controller-2mvd2-config-s8sv9" Oct 08 21:00:13 crc kubenswrapper[4669]: I1008 21:00:13.575290 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/bd19aea5-8b8a-4550-8f64-2e3780f7525e-additional-scripts\") pod 
\"ovn-controller-2mvd2-config-s8sv9\" (UID: \"bd19aea5-8b8a-4550-8f64-2e3780f7525e\") " pod="openstack/ovn-controller-2mvd2-config-s8sv9" Oct 08 21:00:13 crc kubenswrapper[4669]: I1008 21:00:13.575329 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfmm2\" (UniqueName: \"kubernetes.io/projected/bd19aea5-8b8a-4550-8f64-2e3780f7525e-kube-api-access-mfmm2\") pod \"ovn-controller-2mvd2-config-s8sv9\" (UID: \"bd19aea5-8b8a-4550-8f64-2e3780f7525e\") " pod="openstack/ovn-controller-2mvd2-config-s8sv9" Oct 08 21:00:13 crc kubenswrapper[4669]: I1008 21:00:13.575390 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/bd19aea5-8b8a-4550-8f64-2e3780f7525e-var-run\") pod \"ovn-controller-2mvd2-config-s8sv9\" (UID: \"bd19aea5-8b8a-4550-8f64-2e3780f7525e\") " pod="openstack/ovn-controller-2mvd2-config-s8sv9" Oct 08 21:00:13 crc kubenswrapper[4669]: I1008 21:00:13.575417 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd19aea5-8b8a-4550-8f64-2e3780f7525e-scripts\") pod \"ovn-controller-2mvd2-config-s8sv9\" (UID: \"bd19aea5-8b8a-4550-8f64-2e3780f7525e\") " pod="openstack/ovn-controller-2mvd2-config-s8sv9" Oct 08 21:00:13 crc kubenswrapper[4669]: I1008 21:00:13.575510 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/bd19aea5-8b8a-4550-8f64-2e3780f7525e-var-run-ovn\") pod \"ovn-controller-2mvd2-config-s8sv9\" (UID: \"bd19aea5-8b8a-4550-8f64-2e3780f7525e\") " pod="openstack/ovn-controller-2mvd2-config-s8sv9" Oct 08 21:00:13 crc kubenswrapper[4669]: I1008 21:00:13.576200 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/bd19aea5-8b8a-4550-8f64-2e3780f7525e-additional-scripts\") pod 
\"ovn-controller-2mvd2-config-s8sv9\" (UID: \"bd19aea5-8b8a-4550-8f64-2e3780f7525e\") " pod="openstack/ovn-controller-2mvd2-config-s8sv9" Oct 08 21:00:13 crc kubenswrapper[4669]: I1008 21:00:13.576261 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/bd19aea5-8b8a-4550-8f64-2e3780f7525e-var-log-ovn\") pod \"ovn-controller-2mvd2-config-s8sv9\" (UID: \"bd19aea5-8b8a-4550-8f64-2e3780f7525e\") " pod="openstack/ovn-controller-2mvd2-config-s8sv9" Oct 08 21:00:13 crc kubenswrapper[4669]: I1008 21:00:13.576583 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/bd19aea5-8b8a-4550-8f64-2e3780f7525e-var-run\") pod \"ovn-controller-2mvd2-config-s8sv9\" (UID: \"bd19aea5-8b8a-4550-8f64-2e3780f7525e\") " pod="openstack/ovn-controller-2mvd2-config-s8sv9" Oct 08 21:00:13 crc kubenswrapper[4669]: I1008 21:00:13.579824 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd19aea5-8b8a-4550-8f64-2e3780f7525e-scripts\") pod \"ovn-controller-2mvd2-config-s8sv9\" (UID: \"bd19aea5-8b8a-4550-8f64-2e3780f7525e\") " pod="openstack/ovn-controller-2mvd2-config-s8sv9" Oct 08 21:00:13 crc kubenswrapper[4669]: I1008 21:00:13.596904 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfmm2\" (UniqueName: \"kubernetes.io/projected/bd19aea5-8b8a-4550-8f64-2e3780f7525e-kube-api-access-mfmm2\") pod \"ovn-controller-2mvd2-config-s8sv9\" (UID: \"bd19aea5-8b8a-4550-8f64-2e3780f7525e\") " pod="openstack/ovn-controller-2mvd2-config-s8sv9" Oct 08 21:00:13 crc kubenswrapper[4669]: I1008 21:00:13.790834 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-2mvd2-config-s8sv9" Oct 08 21:00:14 crc kubenswrapper[4669]: I1008 21:00:14.004359 4669 generic.go:334] "Generic (PLEG): container finished" podID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerID="e212469b959f799f6dd101756cbc798d4bd5c61d90207df29dcf3db6ccbd05d1" exitCode=0 Oct 08 21:00:14 crc kubenswrapper[4669]: I1008 21:00:14.008421 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" event={"ID":"39c9bcf2-9580-4534-8c7e-886bd4aff469","Type":"ContainerDied","Data":"e212469b959f799f6dd101756cbc798d4bd5c61d90207df29dcf3db6ccbd05d1"} Oct 08 21:00:14 crc kubenswrapper[4669]: I1008 21:00:14.008483 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" event={"ID":"39c9bcf2-9580-4534-8c7e-886bd4aff469","Type":"ContainerStarted","Data":"fc8abec09504bb79a99269d94867e82d2072a920f3270921c1c4a731ac29aaaf"} Oct 08 21:00:14 crc kubenswrapper[4669]: I1008 21:00:14.008511 4669 scope.go:117] "RemoveContainer" containerID="4988e2f99ae9422660aeb112dbeb7f72ef85e0a64c0c7a60db05121da7a422d0" Oct 08 21:00:14 crc kubenswrapper[4669]: I1008 21:00:14.247945 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-2mvd2-config-s8sv9"] Oct 08 21:00:15 crc kubenswrapper[4669]: I1008 21:00:15.344832 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd881e8b-bc7c-4432-bb48-53294b45da11" path="/var/lib/kubelet/pods/bd881e8b-bc7c-4432-bb48-53294b45da11/volumes" Oct 08 21:00:18 crc kubenswrapper[4669]: I1008 21:00:18.146093 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 08 21:00:18 crc kubenswrapper[4669]: I1008 21:00:18.408844 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-gt2lp"] Oct 08 21:00:18 crc kubenswrapper[4669]: I1008 21:00:18.409883 4669 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-gt2lp" Oct 08 21:00:18 crc kubenswrapper[4669]: I1008 21:00:18.422792 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-gt2lp"] Oct 08 21:00:18 crc kubenswrapper[4669]: I1008 21:00:18.459709 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 08 21:00:18 crc kubenswrapper[4669]: I1008 21:00:18.521218 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-lkvvk"] Oct 08 21:00:18 crc kubenswrapper[4669]: I1008 21:00:18.527188 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-lkvvk" Oct 08 21:00:18 crc kubenswrapper[4669]: I1008 21:00:18.556376 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-lkvvk"] Oct 08 21:00:18 crc kubenswrapper[4669]: I1008 21:00:18.564491 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnqb7\" (UniqueName: \"kubernetes.io/projected/ffe90a4e-1ec4-4628-bada-7e2d0eedb4d7-kube-api-access-dnqb7\") pod \"barbican-db-create-gt2lp\" (UID: \"ffe90a4e-1ec4-4628-bada-7e2d0eedb4d7\") " pod="openstack/barbican-db-create-gt2lp" Oct 08 21:00:18 crc kubenswrapper[4669]: I1008 21:00:18.672415 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fplgm\" (UniqueName: \"kubernetes.io/projected/a3fe45b8-92b8-47ff-9f71-c908c64e2866-kube-api-access-fplgm\") pod \"cinder-db-create-lkvvk\" (UID: \"a3fe45b8-92b8-47ff-9f71-c908c64e2866\") " pod="openstack/cinder-db-create-lkvvk" Oct 08 21:00:18 crc kubenswrapper[4669]: I1008 21:00:18.672498 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnqb7\" (UniqueName: 
\"kubernetes.io/projected/ffe90a4e-1ec4-4628-bada-7e2d0eedb4d7-kube-api-access-dnqb7\") pod \"barbican-db-create-gt2lp\" (UID: \"ffe90a4e-1ec4-4628-bada-7e2d0eedb4d7\") " pod="openstack/barbican-db-create-gt2lp" Oct 08 21:00:18 crc kubenswrapper[4669]: I1008 21:00:18.691968 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnqb7\" (UniqueName: \"kubernetes.io/projected/ffe90a4e-1ec4-4628-bada-7e2d0eedb4d7-kube-api-access-dnqb7\") pod \"barbican-db-create-gt2lp\" (UID: \"ffe90a4e-1ec4-4628-bada-7e2d0eedb4d7\") " pod="openstack/barbican-db-create-gt2lp" Oct 08 21:00:18 crc kubenswrapper[4669]: I1008 21:00:18.716251 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-r5bs4"] Oct 08 21:00:18 crc kubenswrapper[4669]: I1008 21:00:18.717585 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-r5bs4" Oct 08 21:00:18 crc kubenswrapper[4669]: I1008 21:00:18.728361 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-r5bs4"] Oct 08 21:00:18 crc kubenswrapper[4669]: I1008 21:00:18.731216 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-gt2lp" Oct 08 21:00:18 crc kubenswrapper[4669]: I1008 21:00:18.773059 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-z2w59"] Oct 08 21:00:18 crc kubenswrapper[4669]: I1008 21:00:18.773833 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fplgm\" (UniqueName: \"kubernetes.io/projected/a3fe45b8-92b8-47ff-9f71-c908c64e2866-kube-api-access-fplgm\") pod \"cinder-db-create-lkvvk\" (UID: \"a3fe45b8-92b8-47ff-9f71-c908c64e2866\") " pod="openstack/cinder-db-create-lkvvk" Oct 08 21:00:18 crc kubenswrapper[4669]: I1008 21:00:18.774914 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-z2w59" Oct 08 21:00:18 crc kubenswrapper[4669]: I1008 21:00:18.777259 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-llg6k" Oct 08 21:00:18 crc kubenswrapper[4669]: I1008 21:00:18.777484 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 08 21:00:18 crc kubenswrapper[4669]: I1008 21:00:18.777677 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 08 21:00:18 crc kubenswrapper[4669]: I1008 21:00:18.778465 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 08 21:00:18 crc kubenswrapper[4669]: I1008 21:00:18.787889 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-z2w59"] Oct 08 21:00:18 crc kubenswrapper[4669]: I1008 21:00:18.810825 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fplgm\" (UniqueName: \"kubernetes.io/projected/a3fe45b8-92b8-47ff-9f71-c908c64e2866-kube-api-access-fplgm\") pod \"cinder-db-create-lkvvk\" (UID: \"a3fe45b8-92b8-47ff-9f71-c908c64e2866\") " pod="openstack/cinder-db-create-lkvvk" Oct 08 21:00:18 crc kubenswrapper[4669]: I1008 21:00:18.857204 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-lkvvk" Oct 08 21:00:18 crc kubenswrapper[4669]: I1008 21:00:18.875315 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/057267cc-ede4-488e-94a2-37caa8cb9557-config-data\") pod \"keystone-db-sync-z2w59\" (UID: \"057267cc-ede4-488e-94a2-37caa8cb9557\") " pod="openstack/keystone-db-sync-z2w59" Oct 08 21:00:18 crc kubenswrapper[4669]: I1008 21:00:18.875558 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8n2pl\" (UniqueName: \"kubernetes.io/projected/f2134180-0294-4e53-bc38-5d062b5585ff-kube-api-access-8n2pl\") pod \"neutron-db-create-r5bs4\" (UID: \"f2134180-0294-4e53-bc38-5d062b5585ff\") " pod="openstack/neutron-db-create-r5bs4" Oct 08 21:00:18 crc kubenswrapper[4669]: I1008 21:00:18.875753 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rq49\" (UniqueName: \"kubernetes.io/projected/057267cc-ede4-488e-94a2-37caa8cb9557-kube-api-access-2rq49\") pod \"keystone-db-sync-z2w59\" (UID: \"057267cc-ede4-488e-94a2-37caa8cb9557\") " pod="openstack/keystone-db-sync-z2w59" Oct 08 21:00:18 crc kubenswrapper[4669]: I1008 21:00:18.875854 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/057267cc-ede4-488e-94a2-37caa8cb9557-combined-ca-bundle\") pod \"keystone-db-sync-z2w59\" (UID: \"057267cc-ede4-488e-94a2-37caa8cb9557\") " pod="openstack/keystone-db-sync-z2w59" Oct 08 21:00:18 crc kubenswrapper[4669]: I1008 21:00:18.977659 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rq49\" (UniqueName: \"kubernetes.io/projected/057267cc-ede4-488e-94a2-37caa8cb9557-kube-api-access-2rq49\") pod \"keystone-db-sync-z2w59\" (UID: 
\"057267cc-ede4-488e-94a2-37caa8cb9557\") " pod="openstack/keystone-db-sync-z2w59" Oct 08 21:00:18 crc kubenswrapper[4669]: I1008 21:00:18.977742 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/057267cc-ede4-488e-94a2-37caa8cb9557-combined-ca-bundle\") pod \"keystone-db-sync-z2w59\" (UID: \"057267cc-ede4-488e-94a2-37caa8cb9557\") " pod="openstack/keystone-db-sync-z2w59" Oct 08 21:00:18 crc kubenswrapper[4669]: I1008 21:00:18.977802 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/057267cc-ede4-488e-94a2-37caa8cb9557-config-data\") pod \"keystone-db-sync-z2w59\" (UID: \"057267cc-ede4-488e-94a2-37caa8cb9557\") " pod="openstack/keystone-db-sync-z2w59" Oct 08 21:00:18 crc kubenswrapper[4669]: I1008 21:00:18.977828 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8n2pl\" (UniqueName: \"kubernetes.io/projected/f2134180-0294-4e53-bc38-5d062b5585ff-kube-api-access-8n2pl\") pod \"neutron-db-create-r5bs4\" (UID: \"f2134180-0294-4e53-bc38-5d062b5585ff\") " pod="openstack/neutron-db-create-r5bs4" Oct 08 21:00:18 crc kubenswrapper[4669]: I1008 21:00:18.983035 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/057267cc-ede4-488e-94a2-37caa8cb9557-config-data\") pod \"keystone-db-sync-z2w59\" (UID: \"057267cc-ede4-488e-94a2-37caa8cb9557\") " pod="openstack/keystone-db-sync-z2w59" Oct 08 21:00:18 crc kubenswrapper[4669]: I1008 21:00:18.985065 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/057267cc-ede4-488e-94a2-37caa8cb9557-combined-ca-bundle\") pod \"keystone-db-sync-z2w59\" (UID: \"057267cc-ede4-488e-94a2-37caa8cb9557\") " pod="openstack/keystone-db-sync-z2w59" Oct 08 21:00:18 crc 
kubenswrapper[4669]: I1008 21:00:18.995261 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8n2pl\" (UniqueName: \"kubernetes.io/projected/f2134180-0294-4e53-bc38-5d062b5585ff-kube-api-access-8n2pl\") pod \"neutron-db-create-r5bs4\" (UID: \"f2134180-0294-4e53-bc38-5d062b5585ff\") " pod="openstack/neutron-db-create-r5bs4" Oct 08 21:00:19 crc kubenswrapper[4669]: I1008 21:00:19.005195 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rq49\" (UniqueName: \"kubernetes.io/projected/057267cc-ede4-488e-94a2-37caa8cb9557-kube-api-access-2rq49\") pod \"keystone-db-sync-z2w59\" (UID: \"057267cc-ede4-488e-94a2-37caa8cb9557\") " pod="openstack/keystone-db-sync-z2w59" Oct 08 21:00:19 crc kubenswrapper[4669]: I1008 21:00:19.067323 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-r5bs4" Oct 08 21:00:19 crc kubenswrapper[4669]: I1008 21:00:19.092893 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-z2w59" Oct 08 21:00:19 crc kubenswrapper[4669]: I1008 21:00:19.551679 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d5b6d6b67-dzpfw" Oct 08 21:00:19 crc kubenswrapper[4669]: I1008 21:00:19.607175 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-shjxs"] Oct 08 21:00:19 crc kubenswrapper[4669]: I1008 21:00:19.607749 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-shjxs" podUID="c11ff661-d7fa-45fa-bec1-555472ca36e7" containerName="dnsmasq-dns" containerID="cri-o://0f2271a3e9828669f523eca195b51ab01eb47be63e9d949f65007052c943e48c" gracePeriod=10 Oct 08 21:00:20 crc kubenswrapper[4669]: I1008 21:00:20.074245 4669 generic.go:334] "Generic (PLEG): container finished" podID="c11ff661-d7fa-45fa-bec1-555472ca36e7" containerID="0f2271a3e9828669f523eca195b51ab01eb47be63e9d949f65007052c943e48c" exitCode=0 Oct 08 21:00:20 crc kubenswrapper[4669]: I1008 21:00:20.074284 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-shjxs" event={"ID":"c11ff661-d7fa-45fa-bec1-555472ca36e7","Type":"ContainerDied","Data":"0f2271a3e9828669f523eca195b51ab01eb47be63e9d949f65007052c943e48c"} Oct 08 21:00:21 crc kubenswrapper[4669]: I1008 21:00:21.678755 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-shjxs" Oct 08 21:00:21 crc kubenswrapper[4669]: I1008 21:00:21.832365 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c11ff661-d7fa-45fa-bec1-555472ca36e7-ovsdbserver-nb\") pod \"c11ff661-d7fa-45fa-bec1-555472ca36e7\" (UID: \"c11ff661-d7fa-45fa-bec1-555472ca36e7\") " Oct 08 21:00:21 crc kubenswrapper[4669]: I1008 21:00:21.832765 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fs78p\" (UniqueName: \"kubernetes.io/projected/c11ff661-d7fa-45fa-bec1-555472ca36e7-kube-api-access-fs78p\") pod \"c11ff661-d7fa-45fa-bec1-555472ca36e7\" (UID: \"c11ff661-d7fa-45fa-bec1-555472ca36e7\") " Oct 08 21:00:21 crc kubenswrapper[4669]: I1008 21:00:21.832829 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c11ff661-d7fa-45fa-bec1-555472ca36e7-config\") pod \"c11ff661-d7fa-45fa-bec1-555472ca36e7\" (UID: \"c11ff661-d7fa-45fa-bec1-555472ca36e7\") " Oct 08 21:00:21 crc kubenswrapper[4669]: I1008 21:00:21.832874 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c11ff661-d7fa-45fa-bec1-555472ca36e7-ovsdbserver-sb\") pod \"c11ff661-d7fa-45fa-bec1-555472ca36e7\" (UID: \"c11ff661-d7fa-45fa-bec1-555472ca36e7\") " Oct 08 21:00:21 crc kubenswrapper[4669]: I1008 21:00:21.832975 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c11ff661-d7fa-45fa-bec1-555472ca36e7-dns-svc\") pod \"c11ff661-d7fa-45fa-bec1-555472ca36e7\" (UID: \"c11ff661-d7fa-45fa-bec1-555472ca36e7\") " Oct 08 21:00:21 crc kubenswrapper[4669]: I1008 21:00:21.849075 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/c11ff661-d7fa-45fa-bec1-555472ca36e7-kube-api-access-fs78p" (OuterVolumeSpecName: "kube-api-access-fs78p") pod "c11ff661-d7fa-45fa-bec1-555472ca36e7" (UID: "c11ff661-d7fa-45fa-bec1-555472ca36e7"). InnerVolumeSpecName "kube-api-access-fs78p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:00:21 crc kubenswrapper[4669]: I1008 21:00:21.892959 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c11ff661-d7fa-45fa-bec1-555472ca36e7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c11ff661-d7fa-45fa-bec1-555472ca36e7" (UID: "c11ff661-d7fa-45fa-bec1-555472ca36e7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:00:21 crc kubenswrapper[4669]: I1008 21:00:21.901167 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c11ff661-d7fa-45fa-bec1-555472ca36e7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c11ff661-d7fa-45fa-bec1-555472ca36e7" (UID: "c11ff661-d7fa-45fa-bec1-555472ca36e7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:00:21 crc kubenswrapper[4669]: I1008 21:00:21.904371 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c11ff661-d7fa-45fa-bec1-555472ca36e7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c11ff661-d7fa-45fa-bec1-555472ca36e7" (UID: "c11ff661-d7fa-45fa-bec1-555472ca36e7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:00:21 crc kubenswrapper[4669]: I1008 21:00:21.905432 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c11ff661-d7fa-45fa-bec1-555472ca36e7-config" (OuterVolumeSpecName: "config") pod "c11ff661-d7fa-45fa-bec1-555472ca36e7" (UID: "c11ff661-d7fa-45fa-bec1-555472ca36e7"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:00:21 crc kubenswrapper[4669]: I1008 21:00:21.934978 4669 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c11ff661-d7fa-45fa-bec1-555472ca36e7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:21 crc kubenswrapper[4669]: I1008 21:00:21.935009 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fs78p\" (UniqueName: \"kubernetes.io/projected/c11ff661-d7fa-45fa-bec1-555472ca36e7-kube-api-access-fs78p\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:21 crc kubenswrapper[4669]: I1008 21:00:21.935021 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c11ff661-d7fa-45fa-bec1-555472ca36e7-config\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:21 crc kubenswrapper[4669]: I1008 21:00:21.935030 4669 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c11ff661-d7fa-45fa-bec1-555472ca36e7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:21 crc kubenswrapper[4669]: I1008 21:00:21.935038 4669 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c11ff661-d7fa-45fa-bec1-555472ca36e7-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:22 crc kubenswrapper[4669]: I1008 21:00:22.046189 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-gt2lp"] Oct 08 21:00:22 crc kubenswrapper[4669]: W1008 21:00:22.053041 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podffe90a4e_1ec4_4628_bada_7e2d0eedb4d7.slice/crio-bc4c1b6117e10bb9626113ca655aaba3f6f953950213a71409db1be1f91cc7eb WatchSource:0}: Error finding container bc4c1b6117e10bb9626113ca655aaba3f6f953950213a71409db1be1f91cc7eb: Status 404 returned error can't find the 
container with id bc4c1b6117e10bb9626113ca655aaba3f6f953950213a71409db1be1f91cc7eb Oct 08 21:00:22 crc kubenswrapper[4669]: I1008 21:00:22.092982 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-r5bs4"] Oct 08 21:00:22 crc kubenswrapper[4669]: I1008 21:00:22.094627 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-gt2lp" event={"ID":"ffe90a4e-1ec4-4628-bada-7e2d0eedb4d7","Type":"ContainerStarted","Data":"bc4c1b6117e10bb9626113ca655aaba3f6f953950213a71409db1be1f91cc7eb"} Oct 08 21:00:22 crc kubenswrapper[4669]: I1008 21:00:22.096205 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2mvd2-config-s8sv9" event={"ID":"bd19aea5-8b8a-4550-8f64-2e3780f7525e","Type":"ContainerStarted","Data":"fa86118dfbbefa110fa99cfc57c08478773cb2edc4caf210613dda42aaf58e99"} Oct 08 21:00:22 crc kubenswrapper[4669]: I1008 21:00:22.096257 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2mvd2-config-s8sv9" event={"ID":"bd19aea5-8b8a-4550-8f64-2e3780f7525e","Type":"ContainerStarted","Data":"b41cfd06fc12071df8a64a97f54ce28b464b62cb185e61b55dfba92adab09204"} Oct 08 21:00:22 crc kubenswrapper[4669]: I1008 21:00:22.100606 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-lkvvk"] Oct 08 21:00:22 crc kubenswrapper[4669]: I1008 21:00:22.106409 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-shjxs" event={"ID":"c11ff661-d7fa-45fa-bec1-555472ca36e7","Type":"ContainerDied","Data":"31c1aa8271f780af9d7c9f8c2e675119feefb6c20df453a2818d64aa59b45a91"} Oct 08 21:00:22 crc kubenswrapper[4669]: I1008 21:00:22.106466 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-shjxs" Oct 08 21:00:22 crc kubenswrapper[4669]: I1008 21:00:22.106491 4669 scope.go:117] "RemoveContainer" containerID="0f2271a3e9828669f523eca195b51ab01eb47be63e9d949f65007052c943e48c" Oct 08 21:00:22 crc kubenswrapper[4669]: W1008 21:00:22.106829 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2134180_0294_4e53_bc38_5d062b5585ff.slice/crio-c5c59e5962b972b9d175e4c9fd44a2c12ed407aa5b0449540645857a43afbd42 WatchSource:0}: Error finding container c5c59e5962b972b9d175e4c9fd44a2c12ed407aa5b0449540645857a43afbd42: Status 404 returned error can't find the container with id c5c59e5962b972b9d175e4c9fd44a2c12ed407aa5b0449540645857a43afbd42 Oct 08 21:00:22 crc kubenswrapper[4669]: I1008 21:00:22.113074 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-2mvd2-config-s8sv9" podStartSLOduration=9.113056309 podStartE2EDuration="9.113056309s" podCreationTimestamp="2025-10-08 21:00:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:00:22.112629118 +0000 UTC m=+941.805439811" watchObservedRunningTime="2025-10-08 21:00:22.113056309 +0000 UTC m=+941.805866982" Oct 08 21:00:22 crc kubenswrapper[4669]: W1008 21:00:22.123232 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3fe45b8_92b8_47ff_9f71_c908c64e2866.slice/crio-48b2f0d49c9327d7e760a9165a72c04724e6aae7356e4db4591495a212e7f246 WatchSource:0}: Error finding container 48b2f0d49c9327d7e760a9165a72c04724e6aae7356e4db4591495a212e7f246: Status 404 returned error can't find the container with id 48b2f0d49c9327d7e760a9165a72c04724e6aae7356e4db4591495a212e7f246 Oct 08 21:00:22 crc kubenswrapper[4669]: I1008 21:00:22.140689 4669 scope.go:117] 
"RemoveContainer" containerID="e74fe67e0ca563639016760f372e79764f8fc135c9f412652bee412e1d2bed0d" Oct 08 21:00:22 crc kubenswrapper[4669]: I1008 21:00:22.207214 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-z2w59"] Oct 08 21:00:22 crc kubenswrapper[4669]: I1008 21:00:22.230501 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-shjxs"] Oct 08 21:00:22 crc kubenswrapper[4669]: I1008 21:00:22.237194 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-shjxs"] Oct 08 21:00:23 crc kubenswrapper[4669]: I1008 21:00:23.133646 4669 generic.go:334] "Generic (PLEG): container finished" podID="bd19aea5-8b8a-4550-8f64-2e3780f7525e" containerID="fa86118dfbbefa110fa99cfc57c08478773cb2edc4caf210613dda42aaf58e99" exitCode=0 Oct 08 21:00:23 crc kubenswrapper[4669]: I1008 21:00:23.133970 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2mvd2-config-s8sv9" event={"ID":"bd19aea5-8b8a-4550-8f64-2e3780f7525e","Type":"ContainerDied","Data":"fa86118dfbbefa110fa99cfc57c08478773cb2edc4caf210613dda42aaf58e99"} Oct 08 21:00:23 crc kubenswrapper[4669]: I1008 21:00:23.135657 4669 generic.go:334] "Generic (PLEG): container finished" podID="a3fe45b8-92b8-47ff-9f71-c908c64e2866" containerID="8c6c92a3641e309128c39a1aac34c91c687acb49b952df26c073d7102d6ba9bd" exitCode=0 Oct 08 21:00:23 crc kubenswrapper[4669]: I1008 21:00:23.135706 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-lkvvk" event={"ID":"a3fe45b8-92b8-47ff-9f71-c908c64e2866","Type":"ContainerDied","Data":"8c6c92a3641e309128c39a1aac34c91c687acb49b952df26c073d7102d6ba9bd"} Oct 08 21:00:23 crc kubenswrapper[4669]: I1008 21:00:23.135723 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-lkvvk" 
event={"ID":"a3fe45b8-92b8-47ff-9f71-c908c64e2866","Type":"ContainerStarted","Data":"48b2f0d49c9327d7e760a9165a72c04724e6aae7356e4db4591495a212e7f246"} Oct 08 21:00:23 crc kubenswrapper[4669]: I1008 21:00:23.137167 4669 generic.go:334] "Generic (PLEG): container finished" podID="ffe90a4e-1ec4-4628-bada-7e2d0eedb4d7" containerID="9e6c17bf14bc007fd2987de2fdc1e6186e2260d168e71924c73c17f12f32d544" exitCode=0 Oct 08 21:00:23 crc kubenswrapper[4669]: I1008 21:00:23.137218 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-gt2lp" event={"ID":"ffe90a4e-1ec4-4628-bada-7e2d0eedb4d7","Type":"ContainerDied","Data":"9e6c17bf14bc007fd2987de2fdc1e6186e2260d168e71924c73c17f12f32d544"} Oct 08 21:00:23 crc kubenswrapper[4669]: I1008 21:00:23.139170 4669 generic.go:334] "Generic (PLEG): container finished" podID="f2134180-0294-4e53-bc38-5d062b5585ff" containerID="30a3abfd5c38b5dd4c54dcb1e009858c35074d0e4919ba69b86e4f1215737f9b" exitCode=0 Oct 08 21:00:23 crc kubenswrapper[4669]: I1008 21:00:23.139217 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-r5bs4" event={"ID":"f2134180-0294-4e53-bc38-5d062b5585ff","Type":"ContainerDied","Data":"30a3abfd5c38b5dd4c54dcb1e009858c35074d0e4919ba69b86e4f1215737f9b"} Oct 08 21:00:23 crc kubenswrapper[4669]: I1008 21:00:23.139232 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-r5bs4" event={"ID":"f2134180-0294-4e53-bc38-5d062b5585ff","Type":"ContainerStarted","Data":"c5c59e5962b972b9d175e4c9fd44a2c12ed407aa5b0449540645857a43afbd42"} Oct 08 21:00:23 crc kubenswrapper[4669]: I1008 21:00:23.140265 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-z2w59" event={"ID":"057267cc-ede4-488e-94a2-37caa8cb9557","Type":"ContainerStarted","Data":"d79a6fade9b23c59dec1e908ca5e77af2a685cebf51febfb105afdfd95a1413a"} Oct 08 21:00:23 crc kubenswrapper[4669]: I1008 21:00:23.142872 4669 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/glance-db-sync-8b8fw" event={"ID":"9cbf7e56-6c38-4ee3-8096-875162b3576f","Type":"ContainerStarted","Data":"f045916f1a7ed045a3ae239aaf04b76d8a44aa518da56654afc20b311f704b4e"} Oct 08 21:00:23 crc kubenswrapper[4669]: I1008 21:00:23.186661 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-8b8fw" podStartSLOduration=2.9316397910000003 podStartE2EDuration="16.186641504s" podCreationTimestamp="2025-10-08 21:00:07 +0000 UTC" firstStartedPulling="2025-10-08 21:00:08.387366091 +0000 UTC m=+928.080176764" lastFinishedPulling="2025-10-08 21:00:21.642367814 +0000 UTC m=+941.335178477" observedRunningTime="2025-10-08 21:00:23.184990718 +0000 UTC m=+942.877801401" watchObservedRunningTime="2025-10-08 21:00:23.186641504 +0000 UTC m=+942.879452197" Oct 08 21:00:23 crc kubenswrapper[4669]: I1008 21:00:23.351687 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c11ff661-d7fa-45fa-bec1-555472ca36e7" path="/var/lib/kubelet/pods/c11ff661-d7fa-45fa-bec1-555472ca36e7/volumes" Oct 08 21:00:26 crc kubenswrapper[4669]: I1008 21:00:26.365422 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-gt2lp" Oct 08 21:00:26 crc kubenswrapper[4669]: I1008 21:00:26.372232 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-lkvvk" Oct 08 21:00:26 crc kubenswrapper[4669]: I1008 21:00:26.390995 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-2mvd2-config-s8sv9" Oct 08 21:00:26 crc kubenswrapper[4669]: I1008 21:00:26.421253 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-r5bs4" Oct 08 21:00:26 crc kubenswrapper[4669]: I1008 21:00:26.521276 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnqb7\" (UniqueName: \"kubernetes.io/projected/ffe90a4e-1ec4-4628-bada-7e2d0eedb4d7-kube-api-access-dnqb7\") pod \"ffe90a4e-1ec4-4628-bada-7e2d0eedb4d7\" (UID: \"ffe90a4e-1ec4-4628-bada-7e2d0eedb4d7\") " Oct 08 21:00:26 crc kubenswrapper[4669]: I1008 21:00:26.521738 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/bd19aea5-8b8a-4550-8f64-2e3780f7525e-var-log-ovn\") pod \"bd19aea5-8b8a-4550-8f64-2e3780f7525e\" (UID: \"bd19aea5-8b8a-4550-8f64-2e3780f7525e\") " Oct 08 21:00:26 crc kubenswrapper[4669]: I1008 21:00:26.521815 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd19aea5-8b8a-4550-8f64-2e3780f7525e-scripts\") pod \"bd19aea5-8b8a-4550-8f64-2e3780f7525e\" (UID: \"bd19aea5-8b8a-4550-8f64-2e3780f7525e\") " Oct 08 21:00:26 crc kubenswrapper[4669]: I1008 21:00:26.521881 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfmm2\" (UniqueName: \"kubernetes.io/projected/bd19aea5-8b8a-4550-8f64-2e3780f7525e-kube-api-access-mfmm2\") pod \"bd19aea5-8b8a-4550-8f64-2e3780f7525e\" (UID: \"bd19aea5-8b8a-4550-8f64-2e3780f7525e\") " Oct 08 21:00:26 crc kubenswrapper[4669]: I1008 21:00:26.521984 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bd19aea5-8b8a-4550-8f64-2e3780f7525e-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "bd19aea5-8b8a-4550-8f64-2e3780f7525e" (UID: "bd19aea5-8b8a-4550-8f64-2e3780f7525e"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 21:00:26 crc kubenswrapper[4669]: I1008 21:00:26.522019 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/bd19aea5-8b8a-4550-8f64-2e3780f7525e-var-run-ovn\") pod \"bd19aea5-8b8a-4550-8f64-2e3780f7525e\" (UID: \"bd19aea5-8b8a-4550-8f64-2e3780f7525e\") " Oct 08 21:00:26 crc kubenswrapper[4669]: I1008 21:00:26.522054 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bd19aea5-8b8a-4550-8f64-2e3780f7525e-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "bd19aea5-8b8a-4550-8f64-2e3780f7525e" (UID: "bd19aea5-8b8a-4550-8f64-2e3780f7525e"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 21:00:26 crc kubenswrapper[4669]: I1008 21:00:26.522069 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/bd19aea5-8b8a-4550-8f64-2e3780f7525e-var-run\") pod \"bd19aea5-8b8a-4550-8f64-2e3780f7525e\" (UID: \"bd19aea5-8b8a-4550-8f64-2e3780f7525e\") " Oct 08 21:00:26 crc kubenswrapper[4669]: I1008 21:00:26.522120 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8n2pl\" (UniqueName: \"kubernetes.io/projected/f2134180-0294-4e53-bc38-5d062b5585ff-kube-api-access-8n2pl\") pod \"f2134180-0294-4e53-bc38-5d062b5585ff\" (UID: \"f2134180-0294-4e53-bc38-5d062b5585ff\") " Oct 08 21:00:26 crc kubenswrapper[4669]: I1008 21:00:26.522241 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bd19aea5-8b8a-4550-8f64-2e3780f7525e-var-run" (OuterVolumeSpecName: "var-run") pod "bd19aea5-8b8a-4550-8f64-2e3780f7525e" (UID: "bd19aea5-8b8a-4550-8f64-2e3780f7525e"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 21:00:26 crc kubenswrapper[4669]: I1008 21:00:26.522380 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/bd19aea5-8b8a-4550-8f64-2e3780f7525e-additional-scripts\") pod \"bd19aea5-8b8a-4550-8f64-2e3780f7525e\" (UID: \"bd19aea5-8b8a-4550-8f64-2e3780f7525e\") " Oct 08 21:00:26 crc kubenswrapper[4669]: I1008 21:00:26.522463 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fplgm\" (UniqueName: \"kubernetes.io/projected/a3fe45b8-92b8-47ff-9f71-c908c64e2866-kube-api-access-fplgm\") pod \"a3fe45b8-92b8-47ff-9f71-c908c64e2866\" (UID: \"a3fe45b8-92b8-47ff-9f71-c908c64e2866\") " Oct 08 21:00:26 crc kubenswrapper[4669]: I1008 21:00:26.523183 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd19aea5-8b8a-4550-8f64-2e3780f7525e-scripts" (OuterVolumeSpecName: "scripts") pod "bd19aea5-8b8a-4550-8f64-2e3780f7525e" (UID: "bd19aea5-8b8a-4550-8f64-2e3780f7525e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:00:26 crc kubenswrapper[4669]: I1008 21:00:26.523716 4669 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/bd19aea5-8b8a-4550-8f64-2e3780f7525e-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:26 crc kubenswrapper[4669]: I1008 21:00:26.523788 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd19aea5-8b8a-4550-8f64-2e3780f7525e-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:26 crc kubenswrapper[4669]: I1008 21:00:26.523803 4669 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/bd19aea5-8b8a-4550-8f64-2e3780f7525e-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:26 crc kubenswrapper[4669]: I1008 21:00:26.523815 4669 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/bd19aea5-8b8a-4550-8f64-2e3780f7525e-var-run\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:26 crc kubenswrapper[4669]: I1008 21:00:26.524696 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd19aea5-8b8a-4550-8f64-2e3780f7525e-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "bd19aea5-8b8a-4550-8f64-2e3780f7525e" (UID: "bd19aea5-8b8a-4550-8f64-2e3780f7525e"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:00:26 crc kubenswrapper[4669]: I1008 21:00:26.525347 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffe90a4e-1ec4-4628-bada-7e2d0eedb4d7-kube-api-access-dnqb7" (OuterVolumeSpecName: "kube-api-access-dnqb7") pod "ffe90a4e-1ec4-4628-bada-7e2d0eedb4d7" (UID: "ffe90a4e-1ec4-4628-bada-7e2d0eedb4d7"). InnerVolumeSpecName "kube-api-access-dnqb7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:00:26 crc kubenswrapper[4669]: I1008 21:00:26.526766 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2134180-0294-4e53-bc38-5d062b5585ff-kube-api-access-8n2pl" (OuterVolumeSpecName: "kube-api-access-8n2pl") pod "f2134180-0294-4e53-bc38-5d062b5585ff" (UID: "f2134180-0294-4e53-bc38-5d062b5585ff"). InnerVolumeSpecName "kube-api-access-8n2pl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:00:26 crc kubenswrapper[4669]: I1008 21:00:26.526986 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3fe45b8-92b8-47ff-9f71-c908c64e2866-kube-api-access-fplgm" (OuterVolumeSpecName: "kube-api-access-fplgm") pod "a3fe45b8-92b8-47ff-9f71-c908c64e2866" (UID: "a3fe45b8-92b8-47ff-9f71-c908c64e2866"). InnerVolumeSpecName "kube-api-access-fplgm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:00:26 crc kubenswrapper[4669]: I1008 21:00:26.554753 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd19aea5-8b8a-4550-8f64-2e3780f7525e-kube-api-access-mfmm2" (OuterVolumeSpecName: "kube-api-access-mfmm2") pod "bd19aea5-8b8a-4550-8f64-2e3780f7525e" (UID: "bd19aea5-8b8a-4550-8f64-2e3780f7525e"). InnerVolumeSpecName "kube-api-access-mfmm2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:00:26 crc kubenswrapper[4669]: I1008 21:00:26.625413 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8n2pl\" (UniqueName: \"kubernetes.io/projected/f2134180-0294-4e53-bc38-5d062b5585ff-kube-api-access-8n2pl\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:26 crc kubenswrapper[4669]: I1008 21:00:26.625449 4669 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/bd19aea5-8b8a-4550-8f64-2e3780f7525e-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:26 crc kubenswrapper[4669]: I1008 21:00:26.625463 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fplgm\" (UniqueName: \"kubernetes.io/projected/a3fe45b8-92b8-47ff-9f71-c908c64e2866-kube-api-access-fplgm\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:26 crc kubenswrapper[4669]: I1008 21:00:26.625476 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnqb7\" (UniqueName: \"kubernetes.io/projected/ffe90a4e-1ec4-4628-bada-7e2d0eedb4d7-kube-api-access-dnqb7\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:26 crc kubenswrapper[4669]: I1008 21:00:26.625488 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfmm2\" (UniqueName: \"kubernetes.io/projected/bd19aea5-8b8a-4550-8f64-2e3780f7525e-kube-api-access-mfmm2\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:27 crc kubenswrapper[4669]: I1008 21:00:27.189069 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-gt2lp" Oct 08 21:00:27 crc kubenswrapper[4669]: I1008 21:00:27.189148 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-gt2lp" event={"ID":"ffe90a4e-1ec4-4628-bada-7e2d0eedb4d7","Type":"ContainerDied","Data":"bc4c1b6117e10bb9626113ca655aaba3f6f953950213a71409db1be1f91cc7eb"} Oct 08 21:00:27 crc kubenswrapper[4669]: I1008 21:00:27.189198 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc4c1b6117e10bb9626113ca655aaba3f6f953950213a71409db1be1f91cc7eb" Oct 08 21:00:27 crc kubenswrapper[4669]: I1008 21:00:27.191510 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2mvd2-config-s8sv9" event={"ID":"bd19aea5-8b8a-4550-8f64-2e3780f7525e","Type":"ContainerDied","Data":"b41cfd06fc12071df8a64a97f54ce28b464b62cb185e61b55dfba92adab09204"} Oct 08 21:00:27 crc kubenswrapper[4669]: I1008 21:00:27.191587 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b41cfd06fc12071df8a64a97f54ce28b464b62cb185e61b55dfba92adab09204" Oct 08 21:00:27 crc kubenswrapper[4669]: I1008 21:00:27.191650 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-2mvd2-config-s8sv9" Oct 08 21:00:27 crc kubenswrapper[4669]: I1008 21:00:27.209958 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-lkvvk" event={"ID":"a3fe45b8-92b8-47ff-9f71-c908c64e2866","Type":"ContainerDied","Data":"48b2f0d49c9327d7e760a9165a72c04724e6aae7356e4db4591495a212e7f246"} Oct 08 21:00:27 crc kubenswrapper[4669]: I1008 21:00:27.209986 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-lkvvk" Oct 08 21:00:27 crc kubenswrapper[4669]: I1008 21:00:27.210001 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48b2f0d49c9327d7e760a9165a72c04724e6aae7356e4db4591495a212e7f246" Oct 08 21:00:27 crc kubenswrapper[4669]: I1008 21:00:27.212919 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-r5bs4" Oct 08 21:00:27 crc kubenswrapper[4669]: I1008 21:00:27.212945 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-r5bs4" event={"ID":"f2134180-0294-4e53-bc38-5d062b5585ff","Type":"ContainerDied","Data":"c5c59e5962b972b9d175e4c9fd44a2c12ed407aa5b0449540645857a43afbd42"} Oct 08 21:00:27 crc kubenswrapper[4669]: I1008 21:00:27.212988 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5c59e5962b972b9d175e4c9fd44a2c12ed407aa5b0449540645857a43afbd42" Oct 08 21:00:27 crc kubenswrapper[4669]: I1008 21:00:27.215735 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-z2w59" event={"ID":"057267cc-ede4-488e-94a2-37caa8cb9557","Type":"ContainerStarted","Data":"1a76b531b8d72c5bae6441b61d77740d192e448d51b9c3ba40a96dbec233e0ed"} Oct 08 21:00:27 crc kubenswrapper[4669]: I1008 21:00:27.406972 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-z2w59" podStartSLOduration=5.373915756 podStartE2EDuration="9.406950263s" podCreationTimestamp="2025-10-08 21:00:18 +0000 UTC" firstStartedPulling="2025-10-08 21:00:22.217796966 +0000 UTC m=+941.910607649" lastFinishedPulling="2025-10-08 21:00:26.250831473 +0000 UTC m=+945.943642156" observedRunningTime="2025-10-08 21:00:27.242864549 +0000 UTC m=+946.935675222" watchObservedRunningTime="2025-10-08 21:00:27.406950263 +0000 UTC m=+947.099760936" Oct 08 21:00:27 crc kubenswrapper[4669]: I1008 21:00:27.480389 4669 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-2mvd2-config-s8sv9"] Oct 08 21:00:27 crc kubenswrapper[4669]: I1008 21:00:27.489558 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-2mvd2-config-s8sv9"] Oct 08 21:00:28 crc kubenswrapper[4669]: I1008 21:00:28.224113 4669 generic.go:334] "Generic (PLEG): container finished" podID="9cbf7e56-6c38-4ee3-8096-875162b3576f" containerID="f045916f1a7ed045a3ae239aaf04b76d8a44aa518da56654afc20b311f704b4e" exitCode=0 Oct 08 21:00:28 crc kubenswrapper[4669]: I1008 21:00:28.225009 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-8b8fw" event={"ID":"9cbf7e56-6c38-4ee3-8096-875162b3576f","Type":"ContainerDied","Data":"f045916f1a7ed045a3ae239aaf04b76d8a44aa518da56654afc20b311f704b4e"} Oct 08 21:00:29 crc kubenswrapper[4669]: I1008 21:00:29.233164 4669 generic.go:334] "Generic (PLEG): container finished" podID="057267cc-ede4-488e-94a2-37caa8cb9557" containerID="1a76b531b8d72c5bae6441b61d77740d192e448d51b9c3ba40a96dbec233e0ed" exitCode=0 Oct 08 21:00:29 crc kubenswrapper[4669]: I1008 21:00:29.233217 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-z2w59" event={"ID":"057267cc-ede4-488e-94a2-37caa8cb9557","Type":"ContainerDied","Data":"1a76b531b8d72c5bae6441b61d77740d192e448d51b9c3ba40a96dbec233e0ed"} Oct 08 21:00:29 crc kubenswrapper[4669]: I1008 21:00:29.347846 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd19aea5-8b8a-4550-8f64-2e3780f7525e" path="/var/lib/kubelet/pods/bd19aea5-8b8a-4550-8f64-2e3780f7525e/volumes" Oct 08 21:00:29 crc kubenswrapper[4669]: I1008 21:00:29.666545 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-8b8fw" Oct 08 21:00:29 crc kubenswrapper[4669]: I1008 21:00:29.777954 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bhf7\" (UniqueName: \"kubernetes.io/projected/9cbf7e56-6c38-4ee3-8096-875162b3576f-kube-api-access-6bhf7\") pod \"9cbf7e56-6c38-4ee3-8096-875162b3576f\" (UID: \"9cbf7e56-6c38-4ee3-8096-875162b3576f\") " Oct 08 21:00:29 crc kubenswrapper[4669]: I1008 21:00:29.778056 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cbf7e56-6c38-4ee3-8096-875162b3576f-config-data\") pod \"9cbf7e56-6c38-4ee3-8096-875162b3576f\" (UID: \"9cbf7e56-6c38-4ee3-8096-875162b3576f\") " Oct 08 21:00:29 crc kubenswrapper[4669]: I1008 21:00:29.778101 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9cbf7e56-6c38-4ee3-8096-875162b3576f-db-sync-config-data\") pod \"9cbf7e56-6c38-4ee3-8096-875162b3576f\" (UID: \"9cbf7e56-6c38-4ee3-8096-875162b3576f\") " Oct 08 21:00:29 crc kubenswrapper[4669]: I1008 21:00:29.778137 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cbf7e56-6c38-4ee3-8096-875162b3576f-combined-ca-bundle\") pod \"9cbf7e56-6c38-4ee3-8096-875162b3576f\" (UID: \"9cbf7e56-6c38-4ee3-8096-875162b3576f\") " Oct 08 21:00:29 crc kubenswrapper[4669]: I1008 21:00:29.784298 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cbf7e56-6c38-4ee3-8096-875162b3576f-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "9cbf7e56-6c38-4ee3-8096-875162b3576f" (UID: "9cbf7e56-6c38-4ee3-8096-875162b3576f"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:00:29 crc kubenswrapper[4669]: I1008 21:00:29.784412 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cbf7e56-6c38-4ee3-8096-875162b3576f-kube-api-access-6bhf7" (OuterVolumeSpecName: "kube-api-access-6bhf7") pod "9cbf7e56-6c38-4ee3-8096-875162b3576f" (UID: "9cbf7e56-6c38-4ee3-8096-875162b3576f"). InnerVolumeSpecName "kube-api-access-6bhf7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:00:29 crc kubenswrapper[4669]: I1008 21:00:29.803024 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cbf7e56-6c38-4ee3-8096-875162b3576f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9cbf7e56-6c38-4ee3-8096-875162b3576f" (UID: "9cbf7e56-6c38-4ee3-8096-875162b3576f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:00:29 crc kubenswrapper[4669]: I1008 21:00:29.826463 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cbf7e56-6c38-4ee3-8096-875162b3576f-config-data" (OuterVolumeSpecName: "config-data") pod "9cbf7e56-6c38-4ee3-8096-875162b3576f" (UID: "9cbf7e56-6c38-4ee3-8096-875162b3576f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:00:29 crc kubenswrapper[4669]: I1008 21:00:29.881254 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cbf7e56-6c38-4ee3-8096-875162b3576f-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:29 crc kubenswrapper[4669]: I1008 21:00:29.881290 4669 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9cbf7e56-6c38-4ee3-8096-875162b3576f-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:29 crc kubenswrapper[4669]: I1008 21:00:29.881299 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cbf7e56-6c38-4ee3-8096-875162b3576f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:29 crc kubenswrapper[4669]: I1008 21:00:29.881309 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bhf7\" (UniqueName: \"kubernetes.io/projected/9cbf7e56-6c38-4ee3-8096-875162b3576f-kube-api-access-6bhf7\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:30 crc kubenswrapper[4669]: I1008 21:00:30.241087 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-8b8fw" Oct 08 21:00:30 crc kubenswrapper[4669]: I1008 21:00:30.244618 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-8b8fw" event={"ID":"9cbf7e56-6c38-4ee3-8096-875162b3576f","Type":"ContainerDied","Data":"b34632dc7a1639cc31e644409617f45bc30c084aca4ce7cfd8e03f3f71494d7d"} Oct 08 21:00:30 crc kubenswrapper[4669]: I1008 21:00:30.244679 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b34632dc7a1639cc31e644409617f45bc30c084aca4ce7cfd8e03f3f71494d7d" Oct 08 21:00:30 crc kubenswrapper[4669]: I1008 21:00:30.495065 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-z2w59" Oct 08 21:00:30 crc kubenswrapper[4669]: I1008 21:00:30.593185 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rq49\" (UniqueName: \"kubernetes.io/projected/057267cc-ede4-488e-94a2-37caa8cb9557-kube-api-access-2rq49\") pod \"057267cc-ede4-488e-94a2-37caa8cb9557\" (UID: \"057267cc-ede4-488e-94a2-37caa8cb9557\") " Oct 08 21:00:30 crc kubenswrapper[4669]: I1008 21:00:30.593352 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/057267cc-ede4-488e-94a2-37caa8cb9557-config-data\") pod \"057267cc-ede4-488e-94a2-37caa8cb9557\" (UID: \"057267cc-ede4-488e-94a2-37caa8cb9557\") " Oct 08 21:00:30 crc kubenswrapper[4669]: I1008 21:00:30.593411 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/057267cc-ede4-488e-94a2-37caa8cb9557-combined-ca-bundle\") pod \"057267cc-ede4-488e-94a2-37caa8cb9557\" (UID: \"057267cc-ede4-488e-94a2-37caa8cb9557\") " Oct 08 21:00:30 crc kubenswrapper[4669]: I1008 21:00:30.600424 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/057267cc-ede4-488e-94a2-37caa8cb9557-kube-api-access-2rq49" (OuterVolumeSpecName: "kube-api-access-2rq49") pod "057267cc-ede4-488e-94a2-37caa8cb9557" (UID: "057267cc-ede4-488e-94a2-37caa8cb9557"). InnerVolumeSpecName "kube-api-access-2rq49". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:00:30 crc kubenswrapper[4669]: E1008 21:00:30.663045 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/057267cc-ede4-488e-94a2-37caa8cb9557-config-data podName:057267cc-ede4-488e-94a2-37caa8cb9557 nodeName:}" failed. No retries permitted until 2025-10-08 21:00:31.163012811 +0000 UTC m=+950.855823484 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "config-data" (UniqueName: "kubernetes.io/secret/057267cc-ede4-488e-94a2-37caa8cb9557-config-data") pod "057267cc-ede4-488e-94a2-37caa8cb9557" (UID: "057267cc-ede4-488e-94a2-37caa8cb9557") : error deleting /var/lib/kubelet/pods/057267cc-ede4-488e-94a2-37caa8cb9557/volume-subpaths: remove /var/lib/kubelet/pods/057267cc-ede4-488e-94a2-37caa8cb9557/volume-subpaths: no such file or directory Oct 08 21:00:30 crc kubenswrapper[4669]: I1008 21:00:30.669804 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-fzvxk"] Oct 08 21:00:30 crc kubenswrapper[4669]: E1008 21:00:30.670152 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd19aea5-8b8a-4550-8f64-2e3780f7525e" containerName="ovn-config" Oct 08 21:00:30 crc kubenswrapper[4669]: I1008 21:00:30.670171 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd19aea5-8b8a-4550-8f64-2e3780f7525e" containerName="ovn-config" Oct 08 21:00:30 crc kubenswrapper[4669]: E1008 21:00:30.670187 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2134180-0294-4e53-bc38-5d062b5585ff" containerName="mariadb-database-create" Oct 08 21:00:30 crc kubenswrapper[4669]: I1008 21:00:30.670195 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2134180-0294-4e53-bc38-5d062b5585ff" containerName="mariadb-database-create" Oct 08 21:00:30 crc kubenswrapper[4669]: E1008 21:00:30.670212 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c11ff661-d7fa-45fa-bec1-555472ca36e7" containerName="init" Oct 08 21:00:30 crc kubenswrapper[4669]: I1008 21:00:30.670220 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="c11ff661-d7fa-45fa-bec1-555472ca36e7" containerName="init" Oct 08 21:00:30 crc kubenswrapper[4669]: E1008 21:00:30.670232 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c11ff661-d7fa-45fa-bec1-555472ca36e7" containerName="dnsmasq-dns" Oct 08 21:00:30 crc 
kubenswrapper[4669]: I1008 21:00:30.670240 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="c11ff661-d7fa-45fa-bec1-555472ca36e7" containerName="dnsmasq-dns" Oct 08 21:00:30 crc kubenswrapper[4669]: E1008 21:00:30.670247 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3fe45b8-92b8-47ff-9f71-c908c64e2866" containerName="mariadb-database-create" Oct 08 21:00:30 crc kubenswrapper[4669]: I1008 21:00:30.670253 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3fe45b8-92b8-47ff-9f71-c908c64e2866" containerName="mariadb-database-create" Oct 08 21:00:30 crc kubenswrapper[4669]: E1008 21:00:30.670262 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cbf7e56-6c38-4ee3-8096-875162b3576f" containerName="glance-db-sync" Oct 08 21:00:30 crc kubenswrapper[4669]: I1008 21:00:30.670267 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cbf7e56-6c38-4ee3-8096-875162b3576f" containerName="glance-db-sync" Oct 08 21:00:30 crc kubenswrapper[4669]: E1008 21:00:30.670278 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="057267cc-ede4-488e-94a2-37caa8cb9557" containerName="keystone-db-sync" Oct 08 21:00:30 crc kubenswrapper[4669]: I1008 21:00:30.670283 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="057267cc-ede4-488e-94a2-37caa8cb9557" containerName="keystone-db-sync" Oct 08 21:00:30 crc kubenswrapper[4669]: E1008 21:00:30.670293 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffe90a4e-1ec4-4628-bada-7e2d0eedb4d7" containerName="mariadb-database-create" Oct 08 21:00:30 crc kubenswrapper[4669]: I1008 21:00:30.670299 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffe90a4e-1ec4-4628-bada-7e2d0eedb4d7" containerName="mariadb-database-create" Oct 08 21:00:30 crc kubenswrapper[4669]: I1008 21:00:30.670436 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2134180-0294-4e53-bc38-5d062b5585ff" containerName="mariadb-database-create" 
Oct 08 21:00:30 crc kubenswrapper[4669]: I1008 21:00:30.670449 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cbf7e56-6c38-4ee3-8096-875162b3576f" containerName="glance-db-sync" Oct 08 21:00:30 crc kubenswrapper[4669]: I1008 21:00:30.670465 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3fe45b8-92b8-47ff-9f71-c908c64e2866" containerName="mariadb-database-create" Oct 08 21:00:30 crc kubenswrapper[4669]: I1008 21:00:30.670473 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="057267cc-ede4-488e-94a2-37caa8cb9557" containerName="keystone-db-sync" Oct 08 21:00:30 crc kubenswrapper[4669]: I1008 21:00:30.670481 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="c11ff661-d7fa-45fa-bec1-555472ca36e7" containerName="dnsmasq-dns" Oct 08 21:00:30 crc kubenswrapper[4669]: I1008 21:00:30.670494 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd19aea5-8b8a-4550-8f64-2e3780f7525e" containerName="ovn-config" Oct 08 21:00:30 crc kubenswrapper[4669]: I1008 21:00:30.670505 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffe90a4e-1ec4-4628-bada-7e2d0eedb4d7" containerName="mariadb-database-create" Oct 08 21:00:30 crc kubenswrapper[4669]: I1008 21:00:30.671321 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-fzvxk" Oct 08 21:00:30 crc kubenswrapper[4669]: I1008 21:00:30.671674 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/057267cc-ede4-488e-94a2-37caa8cb9557-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "057267cc-ede4-488e-94a2-37caa8cb9557" (UID: "057267cc-ede4-488e-94a2-37caa8cb9557"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:00:30 crc kubenswrapper[4669]: I1008 21:00:30.688919 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-fzvxk"] Oct 08 21:00:30 crc kubenswrapper[4669]: I1008 21:00:30.695468 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/057267cc-ede4-488e-94a2-37caa8cb9557-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:30 crc kubenswrapper[4669]: I1008 21:00:30.695509 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rq49\" (UniqueName: \"kubernetes.io/projected/057267cc-ede4-488e-94a2-37caa8cb9557-kube-api-access-2rq49\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:30 crc kubenswrapper[4669]: I1008 21:00:30.797146 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb75f170-ec61-4ea8-81c4-eb53321ee58f-config\") pod \"dnsmasq-dns-895cf5cf-fzvxk\" (UID: \"eb75f170-ec61-4ea8-81c4-eb53321ee58f\") " pod="openstack/dnsmasq-dns-895cf5cf-fzvxk" Oct 08 21:00:30 crc kubenswrapper[4669]: I1008 21:00:30.797180 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb75f170-ec61-4ea8-81c4-eb53321ee58f-ovsdbserver-sb\") pod \"dnsmasq-dns-895cf5cf-fzvxk\" (UID: \"eb75f170-ec61-4ea8-81c4-eb53321ee58f\") " pod="openstack/dnsmasq-dns-895cf5cf-fzvxk" Oct 08 21:00:30 crc kubenswrapper[4669]: I1008 21:00:30.797204 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eb75f170-ec61-4ea8-81c4-eb53321ee58f-dns-swift-storage-0\") pod \"dnsmasq-dns-895cf5cf-fzvxk\" (UID: \"eb75f170-ec61-4ea8-81c4-eb53321ee58f\") " pod="openstack/dnsmasq-dns-895cf5cf-fzvxk" Oct 08 21:00:30 crc 
kubenswrapper[4669]: I1008 21:00:30.797293 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb75f170-ec61-4ea8-81c4-eb53321ee58f-ovsdbserver-nb\") pod \"dnsmasq-dns-895cf5cf-fzvxk\" (UID: \"eb75f170-ec61-4ea8-81c4-eb53321ee58f\") " pod="openstack/dnsmasq-dns-895cf5cf-fzvxk" Oct 08 21:00:30 crc kubenswrapper[4669]: I1008 21:00:30.797333 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb75f170-ec61-4ea8-81c4-eb53321ee58f-dns-svc\") pod \"dnsmasq-dns-895cf5cf-fzvxk\" (UID: \"eb75f170-ec61-4ea8-81c4-eb53321ee58f\") " pod="openstack/dnsmasq-dns-895cf5cf-fzvxk" Oct 08 21:00:30 crc kubenswrapper[4669]: I1008 21:00:30.797368 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6nfl\" (UniqueName: \"kubernetes.io/projected/eb75f170-ec61-4ea8-81c4-eb53321ee58f-kube-api-access-d6nfl\") pod \"dnsmasq-dns-895cf5cf-fzvxk\" (UID: \"eb75f170-ec61-4ea8-81c4-eb53321ee58f\") " pod="openstack/dnsmasq-dns-895cf5cf-fzvxk" Oct 08 21:00:30 crc kubenswrapper[4669]: I1008 21:00:30.898929 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb75f170-ec61-4ea8-81c4-eb53321ee58f-config\") pod \"dnsmasq-dns-895cf5cf-fzvxk\" (UID: \"eb75f170-ec61-4ea8-81c4-eb53321ee58f\") " pod="openstack/dnsmasq-dns-895cf5cf-fzvxk" Oct 08 21:00:30 crc kubenswrapper[4669]: I1008 21:00:30.898984 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb75f170-ec61-4ea8-81c4-eb53321ee58f-ovsdbserver-sb\") pod \"dnsmasq-dns-895cf5cf-fzvxk\" (UID: \"eb75f170-ec61-4ea8-81c4-eb53321ee58f\") " pod="openstack/dnsmasq-dns-895cf5cf-fzvxk" Oct 08 21:00:30 crc kubenswrapper[4669]: I1008 
21:00:30.899020 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eb75f170-ec61-4ea8-81c4-eb53321ee58f-dns-swift-storage-0\") pod \"dnsmasq-dns-895cf5cf-fzvxk\" (UID: \"eb75f170-ec61-4ea8-81c4-eb53321ee58f\") " pod="openstack/dnsmasq-dns-895cf5cf-fzvxk" Oct 08 21:00:30 crc kubenswrapper[4669]: I1008 21:00:30.899252 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb75f170-ec61-4ea8-81c4-eb53321ee58f-ovsdbserver-nb\") pod \"dnsmasq-dns-895cf5cf-fzvxk\" (UID: \"eb75f170-ec61-4ea8-81c4-eb53321ee58f\") " pod="openstack/dnsmasq-dns-895cf5cf-fzvxk" Oct 08 21:00:30 crc kubenswrapper[4669]: I1008 21:00:30.899952 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb75f170-ec61-4ea8-81c4-eb53321ee58f-ovsdbserver-sb\") pod \"dnsmasq-dns-895cf5cf-fzvxk\" (UID: \"eb75f170-ec61-4ea8-81c4-eb53321ee58f\") " pod="openstack/dnsmasq-dns-895cf5cf-fzvxk" Oct 08 21:00:30 crc kubenswrapper[4669]: I1008 21:00:30.899966 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb75f170-ec61-4ea8-81c4-eb53321ee58f-config\") pod \"dnsmasq-dns-895cf5cf-fzvxk\" (UID: \"eb75f170-ec61-4ea8-81c4-eb53321ee58f\") " pod="openstack/dnsmasq-dns-895cf5cf-fzvxk" Oct 08 21:00:30 crc kubenswrapper[4669]: I1008 21:00:30.900281 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb75f170-ec61-4ea8-81c4-eb53321ee58f-ovsdbserver-nb\") pod \"dnsmasq-dns-895cf5cf-fzvxk\" (UID: \"eb75f170-ec61-4ea8-81c4-eb53321ee58f\") " pod="openstack/dnsmasq-dns-895cf5cf-fzvxk" Oct 08 21:00:30 crc kubenswrapper[4669]: I1008 21:00:30.900322 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eb75f170-ec61-4ea8-81c4-eb53321ee58f-dns-swift-storage-0\") pod \"dnsmasq-dns-895cf5cf-fzvxk\" (UID: \"eb75f170-ec61-4ea8-81c4-eb53321ee58f\") " pod="openstack/dnsmasq-dns-895cf5cf-fzvxk" Oct 08 21:00:30 crc kubenswrapper[4669]: I1008 21:00:30.900485 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb75f170-ec61-4ea8-81c4-eb53321ee58f-dns-svc\") pod \"dnsmasq-dns-895cf5cf-fzvxk\" (UID: \"eb75f170-ec61-4ea8-81c4-eb53321ee58f\") " pod="openstack/dnsmasq-dns-895cf5cf-fzvxk" Oct 08 21:00:30 crc kubenswrapper[4669]: I1008 21:00:30.901268 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb75f170-ec61-4ea8-81c4-eb53321ee58f-dns-svc\") pod \"dnsmasq-dns-895cf5cf-fzvxk\" (UID: \"eb75f170-ec61-4ea8-81c4-eb53321ee58f\") " pod="openstack/dnsmasq-dns-895cf5cf-fzvxk" Oct 08 21:00:30 crc kubenswrapper[4669]: I1008 21:00:30.901351 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6nfl\" (UniqueName: \"kubernetes.io/projected/eb75f170-ec61-4ea8-81c4-eb53321ee58f-kube-api-access-d6nfl\") pod \"dnsmasq-dns-895cf5cf-fzvxk\" (UID: \"eb75f170-ec61-4ea8-81c4-eb53321ee58f\") " pod="openstack/dnsmasq-dns-895cf5cf-fzvxk" Oct 08 21:00:30 crc kubenswrapper[4669]: I1008 21:00:30.918645 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6nfl\" (UniqueName: \"kubernetes.io/projected/eb75f170-ec61-4ea8-81c4-eb53321ee58f-kube-api-access-d6nfl\") pod \"dnsmasq-dns-895cf5cf-fzvxk\" (UID: \"eb75f170-ec61-4ea8-81c4-eb53321ee58f\") " pod="openstack/dnsmasq-dns-895cf5cf-fzvxk" Oct 08 21:00:31 crc kubenswrapper[4669]: I1008 21:00:31.031426 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-fzvxk" Oct 08 21:00:31 crc kubenswrapper[4669]: I1008 21:00:31.204409 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/057267cc-ede4-488e-94a2-37caa8cb9557-config-data\") pod \"057267cc-ede4-488e-94a2-37caa8cb9557\" (UID: \"057267cc-ede4-488e-94a2-37caa8cb9557\") " Oct 08 21:00:31 crc kubenswrapper[4669]: I1008 21:00:31.228029 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/057267cc-ede4-488e-94a2-37caa8cb9557-config-data" (OuterVolumeSpecName: "config-data") pod "057267cc-ede4-488e-94a2-37caa8cb9557" (UID: "057267cc-ede4-488e-94a2-37caa8cb9557"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:00:31 crc kubenswrapper[4669]: I1008 21:00:31.259680 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-z2w59" event={"ID":"057267cc-ede4-488e-94a2-37caa8cb9557","Type":"ContainerDied","Data":"d79a6fade9b23c59dec1e908ca5e77af2a685cebf51febfb105afdfd95a1413a"} Oct 08 21:00:31 crc kubenswrapper[4669]: I1008 21:00:31.259731 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d79a6fade9b23c59dec1e908ca5e77af2a685cebf51febfb105afdfd95a1413a" Oct 08 21:00:31 crc kubenswrapper[4669]: I1008 21:00:31.259808 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-z2w59" Oct 08 21:00:31 crc kubenswrapper[4669]: I1008 21:00:31.306358 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/057267cc-ede4-488e-94a2-37caa8cb9557-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:31 crc kubenswrapper[4669]: I1008 21:00:31.522922 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-fzvxk"] Oct 08 21:00:31 crc kubenswrapper[4669]: W1008 21:00:31.526329 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb75f170_ec61_4ea8_81c4_eb53321ee58f.slice/crio-3f2373ffcdf9fb7ba934e2a7283c72c656395baabc29085c04f6f0d634abbdaa WatchSource:0}: Error finding container 3f2373ffcdf9fb7ba934e2a7283c72c656395baabc29085c04f6f0d634abbdaa: Status 404 returned error can't find the container with id 3f2373ffcdf9fb7ba934e2a7283c72c656395baabc29085c04f6f0d634abbdaa Oct 08 21:00:31 crc kubenswrapper[4669]: I1008 21:00:31.739501 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-rxss2"] Oct 08 21:00:31 crc kubenswrapper[4669]: I1008 21:00:31.742215 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-rxss2" Oct 08 21:00:31 crc kubenswrapper[4669]: I1008 21:00:31.748895 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 08 21:00:31 crc kubenswrapper[4669]: I1008 21:00:31.749075 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 08 21:00:31 crc kubenswrapper[4669]: I1008 21:00:31.749236 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 08 21:00:31 crc kubenswrapper[4669]: I1008 21:00:31.749942 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-llg6k" Oct 08 21:00:31 crc kubenswrapper[4669]: I1008 21:00:31.771258 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rxss2"] Oct 08 21:00:31 crc kubenswrapper[4669]: I1008 21:00:31.781174 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-fzvxk"] Oct 08 21:00:31 crc kubenswrapper[4669]: I1008 21:00:31.833962 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-7n5jz"] Oct 08 21:00:31 crc kubenswrapper[4669]: I1008 21:00:31.836286 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-7n5jz" Oct 08 21:00:31 crc kubenswrapper[4669]: I1008 21:00:31.854328 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-7n5jz"] Oct 08 21:00:31 crc kubenswrapper[4669]: I1008 21:00:31.915016 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b94dc5a1-30ee-4ce3-807d-51f22cb32b5f-credential-keys\") pod \"keystone-bootstrap-rxss2\" (UID: \"b94dc5a1-30ee-4ce3-807d-51f22cb32b5f\") " pod="openstack/keystone-bootstrap-rxss2" Oct 08 21:00:31 crc kubenswrapper[4669]: I1008 21:00:31.915071 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b94dc5a1-30ee-4ce3-807d-51f22cb32b5f-scripts\") pod \"keystone-bootstrap-rxss2\" (UID: \"b94dc5a1-30ee-4ce3-807d-51f22cb32b5f\") " pod="openstack/keystone-bootstrap-rxss2" Oct 08 21:00:31 crc kubenswrapper[4669]: I1008 21:00:31.915127 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgwjt\" (UniqueName: \"kubernetes.io/projected/b94dc5a1-30ee-4ce3-807d-51f22cb32b5f-kube-api-access-bgwjt\") pod \"keystone-bootstrap-rxss2\" (UID: \"b94dc5a1-30ee-4ce3-807d-51f22cb32b5f\") " pod="openstack/keystone-bootstrap-rxss2" Oct 08 21:00:31 crc kubenswrapper[4669]: I1008 21:00:31.915154 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b94dc5a1-30ee-4ce3-807d-51f22cb32b5f-config-data\") pod \"keystone-bootstrap-rxss2\" (UID: \"b94dc5a1-30ee-4ce3-807d-51f22cb32b5f\") " pod="openstack/keystone-bootstrap-rxss2" Oct 08 21:00:31 crc kubenswrapper[4669]: I1008 21:00:31.915173 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b94dc5a1-30ee-4ce3-807d-51f22cb32b5f-combined-ca-bundle\") pod \"keystone-bootstrap-rxss2\" (UID: \"b94dc5a1-30ee-4ce3-807d-51f22cb32b5f\") " pod="openstack/keystone-bootstrap-rxss2" Oct 08 21:00:31 crc kubenswrapper[4669]: I1008 21:00:31.915195 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b94dc5a1-30ee-4ce3-807d-51f22cb32b5f-fernet-keys\") pod \"keystone-bootstrap-rxss2\" (UID: \"b94dc5a1-30ee-4ce3-807d-51f22cb32b5f\") " pod="openstack/keystone-bootstrap-rxss2" Oct 08 21:00:31 crc kubenswrapper[4669]: I1008 21:00:31.980615 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-574f4fd7c7-7zknn"] Oct 08 21:00:31 crc kubenswrapper[4669]: I1008 21:00:31.981900 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-574f4fd7c7-7zknn" Oct 08 21:00:31 crc kubenswrapper[4669]: I1008 21:00:31.998410 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Oct 08 21:00:31 crc kubenswrapper[4669]: I1008 21:00:31.999016 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Oct 08 21:00:31 crc kubenswrapper[4669]: I1008 21:00:31.999350 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:31.999644 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-bt9z6" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.016489 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b94dc5a1-30ee-4ce3-807d-51f22cb32b5f-fernet-keys\") pod \"keystone-bootstrap-rxss2\" (UID: \"b94dc5a1-30ee-4ce3-807d-51f22cb32b5f\") " pod="openstack/keystone-bootstrap-rxss2" 
Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.016588 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c58b2710-b460-48ba-9106-5928fbcff443-config\") pod \"dnsmasq-dns-6c9c9f998c-7n5jz\" (UID: \"c58b2710-b460-48ba-9106-5928fbcff443\") " pod="openstack/dnsmasq-dns-6c9c9f998c-7n5jz" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.016616 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c58b2710-b460-48ba-9106-5928fbcff443-ovsdbserver-sb\") pod \"dnsmasq-dns-6c9c9f998c-7n5jz\" (UID: \"c58b2710-b460-48ba-9106-5928fbcff443\") " pod="openstack/dnsmasq-dns-6c9c9f998c-7n5jz" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.016672 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b94dc5a1-30ee-4ce3-807d-51f22cb32b5f-credential-keys\") pod \"keystone-bootstrap-rxss2\" (UID: \"b94dc5a1-30ee-4ce3-807d-51f22cb32b5f\") " pod="openstack/keystone-bootstrap-rxss2" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.016723 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b94dc5a1-30ee-4ce3-807d-51f22cb32b5f-scripts\") pod \"keystone-bootstrap-rxss2\" (UID: \"b94dc5a1-30ee-4ce3-807d-51f22cb32b5f\") " pod="openstack/keystone-bootstrap-rxss2" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.016774 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c58b2710-b460-48ba-9106-5928fbcff443-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9c9f998c-7n5jz\" (UID: \"c58b2710-b460-48ba-9106-5928fbcff443\") " pod="openstack/dnsmasq-dns-6c9c9f998c-7n5jz" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 
21:00:32.016811 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c58b2710-b460-48ba-9106-5928fbcff443-dns-svc\") pod \"dnsmasq-dns-6c9c9f998c-7n5jz\" (UID: \"c58b2710-b460-48ba-9106-5928fbcff443\") " pod="openstack/dnsmasq-dns-6c9c9f998c-7n5jz" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.016847 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgwjt\" (UniqueName: \"kubernetes.io/projected/b94dc5a1-30ee-4ce3-807d-51f22cb32b5f-kube-api-access-bgwjt\") pod \"keystone-bootstrap-rxss2\" (UID: \"b94dc5a1-30ee-4ce3-807d-51f22cb32b5f\") " pod="openstack/keystone-bootstrap-rxss2" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.016880 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdfqv\" (UniqueName: \"kubernetes.io/projected/c58b2710-b460-48ba-9106-5928fbcff443-kube-api-access-pdfqv\") pod \"dnsmasq-dns-6c9c9f998c-7n5jz\" (UID: \"c58b2710-b460-48ba-9106-5928fbcff443\") " pod="openstack/dnsmasq-dns-6c9c9f998c-7n5jz" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.016909 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b94dc5a1-30ee-4ce3-807d-51f22cb32b5f-config-data\") pod \"keystone-bootstrap-rxss2\" (UID: \"b94dc5a1-30ee-4ce3-807d-51f22cb32b5f\") " pod="openstack/keystone-bootstrap-rxss2" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.016941 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b94dc5a1-30ee-4ce3-807d-51f22cb32b5f-combined-ca-bundle\") pod \"keystone-bootstrap-rxss2\" (UID: \"b94dc5a1-30ee-4ce3-807d-51f22cb32b5f\") " pod="openstack/keystone-bootstrap-rxss2" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.016966 4669 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c58b2710-b460-48ba-9106-5928fbcff443-dns-swift-storage-0\") pod \"dnsmasq-dns-6c9c9f998c-7n5jz\" (UID: \"c58b2710-b460-48ba-9106-5928fbcff443\") " pod="openstack/dnsmasq-dns-6c9c9f998c-7n5jz" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.018471 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-574f4fd7c7-7zknn"] Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.036471 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b94dc5a1-30ee-4ce3-807d-51f22cb32b5f-config-data\") pod \"keystone-bootstrap-rxss2\" (UID: \"b94dc5a1-30ee-4ce3-807d-51f22cb32b5f\") " pod="openstack/keystone-bootstrap-rxss2" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.038037 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b94dc5a1-30ee-4ce3-807d-51f22cb32b5f-scripts\") pod \"keystone-bootstrap-rxss2\" (UID: \"b94dc5a1-30ee-4ce3-807d-51f22cb32b5f\") " pod="openstack/keystone-bootstrap-rxss2" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.045192 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b94dc5a1-30ee-4ce3-807d-51f22cb32b5f-combined-ca-bundle\") pod \"keystone-bootstrap-rxss2\" (UID: \"b94dc5a1-30ee-4ce3-807d-51f22cb32b5f\") " pod="openstack/keystone-bootstrap-rxss2" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.056300 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b94dc5a1-30ee-4ce3-807d-51f22cb32b5f-credential-keys\") pod \"keystone-bootstrap-rxss2\" (UID: \"b94dc5a1-30ee-4ce3-807d-51f22cb32b5f\") " pod="openstack/keystone-bootstrap-rxss2" Oct 08 21:00:32 crc 
kubenswrapper[4669]: I1008 21:00:32.064071 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgwjt\" (UniqueName: \"kubernetes.io/projected/b94dc5a1-30ee-4ce3-807d-51f22cb32b5f-kube-api-access-bgwjt\") pod \"keystone-bootstrap-rxss2\" (UID: \"b94dc5a1-30ee-4ce3-807d-51f22cb32b5f\") " pod="openstack/keystone-bootstrap-rxss2" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.071883 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b94dc5a1-30ee-4ce3-807d-51f22cb32b5f-fernet-keys\") pod \"keystone-bootstrap-rxss2\" (UID: \"b94dc5a1-30ee-4ce3-807d-51f22cb32b5f\") " pod="openstack/keystone-bootstrap-rxss2" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.112179 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-p2bl6"] Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.118369 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c58b2710-b460-48ba-9106-5928fbcff443-dns-swift-storage-0\") pod \"dnsmasq-dns-6c9c9f998c-7n5jz\" (UID: \"c58b2710-b460-48ba-9106-5928fbcff443\") " pod="openstack/dnsmasq-dns-6c9c9f998c-7n5jz" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.118437 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c58b2710-b460-48ba-9106-5928fbcff443-config\") pod \"dnsmasq-dns-6c9c9f998c-7n5jz\" (UID: \"c58b2710-b460-48ba-9106-5928fbcff443\") " pod="openstack/dnsmasq-dns-6c9c9f998c-7n5jz" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.118460 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c58b2710-b460-48ba-9106-5928fbcff443-ovsdbserver-sb\") pod \"dnsmasq-dns-6c9c9f998c-7n5jz\" (UID: 
\"c58b2710-b460-48ba-9106-5928fbcff443\") " pod="openstack/dnsmasq-dns-6c9c9f998c-7n5jz" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.118492 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79cb9326-1639-44ff-973d-adbce6c8098b-scripts\") pod \"horizon-574f4fd7c7-7zknn\" (UID: \"79cb9326-1639-44ff-973d-adbce6c8098b\") " pod="openstack/horizon-574f4fd7c7-7zknn" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.118546 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/79cb9326-1639-44ff-973d-adbce6c8098b-config-data\") pod \"horizon-574f4fd7c7-7zknn\" (UID: \"79cb9326-1639-44ff-973d-adbce6c8098b\") " pod="openstack/horizon-574f4fd7c7-7zknn" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.118563 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/79cb9326-1639-44ff-973d-adbce6c8098b-horizon-secret-key\") pod \"horizon-574f4fd7c7-7zknn\" (UID: \"79cb9326-1639-44ff-973d-adbce6c8098b\") " pod="openstack/horizon-574f4fd7c7-7zknn" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.118591 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8npbw\" (UniqueName: \"kubernetes.io/projected/79cb9326-1639-44ff-973d-adbce6c8098b-kube-api-access-8npbw\") pod \"horizon-574f4fd7c7-7zknn\" (UID: \"79cb9326-1639-44ff-973d-adbce6c8098b\") " pod="openstack/horizon-574f4fd7c7-7zknn" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.118611 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c58b2710-b460-48ba-9106-5928fbcff443-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9c9f998c-7n5jz\" (UID: 
\"c58b2710-b460-48ba-9106-5928fbcff443\") " pod="openstack/dnsmasq-dns-6c9c9f998c-7n5jz" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.118636 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c58b2710-b460-48ba-9106-5928fbcff443-dns-svc\") pod \"dnsmasq-dns-6c9c9f998c-7n5jz\" (UID: \"c58b2710-b460-48ba-9106-5928fbcff443\") " pod="openstack/dnsmasq-dns-6c9c9f998c-7n5jz" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.118662 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79cb9326-1639-44ff-973d-adbce6c8098b-logs\") pod \"horizon-574f4fd7c7-7zknn\" (UID: \"79cb9326-1639-44ff-973d-adbce6c8098b\") " pod="openstack/horizon-574f4fd7c7-7zknn" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.118685 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdfqv\" (UniqueName: \"kubernetes.io/projected/c58b2710-b460-48ba-9106-5928fbcff443-kube-api-access-pdfqv\") pod \"dnsmasq-dns-6c9c9f998c-7n5jz\" (UID: \"c58b2710-b460-48ba-9106-5928fbcff443\") " pod="openstack/dnsmasq-dns-6c9c9f998c-7n5jz" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.119227 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c58b2710-b460-48ba-9106-5928fbcff443-dns-swift-storage-0\") pod \"dnsmasq-dns-6c9c9f998c-7n5jz\" (UID: \"c58b2710-b460-48ba-9106-5928fbcff443\") " pod="openstack/dnsmasq-dns-6c9c9f998c-7n5jz" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.119579 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-p2bl6" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.119637 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c58b2710-b460-48ba-9106-5928fbcff443-config\") pod \"dnsmasq-dns-6c9c9f998c-7n5jz\" (UID: \"c58b2710-b460-48ba-9106-5928fbcff443\") " pod="openstack/dnsmasq-dns-6c9c9f998c-7n5jz" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.119829 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c58b2710-b460-48ba-9106-5928fbcff443-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9c9f998c-7n5jz\" (UID: \"c58b2710-b460-48ba-9106-5928fbcff443\") " pod="openstack/dnsmasq-dns-6c9c9f998c-7n5jz" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.120438 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c58b2710-b460-48ba-9106-5928fbcff443-ovsdbserver-sb\") pod \"dnsmasq-dns-6c9c9f998c-7n5jz\" (UID: \"c58b2710-b460-48ba-9106-5928fbcff443\") " pod="openstack/dnsmasq-dns-6c9c9f998c-7n5jz" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.122753 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-f4x62" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.122961 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.123584 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.124030 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c58b2710-b460-48ba-9106-5928fbcff443-dns-svc\") pod \"dnsmasq-dns-6c9c9f998c-7n5jz\" (UID: 
\"c58b2710-b460-48ba-9106-5928fbcff443\") " pod="openstack/dnsmasq-dns-6c9c9f998c-7n5jz" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.146189 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-p2bl6"] Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.161749 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdfqv\" (UniqueName: \"kubernetes.io/projected/c58b2710-b460-48ba-9106-5928fbcff443-kube-api-access-pdfqv\") pod \"dnsmasq-dns-6c9c9f998c-7n5jz\" (UID: \"c58b2710-b460-48ba-9106-5928fbcff443\") " pod="openstack/dnsmasq-dns-6c9c9f998c-7n5jz" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.168588 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.170478 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.174797 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.175010 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.190896 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.200463 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-7n5jz"] Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.201435 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-7n5jz" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.223019 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/469707ec-e817-4d42-b406-6595799f6036-config-data\") pod \"placement-db-sync-p2bl6\" (UID: \"469707ec-e817-4d42-b406-6595799f6036\") " pod="openstack/placement-db-sync-p2bl6" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.223064 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79cb9326-1639-44ff-973d-adbce6c8098b-scripts\") pod \"horizon-574f4fd7c7-7zknn\" (UID: \"79cb9326-1639-44ff-973d-adbce6c8098b\") " pod="openstack/horizon-574f4fd7c7-7zknn" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.223084 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/469707ec-e817-4d42-b406-6595799f6036-logs\") pod \"placement-db-sync-p2bl6\" (UID: \"469707ec-e817-4d42-b406-6595799f6036\") " pod="openstack/placement-db-sync-p2bl6" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.223101 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/469707ec-e817-4d42-b406-6595799f6036-combined-ca-bundle\") pod \"placement-db-sync-p2bl6\" (UID: \"469707ec-e817-4d42-b406-6595799f6036\") " pod="openstack/placement-db-sync-p2bl6" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.223136 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/79cb9326-1639-44ff-973d-adbce6c8098b-horizon-secret-key\") pod \"horizon-574f4fd7c7-7zknn\" (UID: \"79cb9326-1639-44ff-973d-adbce6c8098b\") " pod="openstack/horizon-574f4fd7c7-7zknn" Oct 08 
21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.223153 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/79cb9326-1639-44ff-973d-adbce6c8098b-config-data\") pod \"horizon-574f4fd7c7-7zknn\" (UID: \"79cb9326-1639-44ff-973d-adbce6c8098b\") " pod="openstack/horizon-574f4fd7c7-7zknn" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.223177 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8npbw\" (UniqueName: \"kubernetes.io/projected/79cb9326-1639-44ff-973d-adbce6c8098b-kube-api-access-8npbw\") pod \"horizon-574f4fd7c7-7zknn\" (UID: \"79cb9326-1639-44ff-973d-adbce6c8098b\") " pod="openstack/horizon-574f4fd7c7-7zknn" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.223206 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvjx2\" (UniqueName: \"kubernetes.io/projected/469707ec-e817-4d42-b406-6595799f6036-kube-api-access-cvjx2\") pod \"placement-db-sync-p2bl6\" (UID: \"469707ec-e817-4d42-b406-6595799f6036\") " pod="openstack/placement-db-sync-p2bl6" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.223228 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79cb9326-1639-44ff-973d-adbce6c8098b-logs\") pod \"horizon-574f4fd7c7-7zknn\" (UID: \"79cb9326-1639-44ff-973d-adbce6c8098b\") " pod="openstack/horizon-574f4fd7c7-7zknn" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.223279 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/469707ec-e817-4d42-b406-6595799f6036-scripts\") pod \"placement-db-sync-p2bl6\" (UID: \"469707ec-e817-4d42-b406-6595799f6036\") " pod="openstack/placement-db-sync-p2bl6" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.223882 4669 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79cb9326-1639-44ff-973d-adbce6c8098b-scripts\") pod \"horizon-574f4fd7c7-7zknn\" (UID: \"79cb9326-1639-44ff-973d-adbce6c8098b\") " pod="openstack/horizon-574f4fd7c7-7zknn" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.237007 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/79cb9326-1639-44ff-973d-adbce6c8098b-config-data\") pod \"horizon-574f4fd7c7-7zknn\" (UID: \"79cb9326-1639-44ff-973d-adbce6c8098b\") " pod="openstack/horizon-574f4fd7c7-7zknn" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.237257 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79cb9326-1639-44ff-973d-adbce6c8098b-logs\") pod \"horizon-574f4fd7c7-7zknn\" (UID: \"79cb9326-1639-44ff-973d-adbce6c8098b\") " pod="openstack/horizon-574f4fd7c7-7zknn" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.238823 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/79cb9326-1639-44ff-973d-adbce6c8098b-horizon-secret-key\") pod \"horizon-574f4fd7c7-7zknn\" (UID: \"79cb9326-1639-44ff-973d-adbce6c8098b\") " pod="openstack/horizon-574f4fd7c7-7zknn" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.299212 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8npbw\" (UniqueName: \"kubernetes.io/projected/79cb9326-1639-44ff-973d-adbce6c8098b-kube-api-access-8npbw\") pod \"horizon-574f4fd7c7-7zknn\" (UID: \"79cb9326-1639-44ff-973d-adbce6c8098b\") " pod="openstack/horizon-574f4fd7c7-7zknn" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.325125 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvjx2\" (UniqueName: 
\"kubernetes.io/projected/469707ec-e817-4d42-b406-6595799f6036-kube-api-access-cvjx2\") pod \"placement-db-sync-p2bl6\" (UID: \"469707ec-e817-4d42-b406-6595799f6036\") " pod="openstack/placement-db-sync-p2bl6" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.325175 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e462fc4e-635f-4e2e-88c0-43f1af0dc648-run-httpd\") pod \"ceilometer-0\" (UID: \"e462fc4e-635f-4e2e-88c0-43f1af0dc648\") " pod="openstack/ceilometer-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.325192 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e462fc4e-635f-4e2e-88c0-43f1af0dc648-scripts\") pod \"ceilometer-0\" (UID: \"e462fc4e-635f-4e2e-88c0-43f1af0dc648\") " pod="openstack/ceilometer-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.325247 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/469707ec-e817-4d42-b406-6595799f6036-scripts\") pod \"placement-db-sync-p2bl6\" (UID: \"469707ec-e817-4d42-b406-6595799f6036\") " pod="openstack/placement-db-sync-p2bl6" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.325273 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e462fc4e-635f-4e2e-88c0-43f1af0dc648-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e462fc4e-635f-4e2e-88c0-43f1af0dc648\") " pod="openstack/ceilometer-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.325289 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/469707ec-e817-4d42-b406-6595799f6036-config-data\") pod \"placement-db-sync-p2bl6\" (UID: 
\"469707ec-e817-4d42-b406-6595799f6036\") " pod="openstack/placement-db-sync-p2bl6" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.325313 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/469707ec-e817-4d42-b406-6595799f6036-logs\") pod \"placement-db-sync-p2bl6\" (UID: \"469707ec-e817-4d42-b406-6595799f6036\") " pod="openstack/placement-db-sync-p2bl6" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.325330 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/469707ec-e817-4d42-b406-6595799f6036-combined-ca-bundle\") pod \"placement-db-sync-p2bl6\" (UID: \"469707ec-e817-4d42-b406-6595799f6036\") " pod="openstack/placement-db-sync-p2bl6" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.325352 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e462fc4e-635f-4e2e-88c0-43f1af0dc648-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e462fc4e-635f-4e2e-88c0-43f1af0dc648\") " pod="openstack/ceilometer-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.325389 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e462fc4e-635f-4e2e-88c0-43f1af0dc648-log-httpd\") pod \"ceilometer-0\" (UID: \"e462fc4e-635f-4e2e-88c0-43f1af0dc648\") " pod="openstack/ceilometer-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.325409 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4gdb\" (UniqueName: \"kubernetes.io/projected/e462fc4e-635f-4e2e-88c0-43f1af0dc648-kube-api-access-m4gdb\") pod \"ceilometer-0\" (UID: \"e462fc4e-635f-4e2e-88c0-43f1af0dc648\") " pod="openstack/ceilometer-0" Oct 08 21:00:32 crc 
kubenswrapper[4669]: I1008 21:00:32.325443 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e462fc4e-635f-4e2e-88c0-43f1af0dc648-config-data\") pod \"ceilometer-0\" (UID: \"e462fc4e-635f-4e2e-88c0-43f1af0dc648\") " pod="openstack/ceilometer-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.335825 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-55cd7b4cc7-2w7vd"] Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.338210 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/469707ec-e817-4d42-b406-6595799f6036-logs\") pod \"placement-db-sync-p2bl6\" (UID: \"469707ec-e817-4d42-b406-6595799f6036\") " pod="openstack/placement-db-sync-p2bl6" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.343062 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/469707ec-e817-4d42-b406-6595799f6036-scripts\") pod \"placement-db-sync-p2bl6\" (UID: \"469707ec-e817-4d42-b406-6595799f6036\") " pod="openstack/placement-db-sync-p2bl6" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.343440 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-55cd7b4cc7-2w7vd" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.347095 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/469707ec-e817-4d42-b406-6595799f6036-config-data\") pod \"placement-db-sync-p2bl6\" (UID: \"469707ec-e817-4d42-b406-6595799f6036\") " pod="openstack/placement-db-sync-p2bl6" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.347615 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/469707ec-e817-4d42-b406-6595799f6036-combined-ca-bundle\") pod \"placement-db-sync-p2bl6\" (UID: \"469707ec-e817-4d42-b406-6595799f6036\") " pod="openstack/placement-db-sync-p2bl6" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.348447 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvjx2\" (UniqueName: \"kubernetes.io/projected/469707ec-e817-4d42-b406-6595799f6036-kube-api-access-cvjx2\") pod \"placement-db-sync-p2bl6\" (UID: \"469707ec-e817-4d42-b406-6595799f6036\") " pod="openstack/placement-db-sync-p2bl6" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.359476 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-jnx9v"] Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.361632 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-jnx9v" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.370340 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-55cd7b4cc7-2w7vd"] Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.371670 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-rxss2" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.411055 4669 generic.go:334] "Generic (PLEG): container finished" podID="eb75f170-ec61-4ea8-81c4-eb53321ee58f" containerID="1ea454be88024c6d62898fb63d36ca1c54941dc50fe44c9049bfe3cd983065a7" exitCode=0 Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.411110 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-fzvxk" event={"ID":"eb75f170-ec61-4ea8-81c4-eb53321ee58f","Type":"ContainerDied","Data":"1ea454be88024c6d62898fb63d36ca1c54941dc50fe44c9049bfe3cd983065a7"} Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.411180 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-fzvxk" event={"ID":"eb75f170-ec61-4ea8-81c4-eb53321ee58f","Type":"ContainerStarted","Data":"3f2373ffcdf9fb7ba934e2a7283c72c656395baabc29085c04f6f0d634abbdaa"} Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.429850 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e462fc4e-635f-4e2e-88c0-43f1af0dc648-log-httpd\") pod \"ceilometer-0\" (UID: \"e462fc4e-635f-4e2e-88c0-43f1af0dc648\") " pod="openstack/ceilometer-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.429893 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4gdb\" (UniqueName: \"kubernetes.io/projected/e462fc4e-635f-4e2e-88c0-43f1af0dc648-kube-api-access-m4gdb\") pod \"ceilometer-0\" (UID: \"e462fc4e-635f-4e2e-88c0-43f1af0dc648\") " pod="openstack/ceilometer-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.429935 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e462fc4e-635f-4e2e-88c0-43f1af0dc648-config-data\") pod \"ceilometer-0\" (UID: \"e462fc4e-635f-4e2e-88c0-43f1af0dc648\") " 
pod="openstack/ceilometer-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.429959 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e462fc4e-635f-4e2e-88c0-43f1af0dc648-run-httpd\") pod \"ceilometer-0\" (UID: \"e462fc4e-635f-4e2e-88c0-43f1af0dc648\") " pod="openstack/ceilometer-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.429978 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e462fc4e-635f-4e2e-88c0-43f1af0dc648-scripts\") pod \"ceilometer-0\" (UID: \"e462fc4e-635f-4e2e-88c0-43f1af0dc648\") " pod="openstack/ceilometer-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.430049 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e462fc4e-635f-4e2e-88c0-43f1af0dc648-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e462fc4e-635f-4e2e-88c0-43f1af0dc648\") " pod="openstack/ceilometer-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.430082 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e462fc4e-635f-4e2e-88c0-43f1af0dc648-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e462fc4e-635f-4e2e-88c0-43f1af0dc648\") " pod="openstack/ceilometer-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.440021 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e462fc4e-635f-4e2e-88c0-43f1af0dc648-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e462fc4e-635f-4e2e-88c0-43f1af0dc648\") " pod="openstack/ceilometer-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.449403 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e462fc4e-635f-4e2e-88c0-43f1af0dc648-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e462fc4e-635f-4e2e-88c0-43f1af0dc648\") " pod="openstack/ceilometer-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.468272 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-jnx9v"] Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.486930 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e462fc4e-635f-4e2e-88c0-43f1af0dc648-run-httpd\") pod \"ceilometer-0\" (UID: \"e462fc4e-635f-4e2e-88c0-43f1af0dc648\") " pod="openstack/ceilometer-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.521359 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e462fc4e-635f-4e2e-88c0-43f1af0dc648-log-httpd\") pod \"ceilometer-0\" (UID: \"e462fc4e-635f-4e2e-88c0-43f1af0dc648\") " pod="openstack/ceilometer-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.523281 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-574f4fd7c7-7zknn" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.532911 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e462fc4e-635f-4e2e-88c0-43f1af0dc648-scripts\") pod \"ceilometer-0\" (UID: \"e462fc4e-635f-4e2e-88c0-43f1af0dc648\") " pod="openstack/ceilometer-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.537084 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c019d0a1-4d0f-40e8-8720-f20b74d33b4b-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-jnx9v\" (UID: \"c019d0a1-4d0f-40e8-8720-f20b74d33b4b\") " pod="openstack/dnsmasq-dns-57c957c4ff-jnx9v" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.537133 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c019d0a1-4d0f-40e8-8720-f20b74d33b4b-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-jnx9v\" (UID: \"c019d0a1-4d0f-40e8-8720-f20b74d33b4b\") " pod="openstack/dnsmasq-dns-57c957c4ff-jnx9v" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.537229 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2pxt\" (UniqueName: \"kubernetes.io/projected/c019d0a1-4d0f-40e8-8720-f20b74d33b4b-kube-api-access-t2pxt\") pod \"dnsmasq-dns-57c957c4ff-jnx9v\" (UID: \"c019d0a1-4d0f-40e8-8720-f20b74d33b4b\") " pod="openstack/dnsmasq-dns-57c957c4ff-jnx9v" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.537300 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c019d0a1-4d0f-40e8-8720-f20b74d33b4b-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-jnx9v\" (UID: \"c019d0a1-4d0f-40e8-8720-f20b74d33b4b\") " 
pod="openstack/dnsmasq-dns-57c957c4ff-jnx9v" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.537432 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d-config-data\") pod \"horizon-55cd7b4cc7-2w7vd\" (UID: \"9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d\") " pod="openstack/horizon-55cd7b4cc7-2w7vd" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.537461 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqx5r\" (UniqueName: \"kubernetes.io/projected/9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d-kube-api-access-wqx5r\") pod \"horizon-55cd7b4cc7-2w7vd\" (UID: \"9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d\") " pod="openstack/horizon-55cd7b4cc7-2w7vd" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.537485 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c019d0a1-4d0f-40e8-8720-f20b74d33b4b-config\") pod \"dnsmasq-dns-57c957c4ff-jnx9v\" (UID: \"c019d0a1-4d0f-40e8-8720-f20b74d33b4b\") " pod="openstack/dnsmasq-dns-57c957c4ff-jnx9v" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.537515 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d-horizon-secret-key\") pod \"horizon-55cd7b4cc7-2w7vd\" (UID: \"9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d\") " pod="openstack/horizon-55cd7b4cc7-2w7vd" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.537578 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d-logs\") pod \"horizon-55cd7b4cc7-2w7vd\" (UID: \"9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d\") " 
pod="openstack/horizon-55cd7b4cc7-2w7vd" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.537638 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d-scripts\") pod \"horizon-55cd7b4cc7-2w7vd\" (UID: \"9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d\") " pod="openstack/horizon-55cd7b4cc7-2w7vd" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.537788 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c019d0a1-4d0f-40e8-8720-f20b74d33b4b-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-jnx9v\" (UID: \"c019d0a1-4d0f-40e8-8720-f20b74d33b4b\") " pod="openstack/dnsmasq-dns-57c957c4ff-jnx9v" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.542869 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4gdb\" (UniqueName: \"kubernetes.io/projected/e462fc4e-635f-4e2e-88c0-43f1af0dc648-kube-api-access-m4gdb\") pod \"ceilometer-0\" (UID: \"e462fc4e-635f-4e2e-88c0-43f1af0dc648\") " pod="openstack/ceilometer-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.547639 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e462fc4e-635f-4e2e-88c0-43f1af0dc648-config-data\") pod \"ceilometer-0\" (UID: \"e462fc4e-635f-4e2e-88c0-43f1af0dc648\") " pod="openstack/ceilometer-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.581138 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.582430 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.585966 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.591041 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.591142 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-2v8q5" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.596107 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-p2bl6" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.609602 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.626117 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.639082 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d-scripts\") pod \"horizon-55cd7b4cc7-2w7vd\" (UID: \"9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d\") " pod="openstack/horizon-55cd7b4cc7-2w7vd" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.639148 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c019d0a1-4d0f-40e8-8720-f20b74d33b4b-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-jnx9v\" (UID: \"c019d0a1-4d0f-40e8-8720-f20b74d33b4b\") " pod="openstack/dnsmasq-dns-57c957c4ff-jnx9v" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.639172 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c019d0a1-4d0f-40e8-8720-f20b74d33b4b-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-jnx9v\" (UID: \"c019d0a1-4d0f-40e8-8720-f20b74d33b4b\") " pod="openstack/dnsmasq-dns-57c957c4ff-jnx9v" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.639194 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c019d0a1-4d0f-40e8-8720-f20b74d33b4b-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-jnx9v\" (UID: \"c019d0a1-4d0f-40e8-8720-f20b74d33b4b\") " pod="openstack/dnsmasq-dns-57c957c4ff-jnx9v" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.639231 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2pxt\" (UniqueName: \"kubernetes.io/projected/c019d0a1-4d0f-40e8-8720-f20b74d33b4b-kube-api-access-t2pxt\") pod \"dnsmasq-dns-57c957c4ff-jnx9v\" (UID: \"c019d0a1-4d0f-40e8-8720-f20b74d33b4b\") " 
pod="openstack/dnsmasq-dns-57c957c4ff-jnx9v" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.639252 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c019d0a1-4d0f-40e8-8720-f20b74d33b4b-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-jnx9v\" (UID: \"c019d0a1-4d0f-40e8-8720-f20b74d33b4b\") " pod="openstack/dnsmasq-dns-57c957c4ff-jnx9v" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.639297 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d-config-data\") pod \"horizon-55cd7b4cc7-2w7vd\" (UID: \"9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d\") " pod="openstack/horizon-55cd7b4cc7-2w7vd" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.639317 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqx5r\" (UniqueName: \"kubernetes.io/projected/9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d-kube-api-access-wqx5r\") pod \"horizon-55cd7b4cc7-2w7vd\" (UID: \"9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d\") " pod="openstack/horizon-55cd7b4cc7-2w7vd" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.639332 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c019d0a1-4d0f-40e8-8720-f20b74d33b4b-config\") pod \"dnsmasq-dns-57c957c4ff-jnx9v\" (UID: \"c019d0a1-4d0f-40e8-8720-f20b74d33b4b\") " pod="openstack/dnsmasq-dns-57c957c4ff-jnx9v" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.639350 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d-horizon-secret-key\") pod \"horizon-55cd7b4cc7-2w7vd\" (UID: \"9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d\") " pod="openstack/horizon-55cd7b4cc7-2w7vd" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 
21:00:32.639386 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d-logs\") pod \"horizon-55cd7b4cc7-2w7vd\" (UID: \"9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d\") " pod="openstack/horizon-55cd7b4cc7-2w7vd" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.640682 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d-logs\") pod \"horizon-55cd7b4cc7-2w7vd\" (UID: \"9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d\") " pod="openstack/horizon-55cd7b4cc7-2w7vd" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.641017 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c019d0a1-4d0f-40e8-8720-f20b74d33b4b-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-jnx9v\" (UID: \"c019d0a1-4d0f-40e8-8720-f20b74d33b4b\") " pod="openstack/dnsmasq-dns-57c957c4ff-jnx9v" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.641253 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c019d0a1-4d0f-40e8-8720-f20b74d33b4b-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-jnx9v\" (UID: \"c019d0a1-4d0f-40e8-8720-f20b74d33b4b\") " pod="openstack/dnsmasq-dns-57c957c4ff-jnx9v" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.641644 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d-scripts\") pod \"horizon-55cd7b4cc7-2w7vd\" (UID: \"9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d\") " pod="openstack/horizon-55cd7b4cc7-2w7vd" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.641703 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c019d0a1-4d0f-40e8-8720-f20b74d33b4b-config\") pod \"dnsmasq-dns-57c957c4ff-jnx9v\" (UID: \"c019d0a1-4d0f-40e8-8720-f20b74d33b4b\") " pod="openstack/dnsmasq-dns-57c957c4ff-jnx9v" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.642308 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c019d0a1-4d0f-40e8-8720-f20b74d33b4b-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-jnx9v\" (UID: \"c019d0a1-4d0f-40e8-8720-f20b74d33b4b\") " pod="openstack/dnsmasq-dns-57c957c4ff-jnx9v" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.642673 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c019d0a1-4d0f-40e8-8720-f20b74d33b4b-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-jnx9v\" (UID: \"c019d0a1-4d0f-40e8-8720-f20b74d33b4b\") " pod="openstack/dnsmasq-dns-57c957c4ff-jnx9v" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.642860 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d-config-data\") pod \"horizon-55cd7b4cc7-2w7vd\" (UID: \"9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d\") " pod="openstack/horizon-55cd7b4cc7-2w7vd" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.656081 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d-horizon-secret-key\") pod \"horizon-55cd7b4cc7-2w7vd\" (UID: \"9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d\") " pod="openstack/horizon-55cd7b4cc7-2w7vd" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.659512 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.660774 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-t2pxt\" (UniqueName: \"kubernetes.io/projected/c019d0a1-4d0f-40e8-8720-f20b74d33b4b-kube-api-access-t2pxt\") pod \"dnsmasq-dns-57c957c4ff-jnx9v\" (UID: \"c019d0a1-4d0f-40e8-8720-f20b74d33b4b\") " pod="openstack/dnsmasq-dns-57c957c4ff-jnx9v" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.666893 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.671110 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.671237 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.685603 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqx5r\" (UniqueName: \"kubernetes.io/projected/9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d-kube-api-access-wqx5r\") pod \"horizon-55cd7b4cc7-2w7vd\" (UID: \"9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d\") " pod="openstack/horizon-55cd7b4cc7-2w7vd" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.741396 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/416fb1ca-1386-410b-aaaa-3e0ec724d461-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"416fb1ca-1386-410b-aaaa-3e0ec724d461\") " pod="openstack/glance-default-internal-api-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.741506 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/416fb1ca-1386-410b-aaaa-3e0ec724d461-scripts\") pod \"glance-default-internal-api-0\" (UID: \"416fb1ca-1386-410b-aaaa-3e0ec724d461\") " pod="openstack/glance-default-internal-api-0" Oct 08 21:00:32 crc 
kubenswrapper[4669]: I1008 21:00:32.741559 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"416fb1ca-1386-410b-aaaa-3e0ec724d461\") " pod="openstack/glance-default-internal-api-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.741994 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/416fb1ca-1386-410b-aaaa-3e0ec724d461-logs\") pod \"glance-default-internal-api-0\" (UID: \"416fb1ca-1386-410b-aaaa-3e0ec724d461\") " pod="openstack/glance-default-internal-api-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.742038 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z89xm\" (UniqueName: \"kubernetes.io/projected/416fb1ca-1386-410b-aaaa-3e0ec724d461-kube-api-access-z89xm\") pod \"glance-default-internal-api-0\" (UID: \"416fb1ca-1386-410b-aaaa-3e0ec724d461\") " pod="openstack/glance-default-internal-api-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.742125 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/416fb1ca-1386-410b-aaaa-3e0ec724d461-config-data\") pod \"glance-default-internal-api-0\" (UID: \"416fb1ca-1386-410b-aaaa-3e0ec724d461\") " pod="openstack/glance-default-internal-api-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.742214 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/416fb1ca-1386-410b-aaaa-3e0ec724d461-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"416fb1ca-1386-410b-aaaa-3e0ec724d461\") " pod="openstack/glance-default-internal-api-0" Oct 
08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.750396 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-55cd7b4cc7-2w7vd" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.789320 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-jnx9v" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.848447 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"416fb1ca-1386-410b-aaaa-3e0ec724d461\") " pod="openstack/glance-default-internal-api-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.848497 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e916b41d-d3e7-43b8-b550-8ad07a4cf147-config-data\") pod \"glance-default-external-api-0\" (UID: \"e916b41d-d3e7-43b8-b550-8ad07a4cf147\") " pod="openstack/glance-default-external-api-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.848578 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"e916b41d-d3e7-43b8-b550-8ad07a4cf147\") " pod="openstack/glance-default-external-api-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.848642 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/416fb1ca-1386-410b-aaaa-3e0ec724d461-logs\") pod \"glance-default-internal-api-0\" (UID: \"416fb1ca-1386-410b-aaaa-3e0ec724d461\") " pod="openstack/glance-default-internal-api-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.848673 4669 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-z89xm\" (UniqueName: \"kubernetes.io/projected/416fb1ca-1386-410b-aaaa-3e0ec724d461-kube-api-access-z89xm\") pod \"glance-default-internal-api-0\" (UID: \"416fb1ca-1386-410b-aaaa-3e0ec724d461\") " pod="openstack/glance-default-internal-api-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.848755 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e916b41d-d3e7-43b8-b550-8ad07a4cf147-logs\") pod \"glance-default-external-api-0\" (UID: \"e916b41d-d3e7-43b8-b550-8ad07a4cf147\") " pod="openstack/glance-default-external-api-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.849333 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/416fb1ca-1386-410b-aaaa-3e0ec724d461-config-data\") pod \"glance-default-internal-api-0\" (UID: \"416fb1ca-1386-410b-aaaa-3e0ec724d461\") " pod="openstack/glance-default-internal-api-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.849440 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/416fb1ca-1386-410b-aaaa-3e0ec724d461-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"416fb1ca-1386-410b-aaaa-3e0ec724d461\") " pod="openstack/glance-default-internal-api-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.849475 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b49gd\" (UniqueName: \"kubernetes.io/projected/e916b41d-d3e7-43b8-b550-8ad07a4cf147-kube-api-access-b49gd\") pod \"glance-default-external-api-0\" (UID: \"e916b41d-d3e7-43b8-b550-8ad07a4cf147\") " pod="openstack/glance-default-external-api-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.849561 4669 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/416fb1ca-1386-410b-aaaa-3e0ec724d461-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"416fb1ca-1386-410b-aaaa-3e0ec724d461\") " pod="openstack/glance-default-internal-api-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.849592 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e916b41d-d3e7-43b8-b550-8ad07a4cf147-scripts\") pod \"glance-default-external-api-0\" (UID: \"e916b41d-d3e7-43b8-b550-8ad07a4cf147\") " pod="openstack/glance-default-external-api-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.849638 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e916b41d-d3e7-43b8-b550-8ad07a4cf147-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e916b41d-d3e7-43b8-b550-8ad07a4cf147\") " pod="openstack/glance-default-external-api-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.849660 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e916b41d-d3e7-43b8-b550-8ad07a4cf147-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e916b41d-d3e7-43b8-b550-8ad07a4cf147\") " pod="openstack/glance-default-external-api-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.849682 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/416fb1ca-1386-410b-aaaa-3e0ec724d461-scripts\") pod \"glance-default-internal-api-0\" (UID: \"416fb1ca-1386-410b-aaaa-3e0ec724d461\") " pod="openstack/glance-default-internal-api-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.849842 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/416fb1ca-1386-410b-aaaa-3e0ec724d461-logs\") pod \"glance-default-internal-api-0\" (UID: \"416fb1ca-1386-410b-aaaa-3e0ec724d461\") " pod="openstack/glance-default-internal-api-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.850102 4669 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"416fb1ca-1386-410b-aaaa-3e0ec724d461\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.850728 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/416fb1ca-1386-410b-aaaa-3e0ec724d461-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"416fb1ca-1386-410b-aaaa-3e0ec724d461\") " pod="openstack/glance-default-internal-api-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.858730 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/416fb1ca-1386-410b-aaaa-3e0ec724d461-config-data\") pod \"glance-default-internal-api-0\" (UID: \"416fb1ca-1386-410b-aaaa-3e0ec724d461\") " pod="openstack/glance-default-internal-api-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.862499 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/416fb1ca-1386-410b-aaaa-3e0ec724d461-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"416fb1ca-1386-410b-aaaa-3e0ec724d461\") " pod="openstack/glance-default-internal-api-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.868223 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/416fb1ca-1386-410b-aaaa-3e0ec724d461-scripts\") pod \"glance-default-internal-api-0\" (UID: \"416fb1ca-1386-410b-aaaa-3e0ec724d461\") " pod="openstack/glance-default-internal-api-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.896576 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z89xm\" (UniqueName: \"kubernetes.io/projected/416fb1ca-1386-410b-aaaa-3e0ec724d461-kube-api-access-z89xm\") pod \"glance-default-internal-api-0\" (UID: \"416fb1ca-1386-410b-aaaa-3e0ec724d461\") " pod="openstack/glance-default-internal-api-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.897408 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"416fb1ca-1386-410b-aaaa-3e0ec724d461\") " pod="openstack/glance-default-internal-api-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.951783 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-fzvxk" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.952716 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b49gd\" (UniqueName: \"kubernetes.io/projected/e916b41d-d3e7-43b8-b550-8ad07a4cf147-kube-api-access-b49gd\") pod \"glance-default-external-api-0\" (UID: \"e916b41d-d3e7-43b8-b550-8ad07a4cf147\") " pod="openstack/glance-default-external-api-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.952759 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e916b41d-d3e7-43b8-b550-8ad07a4cf147-scripts\") pod \"glance-default-external-api-0\" (UID: \"e916b41d-d3e7-43b8-b550-8ad07a4cf147\") " pod="openstack/glance-default-external-api-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.952788 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e916b41d-d3e7-43b8-b550-8ad07a4cf147-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e916b41d-d3e7-43b8-b550-8ad07a4cf147\") " pod="openstack/glance-default-external-api-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.952805 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e916b41d-d3e7-43b8-b550-8ad07a4cf147-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e916b41d-d3e7-43b8-b550-8ad07a4cf147\") " pod="openstack/glance-default-external-api-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.952836 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e916b41d-d3e7-43b8-b550-8ad07a4cf147-config-data\") pod \"glance-default-external-api-0\" (UID: \"e916b41d-d3e7-43b8-b550-8ad07a4cf147\") " 
pod="openstack/glance-default-external-api-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.952858 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"e916b41d-d3e7-43b8-b550-8ad07a4cf147\") " pod="openstack/glance-default-external-api-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.952913 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e916b41d-d3e7-43b8-b550-8ad07a4cf147-logs\") pod \"glance-default-external-api-0\" (UID: \"e916b41d-d3e7-43b8-b550-8ad07a4cf147\") " pod="openstack/glance-default-external-api-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.953748 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e916b41d-d3e7-43b8-b550-8ad07a4cf147-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e916b41d-d3e7-43b8-b550-8ad07a4cf147\") " pod="openstack/glance-default-external-api-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.954651 4669 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"e916b41d-d3e7-43b8-b550-8ad07a4cf147\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.960136 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e916b41d-d3e7-43b8-b550-8ad07a4cf147-logs\") pod \"glance-default-external-api-0\" (UID: \"e916b41d-d3e7-43b8-b550-8ad07a4cf147\") " pod="openstack/glance-default-external-api-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.960949 4669 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e916b41d-d3e7-43b8-b550-8ad07a4cf147-config-data\") pod \"glance-default-external-api-0\" (UID: \"e916b41d-d3e7-43b8-b550-8ad07a4cf147\") " pod="openstack/glance-default-external-api-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.961453 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e916b41d-d3e7-43b8-b550-8ad07a4cf147-scripts\") pod \"glance-default-external-api-0\" (UID: \"e916b41d-d3e7-43b8-b550-8ad07a4cf147\") " pod="openstack/glance-default-external-api-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.982183 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e916b41d-d3e7-43b8-b550-8ad07a4cf147-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e916b41d-d3e7-43b8-b550-8ad07a4cf147\") " pod="openstack/glance-default-external-api-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.983360 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b49gd\" (UniqueName: \"kubernetes.io/projected/e916b41d-d3e7-43b8-b550-8ad07a4cf147-kube-api-access-b49gd\") pod \"glance-default-external-api-0\" (UID: \"e916b41d-d3e7-43b8-b550-8ad07a4cf147\") " pod="openstack/glance-default-external-api-0" Oct 08 21:00:32 crc kubenswrapper[4669]: I1008 21:00:32.991555 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-7n5jz"] Oct 08 21:00:33 crc kubenswrapper[4669]: I1008 21:00:33.043130 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"e916b41d-d3e7-43b8-b550-8ad07a4cf147\") " pod="openstack/glance-default-external-api-0" Oct 08 21:00:33 crc 
kubenswrapper[4669]: I1008 21:00:33.053960 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb75f170-ec61-4ea8-81c4-eb53321ee58f-dns-svc\") pod \"eb75f170-ec61-4ea8-81c4-eb53321ee58f\" (UID: \"eb75f170-ec61-4ea8-81c4-eb53321ee58f\") " Oct 08 21:00:33 crc kubenswrapper[4669]: I1008 21:00:33.054100 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb75f170-ec61-4ea8-81c4-eb53321ee58f-config\") pod \"eb75f170-ec61-4ea8-81c4-eb53321ee58f\" (UID: \"eb75f170-ec61-4ea8-81c4-eb53321ee58f\") " Oct 08 21:00:33 crc kubenswrapper[4669]: I1008 21:00:33.054128 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6nfl\" (UniqueName: \"kubernetes.io/projected/eb75f170-ec61-4ea8-81c4-eb53321ee58f-kube-api-access-d6nfl\") pod \"eb75f170-ec61-4ea8-81c4-eb53321ee58f\" (UID: \"eb75f170-ec61-4ea8-81c4-eb53321ee58f\") " Oct 08 21:00:33 crc kubenswrapper[4669]: I1008 21:00:33.054155 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb75f170-ec61-4ea8-81c4-eb53321ee58f-ovsdbserver-nb\") pod \"eb75f170-ec61-4ea8-81c4-eb53321ee58f\" (UID: \"eb75f170-ec61-4ea8-81c4-eb53321ee58f\") " Oct 08 21:00:33 crc kubenswrapper[4669]: I1008 21:00:33.054308 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eb75f170-ec61-4ea8-81c4-eb53321ee58f-dns-swift-storage-0\") pod \"eb75f170-ec61-4ea8-81c4-eb53321ee58f\" (UID: \"eb75f170-ec61-4ea8-81c4-eb53321ee58f\") " Oct 08 21:00:33 crc kubenswrapper[4669]: I1008 21:00:33.054361 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/eb75f170-ec61-4ea8-81c4-eb53321ee58f-ovsdbserver-sb\") pod \"eb75f170-ec61-4ea8-81c4-eb53321ee58f\" (UID: \"eb75f170-ec61-4ea8-81c4-eb53321ee58f\") " Oct 08 21:00:33 crc kubenswrapper[4669]: I1008 21:00:33.080926 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb75f170-ec61-4ea8-81c4-eb53321ee58f-kube-api-access-d6nfl" (OuterVolumeSpecName: "kube-api-access-d6nfl") pod "eb75f170-ec61-4ea8-81c4-eb53321ee58f" (UID: "eb75f170-ec61-4ea8-81c4-eb53321ee58f"). InnerVolumeSpecName "kube-api-access-d6nfl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:00:33 crc kubenswrapper[4669]: I1008 21:00:33.085210 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb75f170-ec61-4ea8-81c4-eb53321ee58f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "eb75f170-ec61-4ea8-81c4-eb53321ee58f" (UID: "eb75f170-ec61-4ea8-81c4-eb53321ee58f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:00:33 crc kubenswrapper[4669]: I1008 21:00:33.091446 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb75f170-ec61-4ea8-81c4-eb53321ee58f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "eb75f170-ec61-4ea8-81c4-eb53321ee58f" (UID: "eb75f170-ec61-4ea8-81c4-eb53321ee58f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:00:33 crc kubenswrapper[4669]: I1008 21:00:33.106698 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb75f170-ec61-4ea8-81c4-eb53321ee58f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "eb75f170-ec61-4ea8-81c4-eb53321ee58f" (UID: "eb75f170-ec61-4ea8-81c4-eb53321ee58f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:00:33 crc kubenswrapper[4669]: I1008 21:00:33.107855 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 21:00:33 crc kubenswrapper[4669]: I1008 21:00:33.116273 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb75f170-ec61-4ea8-81c4-eb53321ee58f-config" (OuterVolumeSpecName: "config") pod "eb75f170-ec61-4ea8-81c4-eb53321ee58f" (UID: "eb75f170-ec61-4ea8-81c4-eb53321ee58f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:00:33 crc kubenswrapper[4669]: I1008 21:00:33.124722 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb75f170-ec61-4ea8-81c4-eb53321ee58f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "eb75f170-ec61-4ea8-81c4-eb53321ee58f" (UID: "eb75f170-ec61-4ea8-81c4-eb53321ee58f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:00:33 crc kubenswrapper[4669]: I1008 21:00:33.125714 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 21:00:33 crc kubenswrapper[4669]: I1008 21:00:33.156721 4669 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb75f170-ec61-4ea8-81c4-eb53321ee58f-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:33 crc kubenswrapper[4669]: I1008 21:00:33.156821 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb75f170-ec61-4ea8-81c4-eb53321ee58f-config\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:33 crc kubenswrapper[4669]: I1008 21:00:33.156856 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6nfl\" (UniqueName: \"kubernetes.io/projected/eb75f170-ec61-4ea8-81c4-eb53321ee58f-kube-api-access-d6nfl\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:33 crc kubenswrapper[4669]: I1008 21:00:33.156885 4669 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb75f170-ec61-4ea8-81c4-eb53321ee58f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:33 crc kubenswrapper[4669]: I1008 21:00:33.156899 4669 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eb75f170-ec61-4ea8-81c4-eb53321ee58f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:33 crc kubenswrapper[4669]: I1008 21:00:33.156910 4669 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb75f170-ec61-4ea8-81c4-eb53321ee58f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:33 crc kubenswrapper[4669]: I1008 21:00:33.265658 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rxss2"] Oct 08 21:00:33 crc kubenswrapper[4669]: W1008 21:00:33.286773 4669 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb94dc5a1_30ee_4ce3_807d_51f22cb32b5f.slice/crio-5e1532bd10db2990fd2e46ab57a43e54a7c919360ac302dd409ec148b306ed35 WatchSource:0}: Error finding container 5e1532bd10db2990fd2e46ab57a43e54a7c919360ac302dd409ec148b306ed35: Status 404 returned error can't find the container with id 5e1532bd10db2990fd2e46ab57a43e54a7c919360ac302dd409ec148b306ed35 Oct 08 21:00:33 crc kubenswrapper[4669]: I1008 21:00:33.352001 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 21:00:33 crc kubenswrapper[4669]: I1008 21:00:33.360066 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-574f4fd7c7-7zknn"] Oct 08 21:00:33 crc kubenswrapper[4669]: W1008 21:00:33.366866 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode462fc4e_635f_4e2e_88c0_43f1af0dc648.slice/crio-b44142857e977940082c8433b0ad224c252fbccc15b141b45cf6602e63297811 WatchSource:0}: Error finding container b44142857e977940082c8433b0ad224c252fbccc15b141b45cf6602e63297811: Status 404 returned error can't find the container with id b44142857e977940082c8433b0ad224c252fbccc15b141b45cf6602e63297811 Oct 08 21:00:33 crc kubenswrapper[4669]: I1008 21:00:33.371658 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-p2bl6"] Oct 08 21:00:33 crc kubenswrapper[4669]: I1008 21:00:33.419126 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rxss2" event={"ID":"b94dc5a1-30ee-4ce3-807d-51f22cb32b5f","Type":"ContainerStarted","Data":"5e1532bd10db2990fd2e46ab57a43e54a7c919360ac302dd409ec148b306ed35"} Oct 08 21:00:33 crc kubenswrapper[4669]: I1008 21:00:33.420561 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-p2bl6" 
event={"ID":"469707ec-e817-4d42-b406-6595799f6036","Type":"ContainerStarted","Data":"641ec32f5da7e530873f54f6d5fccc9f78786edfcd2a87dea594f240eb796da8"} Oct 08 21:00:33 crc kubenswrapper[4669]: I1008 21:00:33.422723 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e462fc4e-635f-4e2e-88c0-43f1af0dc648","Type":"ContainerStarted","Data":"b44142857e977940082c8433b0ad224c252fbccc15b141b45cf6602e63297811"} Oct 08 21:00:33 crc kubenswrapper[4669]: I1008 21:00:33.424083 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-574f4fd7c7-7zknn" event={"ID":"79cb9326-1639-44ff-973d-adbce6c8098b","Type":"ContainerStarted","Data":"059abaefb495ce0badb0de442e1bc90b77428a6ad8a1e6e32cb7659c41eb8f3b"} Oct 08 21:00:33 crc kubenswrapper[4669]: I1008 21:00:33.425940 4669 generic.go:334] "Generic (PLEG): container finished" podID="c58b2710-b460-48ba-9106-5928fbcff443" containerID="ad267f9e811f6afd4edfc8f565585960b9c3fe114aaa7d382dae3971cf064e71" exitCode=0 Oct 08 21:00:33 crc kubenswrapper[4669]: I1008 21:00:33.425993 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9c9f998c-7n5jz" event={"ID":"c58b2710-b460-48ba-9106-5928fbcff443","Type":"ContainerDied","Data":"ad267f9e811f6afd4edfc8f565585960b9c3fe114aaa7d382dae3971cf064e71"} Oct 08 21:00:33 crc kubenswrapper[4669]: I1008 21:00:33.426011 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9c9f998c-7n5jz" event={"ID":"c58b2710-b460-48ba-9106-5928fbcff443","Type":"ContainerStarted","Data":"de596c798341e64c364b3a0505c7f90f2f08364612d210c2c0ba81875583b4f7"} Oct 08 21:00:33 crc kubenswrapper[4669]: I1008 21:00:33.428917 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-fzvxk" event={"ID":"eb75f170-ec61-4ea8-81c4-eb53321ee58f","Type":"ContainerDied","Data":"3f2373ffcdf9fb7ba934e2a7283c72c656395baabc29085c04f6f0d634abbdaa"} Oct 08 21:00:33 crc kubenswrapper[4669]: 
I1008 21:00:33.428982 4669 scope.go:117] "RemoveContainer" containerID="1ea454be88024c6d62898fb63d36ca1c54941dc50fe44c9049bfe3cd983065a7" Oct 08 21:00:33 crc kubenswrapper[4669]: I1008 21:00:33.429013 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-fzvxk" Oct 08 21:00:33 crc kubenswrapper[4669]: I1008 21:00:33.494344 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-fzvxk"] Oct 08 21:00:33 crc kubenswrapper[4669]: I1008 21:00:33.501091 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-fzvxk"] Oct 08 21:00:33 crc kubenswrapper[4669]: I1008 21:00:33.629925 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-55cd7b4cc7-2w7vd"] Oct 08 21:00:33 crc kubenswrapper[4669]: W1008 21:00:33.641817 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc019d0a1_4d0f_40e8_8720_f20b74d33b4b.slice/crio-fa74df3317031d2aefab15b1a27751fb618e603643c06f9c4f9237ca96c6db27 WatchSource:0}: Error finding container fa74df3317031d2aefab15b1a27751fb618e603643c06f9c4f9237ca96c6db27: Status 404 returned error can't find the container with id fa74df3317031d2aefab15b1a27751fb618e603643c06f9c4f9237ca96c6db27 Oct 08 21:00:33 crc kubenswrapper[4669]: I1008 21:00:33.700255 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-jnx9v"] Oct 08 21:00:33 crc kubenswrapper[4669]: I1008 21:00:33.758397 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 21:00:33 crc kubenswrapper[4669]: I1008 21:00:33.859332 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-7n5jz" Oct 08 21:00:33 crc kubenswrapper[4669]: I1008 21:00:33.916727 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 21:00:33 crc kubenswrapper[4669]: W1008 21:00:33.925671 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode916b41d_d3e7_43b8_b550_8ad07a4cf147.slice/crio-d2b98603e2c3e0b7fb624d6dc3aa3b476b121f2153b3d6e061b3ccb5a879687c WatchSource:0}: Error finding container d2b98603e2c3e0b7fb624d6dc3aa3b476b121f2153b3d6e061b3ccb5a879687c: Status 404 returned error can't find the container with id d2b98603e2c3e0b7fb624d6dc3aa3b476b121f2153b3d6e061b3ccb5a879687c Oct 08 21:00:33 crc kubenswrapper[4669]: I1008 21:00:33.987265 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c58b2710-b460-48ba-9106-5928fbcff443-ovsdbserver-sb\") pod \"c58b2710-b460-48ba-9106-5928fbcff443\" (UID: \"c58b2710-b460-48ba-9106-5928fbcff443\") " Oct 08 21:00:33 crc kubenswrapper[4669]: I1008 21:00:33.987647 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c58b2710-b460-48ba-9106-5928fbcff443-config\") pod \"c58b2710-b460-48ba-9106-5928fbcff443\" (UID: \"c58b2710-b460-48ba-9106-5928fbcff443\") " Oct 08 21:00:33 crc kubenswrapper[4669]: I1008 21:00:33.987717 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c58b2710-b460-48ba-9106-5928fbcff443-ovsdbserver-nb\") pod \"c58b2710-b460-48ba-9106-5928fbcff443\" (UID: \"c58b2710-b460-48ba-9106-5928fbcff443\") " Oct 08 21:00:33 crc kubenswrapper[4669]: I1008 21:00:33.987767 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/c58b2710-b460-48ba-9106-5928fbcff443-dns-svc\") pod \"c58b2710-b460-48ba-9106-5928fbcff443\" (UID: \"c58b2710-b460-48ba-9106-5928fbcff443\") " Oct 08 21:00:33 crc kubenswrapper[4669]: I1008 21:00:33.987807 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c58b2710-b460-48ba-9106-5928fbcff443-dns-swift-storage-0\") pod \"c58b2710-b460-48ba-9106-5928fbcff443\" (UID: \"c58b2710-b460-48ba-9106-5928fbcff443\") " Oct 08 21:00:33 crc kubenswrapper[4669]: I1008 21:00:33.987863 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdfqv\" (UniqueName: \"kubernetes.io/projected/c58b2710-b460-48ba-9106-5928fbcff443-kube-api-access-pdfqv\") pod \"c58b2710-b460-48ba-9106-5928fbcff443\" (UID: \"c58b2710-b460-48ba-9106-5928fbcff443\") " Oct 08 21:00:33 crc kubenswrapper[4669]: I1008 21:00:33.992766 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c58b2710-b460-48ba-9106-5928fbcff443-kube-api-access-pdfqv" (OuterVolumeSpecName: "kube-api-access-pdfqv") pod "c58b2710-b460-48ba-9106-5928fbcff443" (UID: "c58b2710-b460-48ba-9106-5928fbcff443"). InnerVolumeSpecName "kube-api-access-pdfqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:00:34 crc kubenswrapper[4669]: I1008 21:00:34.013290 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c58b2710-b460-48ba-9106-5928fbcff443-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c58b2710-b460-48ba-9106-5928fbcff443" (UID: "c58b2710-b460-48ba-9106-5928fbcff443"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:00:34 crc kubenswrapper[4669]: I1008 21:00:34.016499 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c58b2710-b460-48ba-9106-5928fbcff443-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c58b2710-b460-48ba-9106-5928fbcff443" (UID: "c58b2710-b460-48ba-9106-5928fbcff443"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:00:34 crc kubenswrapper[4669]: I1008 21:00:34.020477 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c58b2710-b460-48ba-9106-5928fbcff443-config" (OuterVolumeSpecName: "config") pod "c58b2710-b460-48ba-9106-5928fbcff443" (UID: "c58b2710-b460-48ba-9106-5928fbcff443"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:00:34 crc kubenswrapper[4669]: I1008 21:00:34.025711 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c58b2710-b460-48ba-9106-5928fbcff443-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c58b2710-b460-48ba-9106-5928fbcff443" (UID: "c58b2710-b460-48ba-9106-5928fbcff443"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:00:34 crc kubenswrapper[4669]: I1008 21:00:34.030013 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c58b2710-b460-48ba-9106-5928fbcff443-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c58b2710-b460-48ba-9106-5928fbcff443" (UID: "c58b2710-b460-48ba-9106-5928fbcff443"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:00:34 crc kubenswrapper[4669]: I1008 21:00:34.089377 4669 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c58b2710-b460-48ba-9106-5928fbcff443-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:34 crc kubenswrapper[4669]: I1008 21:00:34.089408 4669 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c58b2710-b460-48ba-9106-5928fbcff443-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:34 crc kubenswrapper[4669]: I1008 21:00:34.089419 4669 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c58b2710-b460-48ba-9106-5928fbcff443-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:34 crc kubenswrapper[4669]: I1008 21:00:34.089428 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdfqv\" (UniqueName: \"kubernetes.io/projected/c58b2710-b460-48ba-9106-5928fbcff443-kube-api-access-pdfqv\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:34 crc kubenswrapper[4669]: I1008 21:00:34.089437 4669 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c58b2710-b460-48ba-9106-5928fbcff443-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:34 crc kubenswrapper[4669]: I1008 21:00:34.089447 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c58b2710-b460-48ba-9106-5928fbcff443-config\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:34 crc kubenswrapper[4669]: I1008 21:00:34.447056 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e916b41d-d3e7-43b8-b550-8ad07a4cf147","Type":"ContainerStarted","Data":"d2b98603e2c3e0b7fb624d6dc3aa3b476b121f2153b3d6e061b3ccb5a879687c"} Oct 08 21:00:34 crc 
kubenswrapper[4669]: I1008 21:00:34.449318 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55cd7b4cc7-2w7vd" event={"ID":"9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d","Type":"ContainerStarted","Data":"299206eb36cdea4f2b2fa5076bb7a9d61ed0a8314e4091f7f6abcdbb7d57f2e3"} Oct 08 21:00:34 crc kubenswrapper[4669]: I1008 21:00:34.456693 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-7n5jz" Oct 08 21:00:34 crc kubenswrapper[4669]: I1008 21:00:34.456722 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9c9f998c-7n5jz" event={"ID":"c58b2710-b460-48ba-9106-5928fbcff443","Type":"ContainerDied","Data":"de596c798341e64c364b3a0505c7f90f2f08364612d210c2c0ba81875583b4f7"} Oct 08 21:00:34 crc kubenswrapper[4669]: I1008 21:00:34.456778 4669 scope.go:117] "RemoveContainer" containerID="ad267f9e811f6afd4edfc8f565585960b9c3fe114aaa7d382dae3971cf064e71" Oct 08 21:00:34 crc kubenswrapper[4669]: I1008 21:00:34.466894 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"416fb1ca-1386-410b-aaaa-3e0ec724d461","Type":"ContainerStarted","Data":"5aabb969d569cdd3b4d065ae60447d269b35b26148e46ba0240d38f951a385b4"} Oct 08 21:00:34 crc kubenswrapper[4669]: I1008 21:00:34.473923 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rxss2" event={"ID":"b94dc5a1-30ee-4ce3-807d-51f22cb32b5f","Type":"ContainerStarted","Data":"0df0a205510b9f2322e2df5512943b74d26c823c6537d133620954f20dce5aaa"} Oct 08 21:00:34 crc kubenswrapper[4669]: I1008 21:00:34.475941 4669 generic.go:334] "Generic (PLEG): container finished" podID="c019d0a1-4d0f-40e8-8720-f20b74d33b4b" containerID="b47cc587e42db6dbeba24fc85dc034263b50f1225a77adfe823d6070f586d760" exitCode=0 Oct 08 21:00:34 crc kubenswrapper[4669]: I1008 21:00:34.475973 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-57c957c4ff-jnx9v" event={"ID":"c019d0a1-4d0f-40e8-8720-f20b74d33b4b","Type":"ContainerDied","Data":"b47cc587e42db6dbeba24fc85dc034263b50f1225a77adfe823d6070f586d760"} Oct 08 21:00:34 crc kubenswrapper[4669]: I1008 21:00:34.476002 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-jnx9v" event={"ID":"c019d0a1-4d0f-40e8-8720-f20b74d33b4b","Type":"ContainerStarted","Data":"fa74df3317031d2aefab15b1a27751fb618e603643c06f9c4f9237ca96c6db27"} Oct 08 21:00:34 crc kubenswrapper[4669]: I1008 21:00:34.530851 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-7n5jz"] Oct 08 21:00:34 crc kubenswrapper[4669]: I1008 21:00:34.542945 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-7n5jz"] Oct 08 21:00:34 crc kubenswrapper[4669]: I1008 21:00:34.545571 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-rxss2" podStartSLOduration=3.545557218 podStartE2EDuration="3.545557218s" podCreationTimestamp="2025-10-08 21:00:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:00:34.530191264 +0000 UTC m=+954.223001937" watchObservedRunningTime="2025-10-08 21:00:34.545557218 +0000 UTC m=+954.238367891" Oct 08 21:00:35 crc kubenswrapper[4669]: I1008 21:00:35.349388 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c58b2710-b460-48ba-9106-5928fbcff443" path="/var/lib/kubelet/pods/c58b2710-b460-48ba-9106-5928fbcff443/volumes" Oct 08 21:00:35 crc kubenswrapper[4669]: I1008 21:00:35.450197 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb75f170-ec61-4ea8-81c4-eb53321ee58f" path="/var/lib/kubelet/pods/eb75f170-ec61-4ea8-81c4-eb53321ee58f/volumes" Oct 08 21:00:35 crc kubenswrapper[4669]: I1008 21:00:35.548970 4669 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e916b41d-d3e7-43b8-b550-8ad07a4cf147","Type":"ContainerStarted","Data":"6377dae22d3bace1bbfbc65dcad9981cb966ed31d7efbec9c6688d170788e859"} Oct 08 21:00:35 crc kubenswrapper[4669]: I1008 21:00:35.573674 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"416fb1ca-1386-410b-aaaa-3e0ec724d461","Type":"ContainerStarted","Data":"a01369d88176acd40aeadc5ae3fac42075655456be3361fb7c10334bb342c44a"} Oct 08 21:00:37 crc kubenswrapper[4669]: I1008 21:00:37.309574 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 21:00:37 crc kubenswrapper[4669]: I1008 21:00:37.346882 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-574f4fd7c7-7zknn"] Oct 08 21:00:37 crc kubenswrapper[4669]: I1008 21:00:37.378348 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 21:00:37 crc kubenswrapper[4669]: I1008 21:00:37.398662 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6d68bc8f57-c6jv4"] Oct 08 21:00:37 crc kubenswrapper[4669]: E1008 21:00:37.399108 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb75f170-ec61-4ea8-81c4-eb53321ee58f" containerName="init" Oct 08 21:00:37 crc kubenswrapper[4669]: I1008 21:00:37.399130 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb75f170-ec61-4ea8-81c4-eb53321ee58f" containerName="init" Oct 08 21:00:37 crc kubenswrapper[4669]: E1008 21:00:37.399147 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c58b2710-b460-48ba-9106-5928fbcff443" containerName="init" Oct 08 21:00:37 crc kubenswrapper[4669]: I1008 21:00:37.399153 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="c58b2710-b460-48ba-9106-5928fbcff443" containerName="init" Oct 08 21:00:37 crc kubenswrapper[4669]: I1008 21:00:37.399324 4669 
memory_manager.go:354] "RemoveStaleState removing state" podUID="c58b2710-b460-48ba-9106-5928fbcff443" containerName="init" Oct 08 21:00:37 crc kubenswrapper[4669]: I1008 21:00:37.399361 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb75f170-ec61-4ea8-81c4-eb53321ee58f" containerName="init" Oct 08 21:00:37 crc kubenswrapper[4669]: I1008 21:00:37.400245 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6d68bc8f57-c6jv4" Oct 08 21:00:37 crc kubenswrapper[4669]: I1008 21:00:37.423297 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6d68bc8f57-c6jv4"] Oct 08 21:00:37 crc kubenswrapper[4669]: I1008 21:00:37.441670 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 21:00:37 crc kubenswrapper[4669]: I1008 21:00:37.516182 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/49676f30-cc4b-4229-b194-f28688f6da28-config-data\") pod \"horizon-6d68bc8f57-c6jv4\" (UID: \"49676f30-cc4b-4229-b194-f28688f6da28\") " pod="openstack/horizon-6d68bc8f57-c6jv4" Oct 08 21:00:37 crc kubenswrapper[4669]: I1008 21:00:37.516268 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/49676f30-cc4b-4229-b194-f28688f6da28-scripts\") pod \"horizon-6d68bc8f57-c6jv4\" (UID: \"49676f30-cc4b-4229-b194-f28688f6da28\") " pod="openstack/horizon-6d68bc8f57-c6jv4" Oct 08 21:00:37 crc kubenswrapper[4669]: I1008 21:00:37.516316 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/49676f30-cc4b-4229-b194-f28688f6da28-horizon-secret-key\") pod \"horizon-6d68bc8f57-c6jv4\" (UID: \"49676f30-cc4b-4229-b194-f28688f6da28\") " 
pod="openstack/horizon-6d68bc8f57-c6jv4" Oct 08 21:00:37 crc kubenswrapper[4669]: I1008 21:00:37.516407 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49676f30-cc4b-4229-b194-f28688f6da28-logs\") pod \"horizon-6d68bc8f57-c6jv4\" (UID: \"49676f30-cc4b-4229-b194-f28688f6da28\") " pod="openstack/horizon-6d68bc8f57-c6jv4" Oct 08 21:00:37 crc kubenswrapper[4669]: I1008 21:00:37.516452 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpnbd\" (UniqueName: \"kubernetes.io/projected/49676f30-cc4b-4229-b194-f28688f6da28-kube-api-access-kpnbd\") pod \"horizon-6d68bc8f57-c6jv4\" (UID: \"49676f30-cc4b-4229-b194-f28688f6da28\") " pod="openstack/horizon-6d68bc8f57-c6jv4" Oct 08 21:00:37 crc kubenswrapper[4669]: I1008 21:00:37.617851 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49676f30-cc4b-4229-b194-f28688f6da28-logs\") pod \"horizon-6d68bc8f57-c6jv4\" (UID: \"49676f30-cc4b-4229-b194-f28688f6da28\") " pod="openstack/horizon-6d68bc8f57-c6jv4" Oct 08 21:00:37 crc kubenswrapper[4669]: I1008 21:00:37.617945 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpnbd\" (UniqueName: \"kubernetes.io/projected/49676f30-cc4b-4229-b194-f28688f6da28-kube-api-access-kpnbd\") pod \"horizon-6d68bc8f57-c6jv4\" (UID: \"49676f30-cc4b-4229-b194-f28688f6da28\") " pod="openstack/horizon-6d68bc8f57-c6jv4" Oct 08 21:00:37 crc kubenswrapper[4669]: I1008 21:00:37.617975 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/49676f30-cc4b-4229-b194-f28688f6da28-config-data\") pod \"horizon-6d68bc8f57-c6jv4\" (UID: \"49676f30-cc4b-4229-b194-f28688f6da28\") " pod="openstack/horizon-6d68bc8f57-c6jv4" Oct 08 21:00:37 crc 
kubenswrapper[4669]: I1008 21:00:37.618029 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/49676f30-cc4b-4229-b194-f28688f6da28-scripts\") pod \"horizon-6d68bc8f57-c6jv4\" (UID: \"49676f30-cc4b-4229-b194-f28688f6da28\") " pod="openstack/horizon-6d68bc8f57-c6jv4" Oct 08 21:00:37 crc kubenswrapper[4669]: I1008 21:00:37.618093 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/49676f30-cc4b-4229-b194-f28688f6da28-horizon-secret-key\") pod \"horizon-6d68bc8f57-c6jv4\" (UID: \"49676f30-cc4b-4229-b194-f28688f6da28\") " pod="openstack/horizon-6d68bc8f57-c6jv4" Oct 08 21:00:37 crc kubenswrapper[4669]: I1008 21:00:37.618389 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49676f30-cc4b-4229-b194-f28688f6da28-logs\") pod \"horizon-6d68bc8f57-c6jv4\" (UID: \"49676f30-cc4b-4229-b194-f28688f6da28\") " pod="openstack/horizon-6d68bc8f57-c6jv4" Oct 08 21:00:37 crc kubenswrapper[4669]: I1008 21:00:37.618990 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/49676f30-cc4b-4229-b194-f28688f6da28-scripts\") pod \"horizon-6d68bc8f57-c6jv4\" (UID: \"49676f30-cc4b-4229-b194-f28688f6da28\") " pod="openstack/horizon-6d68bc8f57-c6jv4" Oct 08 21:00:37 crc kubenswrapper[4669]: I1008 21:00:37.619428 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/49676f30-cc4b-4229-b194-f28688f6da28-config-data\") pod \"horizon-6d68bc8f57-c6jv4\" (UID: \"49676f30-cc4b-4229-b194-f28688f6da28\") " pod="openstack/horizon-6d68bc8f57-c6jv4" Oct 08 21:00:37 crc kubenswrapper[4669]: I1008 21:00:37.624993 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/49676f30-cc4b-4229-b194-f28688f6da28-horizon-secret-key\") pod \"horizon-6d68bc8f57-c6jv4\" (UID: \"49676f30-cc4b-4229-b194-f28688f6da28\") " pod="openstack/horizon-6d68bc8f57-c6jv4" Oct 08 21:00:37 crc kubenswrapper[4669]: I1008 21:00:37.635922 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpnbd\" (UniqueName: \"kubernetes.io/projected/49676f30-cc4b-4229-b194-f28688f6da28-kube-api-access-kpnbd\") pod \"horizon-6d68bc8f57-c6jv4\" (UID: \"49676f30-cc4b-4229-b194-f28688f6da28\") " pod="openstack/horizon-6d68bc8f57-c6jv4" Oct 08 21:00:37 crc kubenswrapper[4669]: I1008 21:00:37.730016 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6d68bc8f57-c6jv4" Oct 08 21:00:38 crc kubenswrapper[4669]: I1008 21:00:38.551357 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-fb17-account-create-rc2jm"] Oct 08 21:00:38 crc kubenswrapper[4669]: I1008 21:00:38.553482 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-fb17-account-create-rc2jm" Oct 08 21:00:38 crc kubenswrapper[4669]: I1008 21:00:38.556957 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 08 21:00:38 crc kubenswrapper[4669]: I1008 21:00:38.566367 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-fb17-account-create-rc2jm"] Oct 08 21:00:38 crc kubenswrapper[4669]: I1008 21:00:38.647593 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-3920-account-create-sw6j7"] Oct 08 21:00:38 crc kubenswrapper[4669]: I1008 21:00:38.648950 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-3920-account-create-sw6j7" Oct 08 21:00:38 crc kubenswrapper[4669]: I1008 21:00:38.650443 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 08 21:00:38 crc kubenswrapper[4669]: I1008 21:00:38.655280 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-3920-account-create-sw6j7"] Oct 08 21:00:38 crc kubenswrapper[4669]: I1008 21:00:38.736961 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tdd4\" (UniqueName: \"kubernetes.io/projected/3dfca695-7bff-4cb6-abdf-6e20eb14485e-kube-api-access-9tdd4\") pod \"cinder-3920-account-create-sw6j7\" (UID: \"3dfca695-7bff-4cb6-abdf-6e20eb14485e\") " pod="openstack/cinder-3920-account-create-sw6j7" Oct 08 21:00:38 crc kubenswrapper[4669]: I1008 21:00:38.736997 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qttdq\" (UniqueName: \"kubernetes.io/projected/daca45a8-2ae8-4b87-9fc4-348099b37165-kube-api-access-qttdq\") pod \"barbican-fb17-account-create-rc2jm\" (UID: \"daca45a8-2ae8-4b87-9fc4-348099b37165\") " pod="openstack/barbican-fb17-account-create-rc2jm" Oct 08 21:00:38 crc kubenswrapper[4669]: I1008 21:00:38.838789 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tdd4\" (UniqueName: \"kubernetes.io/projected/3dfca695-7bff-4cb6-abdf-6e20eb14485e-kube-api-access-9tdd4\") pod \"cinder-3920-account-create-sw6j7\" (UID: \"3dfca695-7bff-4cb6-abdf-6e20eb14485e\") " pod="openstack/cinder-3920-account-create-sw6j7" Oct 08 21:00:38 crc kubenswrapper[4669]: I1008 21:00:38.838836 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qttdq\" (UniqueName: \"kubernetes.io/projected/daca45a8-2ae8-4b87-9fc4-348099b37165-kube-api-access-qttdq\") pod \"barbican-fb17-account-create-rc2jm\" (UID: 
\"daca45a8-2ae8-4b87-9fc4-348099b37165\") " pod="openstack/barbican-fb17-account-create-rc2jm" Oct 08 21:00:38 crc kubenswrapper[4669]: I1008 21:00:38.861496 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qttdq\" (UniqueName: \"kubernetes.io/projected/daca45a8-2ae8-4b87-9fc4-348099b37165-kube-api-access-qttdq\") pod \"barbican-fb17-account-create-rc2jm\" (UID: \"daca45a8-2ae8-4b87-9fc4-348099b37165\") " pod="openstack/barbican-fb17-account-create-rc2jm" Oct 08 21:00:38 crc kubenswrapper[4669]: I1008 21:00:38.863498 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tdd4\" (UniqueName: \"kubernetes.io/projected/3dfca695-7bff-4cb6-abdf-6e20eb14485e-kube-api-access-9tdd4\") pod \"cinder-3920-account-create-sw6j7\" (UID: \"3dfca695-7bff-4cb6-abdf-6e20eb14485e\") " pod="openstack/cinder-3920-account-create-sw6j7" Oct 08 21:00:38 crc kubenswrapper[4669]: I1008 21:00:38.875644 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-fb17-account-create-rc2jm" Oct 08 21:00:38 crc kubenswrapper[4669]: I1008 21:00:38.951914 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-82f4-account-create-z8lzz"] Oct 08 21:00:38 crc kubenswrapper[4669]: I1008 21:00:38.953173 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-82f4-account-create-z8lzz" Oct 08 21:00:38 crc kubenswrapper[4669]: I1008 21:00:38.955244 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 08 21:00:38 crc kubenswrapper[4669]: I1008 21:00:38.977303 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-3920-account-create-sw6j7" Oct 08 21:00:38 crc kubenswrapper[4669]: I1008 21:00:38.988920 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-82f4-account-create-z8lzz"] Oct 08 21:00:39 crc kubenswrapper[4669]: I1008 21:00:39.042605 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fmz7\" (UniqueName: \"kubernetes.io/projected/af60e370-8287-408b-af8c-fc0d5c19e37d-kube-api-access-2fmz7\") pod \"neutron-82f4-account-create-z8lzz\" (UID: \"af60e370-8287-408b-af8c-fc0d5c19e37d\") " pod="openstack/neutron-82f4-account-create-z8lzz" Oct 08 21:00:39 crc kubenswrapper[4669]: I1008 21:00:39.144361 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fmz7\" (UniqueName: \"kubernetes.io/projected/af60e370-8287-408b-af8c-fc0d5c19e37d-kube-api-access-2fmz7\") pod \"neutron-82f4-account-create-z8lzz\" (UID: \"af60e370-8287-408b-af8c-fc0d5c19e37d\") " pod="openstack/neutron-82f4-account-create-z8lzz" Oct 08 21:00:39 crc kubenswrapper[4669]: I1008 21:00:39.163436 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fmz7\" (UniqueName: \"kubernetes.io/projected/af60e370-8287-408b-af8c-fc0d5c19e37d-kube-api-access-2fmz7\") pod \"neutron-82f4-account-create-z8lzz\" (UID: \"af60e370-8287-408b-af8c-fc0d5c19e37d\") " pod="openstack/neutron-82f4-account-create-z8lzz" Oct 08 21:00:39 crc kubenswrapper[4669]: I1008 21:00:39.272349 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-82f4-account-create-z8lzz" Oct 08 21:00:39 crc kubenswrapper[4669]: I1008 21:00:39.492786 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-3920-account-create-sw6j7"] Oct 08 21:00:42 crc kubenswrapper[4669]: I1008 21:00:39.604747 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3920-account-create-sw6j7" event={"ID":"3dfca695-7bff-4cb6-abdf-6e20eb14485e","Type":"ContainerStarted","Data":"ce0656efa291ec21994dc2d10c0d8ebac79b0a288e5f3773ed25e719b89265a5"} Oct 08 21:00:42 crc kubenswrapper[4669]: I1008 21:00:39.609035 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-jnx9v" event={"ID":"c019d0a1-4d0f-40e8-8720-f20b74d33b4b","Type":"ContainerStarted","Data":"3d3dd5b2fe6065487b260b76f26d85eaa3bc45cd8ba4e63ec27cc827742c3cee"} Oct 08 21:00:42 crc kubenswrapper[4669]: I1008 21:00:39.627410 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-fb17-account-create-rc2jm"] Oct 08 21:00:42 crc kubenswrapper[4669]: I1008 21:00:39.744318 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6d68bc8f57-c6jv4"] Oct 08 21:00:42 crc kubenswrapper[4669]: W1008 21:00:39.763538 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49676f30_cc4b_4229_b194_f28688f6da28.slice/crio-303a41634ffa2f8ee63d0298844f10defdbafcd62092a5573811d4e62796c1b6 WatchSource:0}: Error finding container 303a41634ffa2f8ee63d0298844f10defdbafcd62092a5573811d4e62796c1b6: Status 404 returned error can't find the container with id 303a41634ffa2f8ee63d0298844f10defdbafcd62092a5573811d4e62796c1b6 Oct 08 21:00:42 crc kubenswrapper[4669]: I1008 21:00:40.618655 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d68bc8f57-c6jv4" 
event={"ID":"49676f30-cc4b-4229-b194-f28688f6da28","Type":"ContainerStarted","Data":"303a41634ffa2f8ee63d0298844f10defdbafcd62092a5573811d4e62796c1b6"} Oct 08 21:00:42 crc kubenswrapper[4669]: I1008 21:00:40.620427 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-fb17-account-create-rc2jm" event={"ID":"daca45a8-2ae8-4b87-9fc4-348099b37165","Type":"ContainerStarted","Data":"d6eb62fdbc166e2ef3a53900a10ac4c9807ec3435a97c74e8c705dc557622dfa"} Oct 08 21:00:42 crc kubenswrapper[4669]: I1008 21:00:41.634840 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"416fb1ca-1386-410b-aaaa-3e0ec724d461","Type":"ContainerStarted","Data":"d0f0c289cf419d449fc31e84ea59543864b20a81443703c1a426746b965e2087"} Oct 08 21:00:42 crc kubenswrapper[4669]: I1008 21:00:41.634954 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="416fb1ca-1386-410b-aaaa-3e0ec724d461" containerName="glance-log" containerID="cri-o://a01369d88176acd40aeadc5ae3fac42075655456be3361fb7c10334bb342c44a" gracePeriod=30 Oct 08 21:00:42 crc kubenswrapper[4669]: I1008 21:00:41.634979 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="416fb1ca-1386-410b-aaaa-3e0ec724d461" containerName="glance-httpd" containerID="cri-o://d0f0c289cf419d449fc31e84ea59543864b20a81443703c1a426746b965e2087" gracePeriod=30 Oct 08 21:00:42 crc kubenswrapper[4669]: I1008 21:00:41.640600 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e916b41d-d3e7-43b8-b550-8ad07a4cf147","Type":"ContainerStarted","Data":"85aa18c4103edda5e2844754f22a43f2b0793896677491c5cc684acbb39ade80"} Oct 08 21:00:42 crc kubenswrapper[4669]: I1008 21:00:41.640770 4669 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/glance-default-external-api-0" podUID="e916b41d-d3e7-43b8-b550-8ad07a4cf147" containerName="glance-log" containerID="cri-o://6377dae22d3bace1bbfbc65dcad9981cb966ed31d7efbec9c6688d170788e859" gracePeriod=30 Oct 08 21:00:42 crc kubenswrapper[4669]: I1008 21:00:41.640829 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e916b41d-d3e7-43b8-b550-8ad07a4cf147" containerName="glance-httpd" containerID="cri-o://85aa18c4103edda5e2844754f22a43f2b0793896677491c5cc684acbb39ade80" gracePeriod=30 Oct 08 21:00:42 crc kubenswrapper[4669]: I1008 21:00:41.644113 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3920-account-create-sw6j7" event={"ID":"3dfca695-7bff-4cb6-abdf-6e20eb14485e","Type":"ContainerStarted","Data":"ec1aa2fc9ef7f9a3d5ec48f37c550f4cb060218d8d5d3d957a51d9e554fa68c0"} Oct 08 21:00:42 crc kubenswrapper[4669]: I1008 21:00:41.646679 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-fb17-account-create-rc2jm" event={"ID":"daca45a8-2ae8-4b87-9fc4-348099b37165","Type":"ContainerStarted","Data":"77ea3d9d9939f0e2434182cfc13c1f6fa00dc35709c6c5503958896ff06a05c5"} Oct 08 21:00:42 crc kubenswrapper[4669]: I1008 21:00:41.646719 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57c957c4ff-jnx9v" Oct 08 21:00:42 crc kubenswrapper[4669]: I1008 21:00:41.665141 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=9.665115989 podStartE2EDuration="9.665115989s" podCreationTimestamp="2025-10-08 21:00:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:00:41.65788772 +0000 UTC m=+961.350698403" watchObservedRunningTime="2025-10-08 21:00:41.665115989 +0000 UTC m=+961.357926682" Oct 08 21:00:42 crc 
kubenswrapper[4669]: I1008 21:00:41.684971 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57c957c4ff-jnx9v" podStartSLOduration=9.684956505 podStartE2EDuration="9.684956505s" podCreationTimestamp="2025-10-08 21:00:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:00:41.67820989 +0000 UTC m=+961.371020573" watchObservedRunningTime="2025-10-08 21:00:41.684956505 +0000 UTC m=+961.377767178" Oct 08 21:00:42 crc kubenswrapper[4669]: I1008 21:00:41.700430 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-fb17-account-create-rc2jm" podStartSLOduration=3.700414462 podStartE2EDuration="3.700414462s" podCreationTimestamp="2025-10-08 21:00:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:00:41.693311626 +0000 UTC m=+961.386122299" watchObservedRunningTime="2025-10-08 21:00:41.700414462 +0000 UTC m=+961.393225135" Oct 08 21:00:42 crc kubenswrapper[4669]: I1008 21:00:41.720928 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=9.720907667 podStartE2EDuration="9.720907667s" podCreationTimestamp="2025-10-08 21:00:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:00:41.712403262 +0000 UTC m=+961.405213935" watchObservedRunningTime="2025-10-08 21:00:41.720907667 +0000 UTC m=+961.413718340" Oct 08 21:00:42 crc kubenswrapper[4669]: I1008 21:00:41.740696 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-3920-account-create-sw6j7" podStartSLOduration=3.740670051 podStartE2EDuration="3.740670051s" podCreationTimestamp="2025-10-08 21:00:38 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:00:41.727339794 +0000 UTC m=+961.420150457" watchObservedRunningTime="2025-10-08 21:00:41.740670051 +0000 UTC m=+961.433480724" Oct 08 21:00:42 crc kubenswrapper[4669]: I1008 21:00:42.555863 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-82f4-account-create-z8lzz"] Oct 08 21:00:42 crc kubenswrapper[4669]: I1008 21:00:42.656214 4669 generic.go:334] "Generic (PLEG): container finished" podID="daca45a8-2ae8-4b87-9fc4-348099b37165" containerID="77ea3d9d9939f0e2434182cfc13c1f6fa00dc35709c6c5503958896ff06a05c5" exitCode=0 Oct 08 21:00:42 crc kubenswrapper[4669]: I1008 21:00:42.656291 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-fb17-account-create-rc2jm" event={"ID":"daca45a8-2ae8-4b87-9fc4-348099b37165","Type":"ContainerDied","Data":"77ea3d9d9939f0e2434182cfc13c1f6fa00dc35709c6c5503958896ff06a05c5"} Oct 08 21:00:42 crc kubenswrapper[4669]: I1008 21:00:42.661180 4669 generic.go:334] "Generic (PLEG): container finished" podID="416fb1ca-1386-410b-aaaa-3e0ec724d461" containerID="d0f0c289cf419d449fc31e84ea59543864b20a81443703c1a426746b965e2087" exitCode=0 Oct 08 21:00:42 crc kubenswrapper[4669]: I1008 21:00:42.661212 4669 generic.go:334] "Generic (PLEG): container finished" podID="416fb1ca-1386-410b-aaaa-3e0ec724d461" containerID="a01369d88176acd40aeadc5ae3fac42075655456be3361fb7c10334bb342c44a" exitCode=143 Oct 08 21:00:42 crc kubenswrapper[4669]: I1008 21:00:42.661278 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"416fb1ca-1386-410b-aaaa-3e0ec724d461","Type":"ContainerDied","Data":"d0f0c289cf419d449fc31e84ea59543864b20a81443703c1a426746b965e2087"} Oct 08 21:00:42 crc kubenswrapper[4669]: I1008 21:00:42.661312 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"416fb1ca-1386-410b-aaaa-3e0ec724d461","Type":"ContainerDied","Data":"a01369d88176acd40aeadc5ae3fac42075655456be3361fb7c10334bb342c44a"} Oct 08 21:00:42 crc kubenswrapper[4669]: I1008 21:00:42.664157 4669 generic.go:334] "Generic (PLEG): container finished" podID="e916b41d-d3e7-43b8-b550-8ad07a4cf147" containerID="85aa18c4103edda5e2844754f22a43f2b0793896677491c5cc684acbb39ade80" exitCode=0 Oct 08 21:00:42 crc kubenswrapper[4669]: I1008 21:00:42.664186 4669 generic.go:334] "Generic (PLEG): container finished" podID="e916b41d-d3e7-43b8-b550-8ad07a4cf147" containerID="6377dae22d3bace1bbfbc65dcad9981cb966ed31d7efbec9c6688d170788e859" exitCode=143 Oct 08 21:00:42 crc kubenswrapper[4669]: I1008 21:00:42.664212 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e916b41d-d3e7-43b8-b550-8ad07a4cf147","Type":"ContainerDied","Data":"85aa18c4103edda5e2844754f22a43f2b0793896677491c5cc684acbb39ade80"} Oct 08 21:00:42 crc kubenswrapper[4669]: I1008 21:00:42.664262 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e916b41d-d3e7-43b8-b550-8ad07a4cf147","Type":"ContainerDied","Data":"6377dae22d3bace1bbfbc65dcad9981cb966ed31d7efbec9c6688d170788e859"} Oct 08 21:00:42 crc kubenswrapper[4669]: I1008 21:00:42.666539 4669 generic.go:334] "Generic (PLEG): container finished" podID="3dfca695-7bff-4cb6-abdf-6e20eb14485e" containerID="ec1aa2fc9ef7f9a3d5ec48f37c550f4cb060218d8d5d3d957a51d9e554fa68c0" exitCode=0 Oct 08 21:00:42 crc kubenswrapper[4669]: I1008 21:00:42.666678 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3920-account-create-sw6j7" event={"ID":"3dfca695-7bff-4cb6-abdf-6e20eb14485e","Type":"ContainerDied","Data":"ec1aa2fc9ef7f9a3d5ec48f37c550f4cb060218d8d5d3d957a51d9e554fa68c0"} Oct 08 21:00:43 crc kubenswrapper[4669]: I1008 21:00:43.677312 4669 generic.go:334] "Generic 
(PLEG): container finished" podID="b94dc5a1-30ee-4ce3-807d-51f22cb32b5f" containerID="0df0a205510b9f2322e2df5512943b74d26c823c6537d133620954f20dce5aaa" exitCode=0 Oct 08 21:00:43 crc kubenswrapper[4669]: I1008 21:00:43.677400 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rxss2" event={"ID":"b94dc5a1-30ee-4ce3-807d-51f22cb32b5f","Type":"ContainerDied","Data":"0df0a205510b9f2322e2df5512943b74d26c823c6537d133620954f20dce5aaa"} Oct 08 21:00:44 crc kubenswrapper[4669]: I1008 21:00:44.038941 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-55cd7b4cc7-2w7vd"] Oct 08 21:00:44 crc kubenswrapper[4669]: I1008 21:00:44.077612 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6cd548c4f4-74w5s"] Oct 08 21:00:44 crc kubenswrapper[4669]: I1008 21:00:44.080031 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6cd548c4f4-74w5s" Oct 08 21:00:44 crc kubenswrapper[4669]: I1008 21:00:44.083107 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Oct 08 21:00:44 crc kubenswrapper[4669]: I1008 21:00:44.093548 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6cd548c4f4-74w5s"] Oct 08 21:00:44 crc kubenswrapper[4669]: I1008 21:00:44.157423 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6d68bc8f57-c6jv4"] Oct 08 21:00:44 crc kubenswrapper[4669]: I1008 21:00:44.199178 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5fc57f5668-z5dzm"] Oct 08 21:00:44 crc kubenswrapper[4669]: I1008 21:00:44.200652 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5fc57f5668-z5dzm" Oct 08 21:00:44 crc kubenswrapper[4669]: I1008 21:00:44.211791 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5fc57f5668-z5dzm"] Oct 08 21:00:44 crc kubenswrapper[4669]: I1008 21:00:44.237082 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f67695c6-cc78-4e93-86e4-34b030405e0e-config-data\") pod \"horizon-6cd548c4f4-74w5s\" (UID: \"f67695c6-cc78-4e93-86e4-34b030405e0e\") " pod="openstack/horizon-6cd548c4f4-74w5s" Oct 08 21:00:44 crc kubenswrapper[4669]: I1008 21:00:44.237182 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f67695c6-cc78-4e93-86e4-34b030405e0e-horizon-secret-key\") pod \"horizon-6cd548c4f4-74w5s\" (UID: \"f67695c6-cc78-4e93-86e4-34b030405e0e\") " pod="openstack/horizon-6cd548c4f4-74w5s" Oct 08 21:00:44 crc kubenswrapper[4669]: I1008 21:00:44.237226 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f67695c6-cc78-4e93-86e4-34b030405e0e-scripts\") pod \"horizon-6cd548c4f4-74w5s\" (UID: \"f67695c6-cc78-4e93-86e4-34b030405e0e\") " pod="openstack/horizon-6cd548c4f4-74w5s" Oct 08 21:00:44 crc kubenswrapper[4669]: I1008 21:00:44.237260 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f67695c6-cc78-4e93-86e4-34b030405e0e-combined-ca-bundle\") pod \"horizon-6cd548c4f4-74w5s\" (UID: \"f67695c6-cc78-4e93-86e4-34b030405e0e\") " pod="openstack/horizon-6cd548c4f4-74w5s" Oct 08 21:00:44 crc kubenswrapper[4669]: I1008 21:00:44.237501 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-jlgkr\" (UniqueName: \"kubernetes.io/projected/f67695c6-cc78-4e93-86e4-34b030405e0e-kube-api-access-jlgkr\") pod \"horizon-6cd548c4f4-74w5s\" (UID: \"f67695c6-cc78-4e93-86e4-34b030405e0e\") " pod="openstack/horizon-6cd548c4f4-74w5s" Oct 08 21:00:44 crc kubenswrapper[4669]: I1008 21:00:44.237604 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f67695c6-cc78-4e93-86e4-34b030405e0e-logs\") pod \"horizon-6cd548c4f4-74w5s\" (UID: \"f67695c6-cc78-4e93-86e4-34b030405e0e\") " pod="openstack/horizon-6cd548c4f4-74w5s" Oct 08 21:00:44 crc kubenswrapper[4669]: I1008 21:00:44.237666 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f67695c6-cc78-4e93-86e4-34b030405e0e-horizon-tls-certs\") pod \"horizon-6cd548c4f4-74w5s\" (UID: \"f67695c6-cc78-4e93-86e4-34b030405e0e\") " pod="openstack/horizon-6cd548c4f4-74w5s" Oct 08 21:00:44 crc kubenswrapper[4669]: I1008 21:00:44.338790 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43e0f642-1a58-481b-8347-b4d29176ddc5-logs\") pod \"horizon-5fc57f5668-z5dzm\" (UID: \"43e0f642-1a58-481b-8347-b4d29176ddc5\") " pod="openstack/horizon-5fc57f5668-z5dzm" Oct 08 21:00:44 crc kubenswrapper[4669]: I1008 21:00:44.338832 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43e0f642-1a58-481b-8347-b4d29176ddc5-combined-ca-bundle\") pod \"horizon-5fc57f5668-z5dzm\" (UID: \"43e0f642-1a58-481b-8347-b4d29176ddc5\") " pod="openstack/horizon-5fc57f5668-z5dzm" Oct 08 21:00:44 crc kubenswrapper[4669]: I1008 21:00:44.338857 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/43e0f642-1a58-481b-8347-b4d29176ddc5-horizon-tls-certs\") pod \"horizon-5fc57f5668-z5dzm\" (UID: \"43e0f642-1a58-481b-8347-b4d29176ddc5\") " pod="openstack/horizon-5fc57f5668-z5dzm" Oct 08 21:00:44 crc kubenswrapper[4669]: I1008 21:00:44.338877 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/43e0f642-1a58-481b-8347-b4d29176ddc5-scripts\") pod \"horizon-5fc57f5668-z5dzm\" (UID: \"43e0f642-1a58-481b-8347-b4d29176ddc5\") " pod="openstack/horizon-5fc57f5668-z5dzm" Oct 08 21:00:44 crc kubenswrapper[4669]: I1008 21:00:44.338917 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/43e0f642-1a58-481b-8347-b4d29176ddc5-config-data\") pod \"horizon-5fc57f5668-z5dzm\" (UID: \"43e0f642-1a58-481b-8347-b4d29176ddc5\") " pod="openstack/horizon-5fc57f5668-z5dzm" Oct 08 21:00:44 crc kubenswrapper[4669]: I1008 21:00:44.338943 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f67695c6-cc78-4e93-86e4-34b030405e0e-config-data\") pod \"horizon-6cd548c4f4-74w5s\" (UID: \"f67695c6-cc78-4e93-86e4-34b030405e0e\") " pod="openstack/horizon-6cd548c4f4-74w5s" Oct 08 21:00:44 crc kubenswrapper[4669]: I1008 21:00:44.339145 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f67695c6-cc78-4e93-86e4-34b030405e0e-horizon-secret-key\") pod \"horizon-6cd548c4f4-74w5s\" (UID: \"f67695c6-cc78-4e93-86e4-34b030405e0e\") " pod="openstack/horizon-6cd548c4f4-74w5s" Oct 08 21:00:44 crc kubenswrapper[4669]: I1008 21:00:44.339331 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/f67695c6-cc78-4e93-86e4-34b030405e0e-scripts\") pod \"horizon-6cd548c4f4-74w5s\" (UID: \"f67695c6-cc78-4e93-86e4-34b030405e0e\") " pod="openstack/horizon-6cd548c4f4-74w5s" Oct 08 21:00:44 crc kubenswrapper[4669]: I1008 21:00:44.340003 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f67695c6-cc78-4e93-86e4-34b030405e0e-config-data\") pod \"horizon-6cd548c4f4-74w5s\" (UID: \"f67695c6-cc78-4e93-86e4-34b030405e0e\") " pod="openstack/horizon-6cd548c4f4-74w5s" Oct 08 21:00:44 crc kubenswrapper[4669]: I1008 21:00:44.340233 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f67695c6-cc78-4e93-86e4-34b030405e0e-scripts\") pod \"horizon-6cd548c4f4-74w5s\" (UID: \"f67695c6-cc78-4e93-86e4-34b030405e0e\") " pod="openstack/horizon-6cd548c4f4-74w5s" Oct 08 21:00:44 crc kubenswrapper[4669]: I1008 21:00:44.340345 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/43e0f642-1a58-481b-8347-b4d29176ddc5-horizon-secret-key\") pod \"horizon-5fc57f5668-z5dzm\" (UID: \"43e0f642-1a58-481b-8347-b4d29176ddc5\") " pod="openstack/horizon-5fc57f5668-z5dzm" Oct 08 21:00:44 crc kubenswrapper[4669]: I1008 21:00:44.340407 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5xt6\" (UniqueName: \"kubernetes.io/projected/43e0f642-1a58-481b-8347-b4d29176ddc5-kube-api-access-c5xt6\") pod \"horizon-5fc57f5668-z5dzm\" (UID: \"43e0f642-1a58-481b-8347-b4d29176ddc5\") " pod="openstack/horizon-5fc57f5668-z5dzm" Oct 08 21:00:44 crc kubenswrapper[4669]: I1008 21:00:44.340578 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f67695c6-cc78-4e93-86e4-34b030405e0e-combined-ca-bundle\") pod \"horizon-6cd548c4f4-74w5s\" (UID: \"f67695c6-cc78-4e93-86e4-34b030405e0e\") " pod="openstack/horizon-6cd548c4f4-74w5s" Oct 08 21:00:44 crc kubenswrapper[4669]: I1008 21:00:44.341092 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlgkr\" (UniqueName: \"kubernetes.io/projected/f67695c6-cc78-4e93-86e4-34b030405e0e-kube-api-access-jlgkr\") pod \"horizon-6cd548c4f4-74w5s\" (UID: \"f67695c6-cc78-4e93-86e4-34b030405e0e\") " pod="openstack/horizon-6cd548c4f4-74w5s" Oct 08 21:00:44 crc kubenswrapper[4669]: I1008 21:00:44.341205 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f67695c6-cc78-4e93-86e4-34b030405e0e-logs\") pod \"horizon-6cd548c4f4-74w5s\" (UID: \"f67695c6-cc78-4e93-86e4-34b030405e0e\") " pod="openstack/horizon-6cd548c4f4-74w5s" Oct 08 21:00:44 crc kubenswrapper[4669]: I1008 21:00:44.341257 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f67695c6-cc78-4e93-86e4-34b030405e0e-horizon-tls-certs\") pod \"horizon-6cd548c4f4-74w5s\" (UID: \"f67695c6-cc78-4e93-86e4-34b030405e0e\") " pod="openstack/horizon-6cd548c4f4-74w5s" Oct 08 21:00:44 crc kubenswrapper[4669]: I1008 21:00:44.341583 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f67695c6-cc78-4e93-86e4-34b030405e0e-logs\") pod \"horizon-6cd548c4f4-74w5s\" (UID: \"f67695c6-cc78-4e93-86e4-34b030405e0e\") " pod="openstack/horizon-6cd548c4f4-74w5s" Oct 08 21:00:44 crc kubenswrapper[4669]: I1008 21:00:44.344415 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f67695c6-cc78-4e93-86e4-34b030405e0e-horizon-tls-certs\") pod \"horizon-6cd548c4f4-74w5s\" (UID: 
\"f67695c6-cc78-4e93-86e4-34b030405e0e\") " pod="openstack/horizon-6cd548c4f4-74w5s" Oct 08 21:00:44 crc kubenswrapper[4669]: I1008 21:00:44.344647 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f67695c6-cc78-4e93-86e4-34b030405e0e-horizon-secret-key\") pod \"horizon-6cd548c4f4-74w5s\" (UID: \"f67695c6-cc78-4e93-86e4-34b030405e0e\") " pod="openstack/horizon-6cd548c4f4-74w5s" Oct 08 21:00:44 crc kubenswrapper[4669]: I1008 21:00:44.354046 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f67695c6-cc78-4e93-86e4-34b030405e0e-combined-ca-bundle\") pod \"horizon-6cd548c4f4-74w5s\" (UID: \"f67695c6-cc78-4e93-86e4-34b030405e0e\") " pod="openstack/horizon-6cd548c4f4-74w5s" Oct 08 21:00:44 crc kubenswrapper[4669]: I1008 21:00:44.366353 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlgkr\" (UniqueName: \"kubernetes.io/projected/f67695c6-cc78-4e93-86e4-34b030405e0e-kube-api-access-jlgkr\") pod \"horizon-6cd548c4f4-74w5s\" (UID: \"f67695c6-cc78-4e93-86e4-34b030405e0e\") " pod="openstack/horizon-6cd548c4f4-74w5s" Oct 08 21:00:44 crc kubenswrapper[4669]: I1008 21:00:44.404691 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6cd548c4f4-74w5s" Oct 08 21:00:44 crc kubenswrapper[4669]: I1008 21:00:44.442917 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/43e0f642-1a58-481b-8347-b4d29176ddc5-horizon-secret-key\") pod \"horizon-5fc57f5668-z5dzm\" (UID: \"43e0f642-1a58-481b-8347-b4d29176ddc5\") " pod="openstack/horizon-5fc57f5668-z5dzm" Oct 08 21:00:44 crc kubenswrapper[4669]: I1008 21:00:44.442985 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5xt6\" (UniqueName: \"kubernetes.io/projected/43e0f642-1a58-481b-8347-b4d29176ddc5-kube-api-access-c5xt6\") pod \"horizon-5fc57f5668-z5dzm\" (UID: \"43e0f642-1a58-481b-8347-b4d29176ddc5\") " pod="openstack/horizon-5fc57f5668-z5dzm" Oct 08 21:00:44 crc kubenswrapper[4669]: I1008 21:00:44.443068 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43e0f642-1a58-481b-8347-b4d29176ddc5-logs\") pod \"horizon-5fc57f5668-z5dzm\" (UID: \"43e0f642-1a58-481b-8347-b4d29176ddc5\") " pod="openstack/horizon-5fc57f5668-z5dzm" Oct 08 21:00:44 crc kubenswrapper[4669]: I1008 21:00:44.443086 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43e0f642-1a58-481b-8347-b4d29176ddc5-combined-ca-bundle\") pod \"horizon-5fc57f5668-z5dzm\" (UID: \"43e0f642-1a58-481b-8347-b4d29176ddc5\") " pod="openstack/horizon-5fc57f5668-z5dzm" Oct 08 21:00:44 crc kubenswrapper[4669]: I1008 21:00:44.443107 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/43e0f642-1a58-481b-8347-b4d29176ddc5-horizon-tls-certs\") pod \"horizon-5fc57f5668-z5dzm\" (UID: \"43e0f642-1a58-481b-8347-b4d29176ddc5\") " pod="openstack/horizon-5fc57f5668-z5dzm" Oct 08 21:00:44 crc 
kubenswrapper[4669]: I1008 21:00:44.443128 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/43e0f642-1a58-481b-8347-b4d29176ddc5-scripts\") pod \"horizon-5fc57f5668-z5dzm\" (UID: \"43e0f642-1a58-481b-8347-b4d29176ddc5\") " pod="openstack/horizon-5fc57f5668-z5dzm" Oct 08 21:00:44 crc kubenswrapper[4669]: I1008 21:00:44.443142 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/43e0f642-1a58-481b-8347-b4d29176ddc5-config-data\") pod \"horizon-5fc57f5668-z5dzm\" (UID: \"43e0f642-1a58-481b-8347-b4d29176ddc5\") " pod="openstack/horizon-5fc57f5668-z5dzm" Oct 08 21:00:44 crc kubenswrapper[4669]: I1008 21:00:44.444261 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/43e0f642-1a58-481b-8347-b4d29176ddc5-config-data\") pod \"horizon-5fc57f5668-z5dzm\" (UID: \"43e0f642-1a58-481b-8347-b4d29176ddc5\") " pod="openstack/horizon-5fc57f5668-z5dzm" Oct 08 21:00:44 crc kubenswrapper[4669]: I1008 21:00:44.445463 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/43e0f642-1a58-481b-8347-b4d29176ddc5-scripts\") pod \"horizon-5fc57f5668-z5dzm\" (UID: \"43e0f642-1a58-481b-8347-b4d29176ddc5\") " pod="openstack/horizon-5fc57f5668-z5dzm" Oct 08 21:00:44 crc kubenswrapper[4669]: I1008 21:00:44.445551 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43e0f642-1a58-481b-8347-b4d29176ddc5-logs\") pod \"horizon-5fc57f5668-z5dzm\" (UID: \"43e0f642-1a58-481b-8347-b4d29176ddc5\") " pod="openstack/horizon-5fc57f5668-z5dzm" Oct 08 21:00:44 crc kubenswrapper[4669]: I1008 21:00:44.447785 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/43e0f642-1a58-481b-8347-b4d29176ddc5-horizon-secret-key\") pod \"horizon-5fc57f5668-z5dzm\" (UID: \"43e0f642-1a58-481b-8347-b4d29176ddc5\") " pod="openstack/horizon-5fc57f5668-z5dzm" Oct 08 21:00:44 crc kubenswrapper[4669]: I1008 21:00:44.448765 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43e0f642-1a58-481b-8347-b4d29176ddc5-combined-ca-bundle\") pod \"horizon-5fc57f5668-z5dzm\" (UID: \"43e0f642-1a58-481b-8347-b4d29176ddc5\") " pod="openstack/horizon-5fc57f5668-z5dzm" Oct 08 21:00:44 crc kubenswrapper[4669]: I1008 21:00:44.450726 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/43e0f642-1a58-481b-8347-b4d29176ddc5-horizon-tls-certs\") pod \"horizon-5fc57f5668-z5dzm\" (UID: \"43e0f642-1a58-481b-8347-b4d29176ddc5\") " pod="openstack/horizon-5fc57f5668-z5dzm" Oct 08 21:00:44 crc kubenswrapper[4669]: I1008 21:00:44.463433 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5xt6\" (UniqueName: \"kubernetes.io/projected/43e0f642-1a58-481b-8347-b4d29176ddc5-kube-api-access-c5xt6\") pod \"horizon-5fc57f5668-z5dzm\" (UID: \"43e0f642-1a58-481b-8347-b4d29176ddc5\") " pod="openstack/horizon-5fc57f5668-z5dzm" Oct 08 21:00:44 crc kubenswrapper[4669]: I1008 21:00:44.522183 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5fc57f5668-z5dzm" Oct 08 21:00:47 crc kubenswrapper[4669]: I1008 21:00:47.791737 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57c957c4ff-jnx9v" Oct 08 21:00:47 crc kubenswrapper[4669]: I1008 21:00:47.841821 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-dzpfw"] Oct 08 21:00:47 crc kubenswrapper[4669]: I1008 21:00:47.842425 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d5b6d6b67-dzpfw" podUID="9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd" containerName="dnsmasq-dns" containerID="cri-o://53d08d1ee28cd37c456c3a05a6d5b23f416740de7750be9af47c44141d09a776" gracePeriod=10 Oct 08 21:00:48 crc kubenswrapper[4669]: I1008 21:00:48.738287 4669 generic.go:334] "Generic (PLEG): container finished" podID="9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd" containerID="53d08d1ee28cd37c456c3a05a6d5b23f416740de7750be9af47c44141d09a776" exitCode=0 Oct 08 21:00:48 crc kubenswrapper[4669]: I1008 21:00:48.738323 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-dzpfw" event={"ID":"9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd","Type":"ContainerDied","Data":"53d08d1ee28cd37c456c3a05a6d5b23f416740de7750be9af47c44141d09a776"} Oct 08 21:00:49 crc kubenswrapper[4669]: E1008 21:00:49.295776 4669 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Oct 08 21:00:49 crc kubenswrapper[4669]: E1008 21:00:49.295942 4669 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n697hc9h66h595hb7h5fdh564h74h5d6h649h5b5h557h65ch56h7chc6hbfh58bhdfhc7h65dh5cch86h97h59bh84h589h5bfh5dbhd9h55bh5b6q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8npbw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-574f4fd7c7-7zknn_openstack(79cb9326-1639-44ff-973d-adbce6c8098b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 08 21:00:49 crc kubenswrapper[4669]: E1008 
21:00:49.299993 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-574f4fd7c7-7zknn" podUID="79cb9326-1639-44ff-973d-adbce6c8098b" Oct 08 21:00:49 crc kubenswrapper[4669]: I1008 21:00:49.549986 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6d5b6d6b67-dzpfw" podUID="9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: connect: connection refused" Oct 08 21:00:50 crc kubenswrapper[4669]: E1008 21:00:50.887202 4669 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Oct 08 21:00:50 crc kubenswrapper[4669]: E1008 21:00:50.887420 4669 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cvjx2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
placement-db-sync-p2bl6_openstack(469707ec-e817-4d42-b406-6595799f6036): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 08 21:00:50 crc kubenswrapper[4669]: E1008 21:00:50.888841 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-p2bl6" podUID="469707ec-e817-4d42-b406-6595799f6036" Oct 08 21:00:50 crc kubenswrapper[4669]: E1008 21:00:50.922846 4669 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Oct 08 21:00:50 crc kubenswrapper[4669]: E1008 21:00:50.923102 4669 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n64bhd8h57dh65bh688hdfh66h696h55chb8h698h55dhfdh598h59ch9bh97h665hf6h5ddh594h5c9h5b4h658h59bh658hf5h77h64h659h694h55q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wqx5r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-55cd7b4cc7-2w7vd_openstack(9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 08 21:00:50 crc kubenswrapper[4669]: E1008 
21:00:50.925628 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-55cd7b4cc7-2w7vd" podUID="9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d" Oct 08 21:00:50 crc kubenswrapper[4669]: W1008 21:00:50.954307 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf60e370_8287_408b_af8c_fc0d5c19e37d.slice/crio-4a662a6b3a1dbd50857abdd16528143f49691a06bc6b3ec8c07df54a98f5964b WatchSource:0}: Error finding container 4a662a6b3a1dbd50857abdd16528143f49691a06bc6b3ec8c07df54a98f5964b: Status 404 returned error can't find the container with id 4a662a6b3a1dbd50857abdd16528143f49691a06bc6b3ec8c07df54a98f5964b Oct 08 21:00:50 crc kubenswrapper[4669]: I1008 21:00:50.966631 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.211806 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-fb17-account-create-rc2jm" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.230011 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-3920-account-create-sw6j7" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.256646 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rxss2" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.269791 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-574f4fd7c7-7zknn" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.382977 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8npbw\" (UniqueName: \"kubernetes.io/projected/79cb9326-1639-44ff-973d-adbce6c8098b-kube-api-access-8npbw\") pod \"79cb9326-1639-44ff-973d-adbce6c8098b\" (UID: \"79cb9326-1639-44ff-973d-adbce6c8098b\") " Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.383018 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79cb9326-1639-44ff-973d-adbce6c8098b-logs\") pod \"79cb9326-1639-44ff-973d-adbce6c8098b\" (UID: \"79cb9326-1639-44ff-973d-adbce6c8098b\") " Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.383057 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b94dc5a1-30ee-4ce3-807d-51f22cb32b5f-combined-ca-bundle\") pod \"b94dc5a1-30ee-4ce3-807d-51f22cb32b5f\" (UID: \"b94dc5a1-30ee-4ce3-807d-51f22cb32b5f\") " Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.383078 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b94dc5a1-30ee-4ce3-807d-51f22cb32b5f-config-data\") pod \"b94dc5a1-30ee-4ce3-807d-51f22cb32b5f\" (UID: \"b94dc5a1-30ee-4ce3-807d-51f22cb32b5f\") " Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.383177 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qttdq\" (UniqueName: \"kubernetes.io/projected/daca45a8-2ae8-4b87-9fc4-348099b37165-kube-api-access-qttdq\") pod \"daca45a8-2ae8-4b87-9fc4-348099b37165\" (UID: \"daca45a8-2ae8-4b87-9fc4-348099b37165\") " Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.383225 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/b94dc5a1-30ee-4ce3-807d-51f22cb32b5f-credential-keys\") pod \"b94dc5a1-30ee-4ce3-807d-51f22cb32b5f\" (UID: \"b94dc5a1-30ee-4ce3-807d-51f22cb32b5f\") " Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.383273 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/79cb9326-1639-44ff-973d-adbce6c8098b-config-data\") pod \"79cb9326-1639-44ff-973d-adbce6c8098b\" (UID: \"79cb9326-1639-44ff-973d-adbce6c8098b\") " Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.383303 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/79cb9326-1639-44ff-973d-adbce6c8098b-horizon-secret-key\") pod \"79cb9326-1639-44ff-973d-adbce6c8098b\" (UID: \"79cb9326-1639-44ff-973d-adbce6c8098b\") " Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.383325 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgwjt\" (UniqueName: \"kubernetes.io/projected/b94dc5a1-30ee-4ce3-807d-51f22cb32b5f-kube-api-access-bgwjt\") pod \"b94dc5a1-30ee-4ce3-807d-51f22cb32b5f\" (UID: \"b94dc5a1-30ee-4ce3-807d-51f22cb32b5f\") " Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.383354 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b94dc5a1-30ee-4ce3-807d-51f22cb32b5f-fernet-keys\") pod \"b94dc5a1-30ee-4ce3-807d-51f22cb32b5f\" (UID: \"b94dc5a1-30ee-4ce3-807d-51f22cb32b5f\") " Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.383403 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tdd4\" (UniqueName: \"kubernetes.io/projected/3dfca695-7bff-4cb6-abdf-6e20eb14485e-kube-api-access-9tdd4\") pod \"3dfca695-7bff-4cb6-abdf-6e20eb14485e\" (UID: \"3dfca695-7bff-4cb6-abdf-6e20eb14485e\") " Oct 
08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.383423 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79cb9326-1639-44ff-973d-adbce6c8098b-scripts\") pod \"79cb9326-1639-44ff-973d-adbce6c8098b\" (UID: \"79cb9326-1639-44ff-973d-adbce6c8098b\") " Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.383447 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b94dc5a1-30ee-4ce3-807d-51f22cb32b5f-scripts\") pod \"b94dc5a1-30ee-4ce3-807d-51f22cb32b5f\" (UID: \"b94dc5a1-30ee-4ce3-807d-51f22cb32b5f\") " Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.387186 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79cb9326-1639-44ff-973d-adbce6c8098b-config-data" (OuterVolumeSpecName: "config-data") pod "79cb9326-1639-44ff-973d-adbce6c8098b" (UID: "79cb9326-1639-44ff-973d-adbce6c8098b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.387903 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79cb9326-1639-44ff-973d-adbce6c8098b-scripts" (OuterVolumeSpecName: "scripts") pod "79cb9326-1639-44ff-973d-adbce6c8098b" (UID: "79cb9326-1639-44ff-973d-adbce6c8098b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.388434 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79cb9326-1639-44ff-973d-adbce6c8098b-logs" (OuterVolumeSpecName: "logs") pod "79cb9326-1639-44ff-973d-adbce6c8098b" (UID: "79cb9326-1639-44ff-973d-adbce6c8098b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.389930 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b94dc5a1-30ee-4ce3-807d-51f22cb32b5f-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "b94dc5a1-30ee-4ce3-807d-51f22cb32b5f" (UID: "b94dc5a1-30ee-4ce3-807d-51f22cb32b5f"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.390482 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b94dc5a1-30ee-4ce3-807d-51f22cb32b5f-scripts" (OuterVolumeSpecName: "scripts") pod "b94dc5a1-30ee-4ce3-807d-51f22cb32b5f" (UID: "b94dc5a1-30ee-4ce3-807d-51f22cb32b5f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.390745 4669 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79cb9326-1639-44ff-973d-adbce6c8098b-logs\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.390767 4669 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b94dc5a1-30ee-4ce3-807d-51f22cb32b5f-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.390779 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/79cb9326-1639-44ff-973d-adbce6c8098b-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.390808 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79cb9326-1639-44ff-973d-adbce6c8098b-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 
21:00:51.390816 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b94dc5a1-30ee-4ce3-807d-51f22cb32b5f-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.392468 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b94dc5a1-30ee-4ce3-807d-51f22cb32b5f-kube-api-access-bgwjt" (OuterVolumeSpecName: "kube-api-access-bgwjt") pod "b94dc5a1-30ee-4ce3-807d-51f22cb32b5f" (UID: "b94dc5a1-30ee-4ce3-807d-51f22cb32b5f"). InnerVolumeSpecName "kube-api-access-bgwjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.393442 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/daca45a8-2ae8-4b87-9fc4-348099b37165-kube-api-access-qttdq" (OuterVolumeSpecName: "kube-api-access-qttdq") pod "daca45a8-2ae8-4b87-9fc4-348099b37165" (UID: "daca45a8-2ae8-4b87-9fc4-348099b37165"). InnerVolumeSpecName "kube-api-access-qttdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.394132 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b94dc5a1-30ee-4ce3-807d-51f22cb32b5f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b94dc5a1-30ee-4ce3-807d-51f22cb32b5f" (UID: "b94dc5a1-30ee-4ce3-807d-51f22cb32b5f"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.396718 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79cb9326-1639-44ff-973d-adbce6c8098b-kube-api-access-8npbw" (OuterVolumeSpecName: "kube-api-access-8npbw") pod "79cb9326-1639-44ff-973d-adbce6c8098b" (UID: "79cb9326-1639-44ff-973d-adbce6c8098b"). InnerVolumeSpecName "kube-api-access-8npbw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.397331 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dfca695-7bff-4cb6-abdf-6e20eb14485e-kube-api-access-9tdd4" (OuterVolumeSpecName: "kube-api-access-9tdd4") pod "3dfca695-7bff-4cb6-abdf-6e20eb14485e" (UID: "3dfca695-7bff-4cb6-abdf-6e20eb14485e"). InnerVolumeSpecName "kube-api-access-9tdd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.400013 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79cb9326-1639-44ff-973d-adbce6c8098b-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "79cb9326-1639-44ff-973d-adbce6c8098b" (UID: "79cb9326-1639-44ff-973d-adbce6c8098b"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.422268 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b94dc5a1-30ee-4ce3-807d-51f22cb32b5f-config-data" (OuterVolumeSpecName: "config-data") pod "b94dc5a1-30ee-4ce3-807d-51f22cb32b5f" (UID: "b94dc5a1-30ee-4ce3-807d-51f22cb32b5f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.430846 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b94dc5a1-30ee-4ce3-807d-51f22cb32b5f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b94dc5a1-30ee-4ce3-807d-51f22cb32b5f" (UID: "b94dc5a1-30ee-4ce3-807d-51f22cb32b5f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.492059 4669 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/79cb9326-1639-44ff-973d-adbce6c8098b-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.492088 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgwjt\" (UniqueName: \"kubernetes.io/projected/b94dc5a1-30ee-4ce3-807d-51f22cb32b5f-kube-api-access-bgwjt\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.492098 4669 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b94dc5a1-30ee-4ce3-807d-51f22cb32b5f-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.492106 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tdd4\" (UniqueName: \"kubernetes.io/projected/3dfca695-7bff-4cb6-abdf-6e20eb14485e-kube-api-access-9tdd4\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.492115 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8npbw\" (UniqueName: \"kubernetes.io/projected/79cb9326-1639-44ff-973d-adbce6c8098b-kube-api-access-8npbw\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.492123 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b94dc5a1-30ee-4ce3-807d-51f22cb32b5f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.492131 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b94dc5a1-30ee-4ce3-807d-51f22cb32b5f-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:51 crc 
kubenswrapper[4669]: I1008 21:00:51.492139 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qttdq\" (UniqueName: \"kubernetes.io/projected/daca45a8-2ae8-4b87-9fc4-348099b37165-kube-api-access-qttdq\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.526596 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.543770 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-dzpfw" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.683868 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6cd548c4f4-74w5s"] Oct 08 21:00:51 crc kubenswrapper[4669]: W1008 21:00:51.685751 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf67695c6_cc78_4e93_86e4_34b030405e0e.slice/crio-ec0702da548413ceeefd10476549f8dbe3ddf463ed7e46916de59b919f286070 WatchSource:0}: Error finding container ec0702da548413ceeefd10476549f8dbe3ddf463ed7e46916de59b919f286070: Status 404 returned error can't find the container with id ec0702da548413ceeefd10476549f8dbe3ddf463ed7e46916de59b919f286070 Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.695142 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd-ovsdbserver-nb\") pod \"9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd\" (UID: \"9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd\") " Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.695197 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd-ovsdbserver-sb\") pod 
\"9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd\" (UID: \"9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd\") " Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.695286 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd-dns-svc\") pod \"9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd\" (UID: \"9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd\") " Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.695312 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e916b41d-d3e7-43b8-b550-8ad07a4cf147-logs\") pod \"e916b41d-d3e7-43b8-b550-8ad07a4cf147\" (UID: \"e916b41d-d3e7-43b8-b550-8ad07a4cf147\") " Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.695554 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e916b41d-d3e7-43b8-b550-8ad07a4cf147-config-data\") pod \"e916b41d-d3e7-43b8-b550-8ad07a4cf147\" (UID: \"e916b41d-d3e7-43b8-b550-8ad07a4cf147\") " Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.695584 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b49gd\" (UniqueName: \"kubernetes.io/projected/e916b41d-d3e7-43b8-b550-8ad07a4cf147-kube-api-access-b49gd\") pod \"e916b41d-d3e7-43b8-b550-8ad07a4cf147\" (UID: \"e916b41d-d3e7-43b8-b550-8ad07a4cf147\") " Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.695641 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd-dns-swift-storage-0\") pod \"9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd\" (UID: \"9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd\") " Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.695663 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/e916b41d-d3e7-43b8-b550-8ad07a4cf147-scripts\") pod \"e916b41d-d3e7-43b8-b550-8ad07a4cf147\" (UID: \"e916b41d-d3e7-43b8-b550-8ad07a4cf147\") " Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.695704 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e916b41d-d3e7-43b8-b550-8ad07a4cf147-combined-ca-bundle\") pod \"e916b41d-d3e7-43b8-b550-8ad07a4cf147\" (UID: \"e916b41d-d3e7-43b8-b550-8ad07a4cf147\") " Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.695741 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd-config\") pod \"9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd\" (UID: \"9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd\") " Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.695861 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79fmx\" (UniqueName: \"kubernetes.io/projected/9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd-kube-api-access-79fmx\") pod \"9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd\" (UID: \"9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd\") " Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.695884 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"e916b41d-d3e7-43b8-b550-8ad07a4cf147\" (UID: \"e916b41d-d3e7-43b8-b550-8ad07a4cf147\") " Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.695929 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e916b41d-d3e7-43b8-b550-8ad07a4cf147-httpd-run\") pod \"e916b41d-d3e7-43b8-b550-8ad07a4cf147\" (UID: \"e916b41d-d3e7-43b8-b550-8ad07a4cf147\") " Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.696798 4669 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e916b41d-d3e7-43b8-b550-8ad07a4cf147-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e916b41d-d3e7-43b8-b550-8ad07a4cf147" (UID: "e916b41d-d3e7-43b8-b550-8ad07a4cf147"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.697127 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e916b41d-d3e7-43b8-b550-8ad07a4cf147-logs" (OuterVolumeSpecName: "logs") pod "e916b41d-d3e7-43b8-b550-8ad07a4cf147" (UID: "e916b41d-d3e7-43b8-b550-8ad07a4cf147"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.703655 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5fc57f5668-z5dzm"] Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.704244 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e916b41d-d3e7-43b8-b550-8ad07a4cf147-kube-api-access-b49gd" (OuterVolumeSpecName: "kube-api-access-b49gd") pod "e916b41d-d3e7-43b8-b550-8ad07a4cf147" (UID: "e916b41d-d3e7-43b8-b550-8ad07a4cf147"). InnerVolumeSpecName "kube-api-access-b49gd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.707070 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd-kube-api-access-79fmx" (OuterVolumeSpecName: "kube-api-access-79fmx") pod "9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd" (UID: "9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd"). InnerVolumeSpecName "kube-api-access-79fmx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.707387 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e916b41d-d3e7-43b8-b550-8ad07a4cf147-scripts" (OuterVolumeSpecName: "scripts") pod "e916b41d-d3e7-43b8-b550-8ad07a4cf147" (UID: "e916b41d-d3e7-43b8-b550-8ad07a4cf147"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.711062 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "e916b41d-d3e7-43b8-b550-8ad07a4cf147" (UID: "e916b41d-d3e7-43b8-b550-8ad07a4cf147"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 08 21:00:51 crc kubenswrapper[4669]: W1008 21:00:51.725309 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43e0f642_1a58_481b_8347_b4d29176ddc5.slice/crio-8ca40ff83ecb82de9fd162da610010ff028a6645b942535210327dbab657d1b2 WatchSource:0}: Error finding container 8ca40ff83ecb82de9fd162da610010ff028a6645b942535210327dbab657d1b2: Status 404 returned error can't find the container with id 8ca40ff83ecb82de9fd162da610010ff028a6645b942535210327dbab657d1b2 Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.764171 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd" (UID: "9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.768648 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6cd548c4f4-74w5s" event={"ID":"f67695c6-cc78-4e93-86e4-34b030405e0e","Type":"ContainerStarted","Data":"ec0702da548413ceeefd10476549f8dbe3ddf463ed7e46916de59b919f286070"} Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.769872 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rxss2" event={"ID":"b94dc5a1-30ee-4ce3-807d-51f22cb32b5f","Type":"ContainerDied","Data":"5e1532bd10db2990fd2e46ab57a43e54a7c919360ac302dd409ec148b306ed35"} Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.769946 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e1532bd10db2990fd2e46ab57a43e54a7c919360ac302dd409ec148b306ed35" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.770109 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rxss2" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.775939 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-fb17-account-create-rc2jm" event={"ID":"daca45a8-2ae8-4b87-9fc4-348099b37165","Type":"ContainerDied","Data":"d6eb62fdbc166e2ef3a53900a10ac4c9807ec3435a97c74e8c705dc557622dfa"} Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.776036 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6eb62fdbc166e2ef3a53900a10ac4c9807ec3435a97c74e8c705dc557622dfa" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.776163 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-fb17-account-create-rc2jm" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.785939 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-574f4fd7c7-7zknn" event={"ID":"79cb9326-1639-44ff-973d-adbce6c8098b","Type":"ContainerDied","Data":"059abaefb495ce0badb0de442e1bc90b77428a6ad8a1e6e32cb7659c41eb8f3b"} Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.786036 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-574f4fd7c7-7zknn" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.792959 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d68bc8f57-c6jv4" event={"ID":"49676f30-cc4b-4229-b194-f28688f6da28","Type":"ContainerStarted","Data":"a9132b413c37fe9f6e3fcba934efebcb60325ce3c038d45f76074026a2db71f0"} Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.794794 4669 generic.go:334] "Generic (PLEG): container finished" podID="af60e370-8287-408b-af8c-fc0d5c19e37d" containerID="6b78ce690d29ad48209f564e7a610ebed2a44fa6659b303692d0d9baf9cca697" exitCode=0 Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.795204 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-82f4-account-create-z8lzz" event={"ID":"af60e370-8287-408b-af8c-fc0d5c19e37d","Type":"ContainerDied","Data":"6b78ce690d29ad48209f564e7a610ebed2a44fa6659b303692d0d9baf9cca697"} Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.795292 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-82f4-account-create-z8lzz" event={"ID":"af60e370-8287-408b-af8c-fc0d5c19e37d","Type":"ContainerStarted","Data":"4a662a6b3a1dbd50857abdd16528143f49691a06bc6b3ec8c07df54a98f5964b"} Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.797574 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79fmx\" (UniqueName: 
\"kubernetes.io/projected/9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd-kube-api-access-79fmx\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.797618 4669 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.797630 4669 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e916b41d-d3e7-43b8-b550-8ad07a4cf147-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.797641 4669 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.797652 4669 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e916b41d-d3e7-43b8-b550-8ad07a4cf147-logs\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.797663 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b49gd\" (UniqueName: \"kubernetes.io/projected/e916b41d-d3e7-43b8-b550-8ad07a4cf147-kube-api-access-b49gd\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.797694 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e916b41d-d3e7-43b8-b550-8ad07a4cf147-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.800619 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e916b41d-d3e7-43b8-b550-8ad07a4cf147","Type":"ContainerDied","Data":"d2b98603e2c3e0b7fb624d6dc3aa3b476b121f2153b3d6e061b3ccb5a879687c"} Oct 08 21:00:51 crc 
kubenswrapper[4669]: I1008 21:00:51.800710 4669 scope.go:117] "RemoveContainer" containerID="85aa18c4103edda5e2844754f22a43f2b0793896677491c5cc684acbb39ade80" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.800854 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.811448 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e916b41d-d3e7-43b8-b550-8ad07a4cf147-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e916b41d-d3e7-43b8-b550-8ad07a4cf147" (UID: "e916b41d-d3e7-43b8-b550-8ad07a4cf147"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.811800 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-dzpfw" event={"ID":"9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd","Type":"ContainerDied","Data":"97f560f4b3c8f7de8efac111ae52a97440ecba29a29581272d7fdfe3885b2836"} Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.811599 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-dzpfw" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.817190 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3920-account-create-sw6j7" event={"ID":"3dfca695-7bff-4cb6-abdf-6e20eb14485e","Type":"ContainerDied","Data":"ce0656efa291ec21994dc2d10c0d8ebac79b0a288e5f3773ed25e719b89265a5"} Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.817237 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce0656efa291ec21994dc2d10c0d8ebac79b0a288e5f3773ed25e719b89265a5" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.817313 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-3920-account-create-sw6j7" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.822295 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd" (UID: "9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.827861 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e462fc4e-635f-4e2e-88c0-43f1af0dc648","Type":"ContainerStarted","Data":"9e1af400bb2d905081af982aff3b1dab1a685cee69b050f31bd2640ce602cba3"} Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.828698 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd" (UID: "9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.829305 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd-config" (OuterVolumeSpecName: "config") pod "9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd" (UID: "9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.829656 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fc57f5668-z5dzm" event={"ID":"43e0f642-1a58-481b-8347-b4d29176ddc5","Type":"ContainerStarted","Data":"8ca40ff83ecb82de9fd162da610010ff028a6645b942535210327dbab657d1b2"} Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.840974 4669 scope.go:117] "RemoveContainer" containerID="6377dae22d3bace1bbfbc65dcad9981cb966ed31d7efbec9c6688d170788e859" Oct 08 21:00:51 crc kubenswrapper[4669]: E1008 21:00:51.841121 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-p2bl6" podUID="469707ec-e817-4d42-b406-6595799f6036" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.868340 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd" (UID: "9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.869764 4669 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.876033 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e916b41d-d3e7-43b8-b550-8ad07a4cf147-config-data" (OuterVolumeSpecName: "config-data") pod "e916b41d-d3e7-43b8-b550-8ad07a4cf147" (UID: "e916b41d-d3e7-43b8-b550-8ad07a4cf147"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.898384 4669 scope.go:117] "RemoveContainer" containerID="53d08d1ee28cd37c456c3a05a6d5b23f416740de7750be9af47c44141d09a776" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.899781 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e916b41d-d3e7-43b8-b550-8ad07a4cf147-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.899810 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd-config\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.899820 4669 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.899829 4669 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.899836 4669 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.899845 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e916b41d-d3e7-43b8-b550-8ad07a4cf147-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.899853 4669 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.900317 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-574f4fd7c7-7zknn"] Oct 08 21:00:51 crc kubenswrapper[4669]: I1008 21:00:51.917429 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-574f4fd7c7-7zknn"] Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.123704 4669 scope.go:117] "RemoveContainer" containerID="d18124ed67ebe9f4815fec07af3a833b66fabfd7ad40ea4dbb14920b53fc8dbf" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.310515 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-55cd7b4cc7-2w7vd" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.328783 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.340277 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.361462 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.376332 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 21:00:52 crc kubenswrapper[4669]: E1008 21:00:52.376695 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dfca695-7bff-4cb6-abdf-6e20eb14485e" containerName="mariadb-account-create" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.376712 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dfca695-7bff-4cb6-abdf-6e20eb14485e" containerName="mariadb-account-create" Oct 08 21:00:52 crc kubenswrapper[4669]: E1008 21:00:52.376761 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="416fb1ca-1386-410b-aaaa-3e0ec724d461" containerName="glance-log" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.376770 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="416fb1ca-1386-410b-aaaa-3e0ec724d461" containerName="glance-log" Oct 08 21:00:52 crc kubenswrapper[4669]: E1008 21:00:52.376781 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e916b41d-d3e7-43b8-b550-8ad07a4cf147" containerName="glance-httpd" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.376788 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="e916b41d-d3e7-43b8-b550-8ad07a4cf147" containerName="glance-httpd" Oct 08 21:00:52 crc kubenswrapper[4669]: E1008 21:00:52.376799 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="416fb1ca-1386-410b-aaaa-3e0ec724d461" containerName="glance-httpd" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.376805 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="416fb1ca-1386-410b-aaaa-3e0ec724d461" containerName="glance-httpd" Oct 08 21:00:52 crc kubenswrapper[4669]: E1008 21:00:52.376821 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd" 
containerName="dnsmasq-dns" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.376826 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd" containerName="dnsmasq-dns" Oct 08 21:00:52 crc kubenswrapper[4669]: E1008 21:00:52.376837 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daca45a8-2ae8-4b87-9fc4-348099b37165" containerName="mariadb-account-create" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.376852 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="daca45a8-2ae8-4b87-9fc4-348099b37165" containerName="mariadb-account-create" Oct 08 21:00:52 crc kubenswrapper[4669]: E1008 21:00:52.376864 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd" containerName="init" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.376869 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd" containerName="init" Oct 08 21:00:52 crc kubenswrapper[4669]: E1008 21:00:52.376880 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e916b41d-d3e7-43b8-b550-8ad07a4cf147" containerName="glance-log" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.376885 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="e916b41d-d3e7-43b8-b550-8ad07a4cf147" containerName="glance-log" Oct 08 21:00:52 crc kubenswrapper[4669]: E1008 21:00:52.376898 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b94dc5a1-30ee-4ce3-807d-51f22cb32b5f" containerName="keystone-bootstrap" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.376903 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="b94dc5a1-30ee-4ce3-807d-51f22cb32b5f" containerName="keystone-bootstrap" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.380635 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="416fb1ca-1386-410b-aaaa-3e0ec724d461" containerName="glance-log" Oct 08 
21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.380658 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd" containerName="dnsmasq-dns" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.380676 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dfca695-7bff-4cb6-abdf-6e20eb14485e" containerName="mariadb-account-create" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.380690 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="daca45a8-2ae8-4b87-9fc4-348099b37165" containerName="mariadb-account-create" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.380711 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="b94dc5a1-30ee-4ce3-807d-51f22cb32b5f" containerName="keystone-bootstrap" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.380729 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="e916b41d-d3e7-43b8-b550-8ad07a4cf147" containerName="glance-log" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.380741 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="416fb1ca-1386-410b-aaaa-3e0ec724d461" containerName="glance-httpd" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.380755 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="e916b41d-d3e7-43b8-b550-8ad07a4cf147" containerName="glance-httpd" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.381737 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.385497 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.387706 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.409179 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqx5r\" (UniqueName: \"kubernetes.io/projected/9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d-kube-api-access-wqx5r\") pod \"9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d\" (UID: \"9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d\") " Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.409889 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d-scripts\") pod \"9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d\" (UID: \"9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d\") " Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.410021 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d-logs\") pod \"9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d\" (UID: \"9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d\") " Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.410142 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d-horizon-secret-key\") pod \"9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d\" (UID: \"9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d\") " Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.410663 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d-config-data\") pod \"9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d\" (UID: \"9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d\") " Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.411705 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d-config-data" (OuterVolumeSpecName: "config-data") pod "9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d" (UID: "9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.412182 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d-logs" (OuterVolumeSpecName: "logs") pod "9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d" (UID: "9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.415624 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d-scripts" (OuterVolumeSpecName: "scripts") pod "9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d" (UID: "9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.416643 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-dzpfw"] Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.428773 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.434677 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-dzpfw"] Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.451440 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-rxss2"] Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.457505 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-rxss2"] Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.472905 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d-kube-api-access-wqx5r" (OuterVolumeSpecName: "kube-api-access-wqx5r") pod "9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d" (UID: "9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d"). InnerVolumeSpecName "kube-api-access-wqx5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.473142 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d" (UID: "9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.474122 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-q5btg"] Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.475191 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-q5btg" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.479745 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-q5btg"] Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.488021 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.488220 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.488376 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-llg6k" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.488876 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.511732 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/416fb1ca-1386-410b-aaaa-3e0ec724d461-httpd-run\") pod \"416fb1ca-1386-410b-aaaa-3e0ec724d461\" (UID: \"416fb1ca-1386-410b-aaaa-3e0ec724d461\") " Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.511856 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/416fb1ca-1386-410b-aaaa-3e0ec724d461-config-data\") pod \"416fb1ca-1386-410b-aaaa-3e0ec724d461\" (UID: \"416fb1ca-1386-410b-aaaa-3e0ec724d461\") " Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.511892 4669 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-z89xm\" (UniqueName: \"kubernetes.io/projected/416fb1ca-1386-410b-aaaa-3e0ec724d461-kube-api-access-z89xm\") pod \"416fb1ca-1386-410b-aaaa-3e0ec724d461\" (UID: \"416fb1ca-1386-410b-aaaa-3e0ec724d461\") " Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.511948 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"416fb1ca-1386-410b-aaaa-3e0ec724d461\" (UID: \"416fb1ca-1386-410b-aaaa-3e0ec724d461\") " Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.512114 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/416fb1ca-1386-410b-aaaa-3e0ec724d461-combined-ca-bundle\") pod \"416fb1ca-1386-410b-aaaa-3e0ec724d461\" (UID: \"416fb1ca-1386-410b-aaaa-3e0ec724d461\") " Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.512173 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/416fb1ca-1386-410b-aaaa-3e0ec724d461-scripts\") pod \"416fb1ca-1386-410b-aaaa-3e0ec724d461\" (UID: \"416fb1ca-1386-410b-aaaa-3e0ec724d461\") " Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.512201 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/416fb1ca-1386-410b-aaaa-3e0ec724d461-logs\") pod \"416fb1ca-1386-410b-aaaa-3e0ec724d461\" (UID: \"416fb1ca-1386-410b-aaaa-3e0ec724d461\") " Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.512372 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c34a743-b5df-4b29-847f-521f7086fa81-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1c34a743-b5df-4b29-847f-521f7086fa81\") " 
pod="openstack/glance-default-external-api-0" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.512420 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c34a743-b5df-4b29-847f-521f7086fa81-config-data\") pod \"glance-default-external-api-0\" (UID: \"1c34a743-b5df-4b29-847f-521f7086fa81\") " pod="openstack/glance-default-external-api-0" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.512471 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"1c34a743-b5df-4b29-847f-521f7086fa81\") " pod="openstack/glance-default-external-api-0" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.512495 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c34a743-b5df-4b29-847f-521f7086fa81-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1c34a743-b5df-4b29-847f-521f7086fa81\") " pod="openstack/glance-default-external-api-0" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.512521 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c34a743-b5df-4b29-847f-521f7086fa81-logs\") pod \"glance-default-external-api-0\" (UID: \"1c34a743-b5df-4b29-847f-521f7086fa81\") " pod="openstack/glance-default-external-api-0" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.512825 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c34a743-b5df-4b29-847f-521f7086fa81-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1c34a743-b5df-4b29-847f-521f7086fa81\") " 
pod="openstack/glance-default-external-api-0" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.512873 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c34a743-b5df-4b29-847f-521f7086fa81-scripts\") pod \"glance-default-external-api-0\" (UID: \"1c34a743-b5df-4b29-847f-521f7086fa81\") " pod="openstack/glance-default-external-api-0" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.513250 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/416fb1ca-1386-410b-aaaa-3e0ec724d461-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "416fb1ca-1386-410b-aaaa-3e0ec724d461" (UID: "416fb1ca-1386-410b-aaaa-3e0ec724d461"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.513398 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/416fb1ca-1386-410b-aaaa-3e0ec724d461-logs" (OuterVolumeSpecName: "logs") pod "416fb1ca-1386-410b-aaaa-3e0ec724d461" (UID: "416fb1ca-1386-410b-aaaa-3e0ec724d461"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.513747 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfsdm\" (UniqueName: \"kubernetes.io/projected/1c34a743-b5df-4b29-847f-521f7086fa81-kube-api-access-tfsdm\") pod \"glance-default-external-api-0\" (UID: \"1c34a743-b5df-4b29-847f-521f7086fa81\") " pod="openstack/glance-default-external-api-0" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.513898 4669 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/416fb1ca-1386-410b-aaaa-3e0ec724d461-logs\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.513910 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.513937 4669 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/416fb1ca-1386-410b-aaaa-3e0ec724d461-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.513948 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqx5r\" (UniqueName: \"kubernetes.io/projected/9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d-kube-api-access-wqx5r\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.513978 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.513987 4669 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d-logs\") 
on node \"crc\" DevicePath \"\"" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.513997 4669 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.527276 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/416fb1ca-1386-410b-aaaa-3e0ec724d461-kube-api-access-z89xm" (OuterVolumeSpecName: "kube-api-access-z89xm") pod "416fb1ca-1386-410b-aaaa-3e0ec724d461" (UID: "416fb1ca-1386-410b-aaaa-3e0ec724d461"). InnerVolumeSpecName "kube-api-access-z89xm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.527338 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "416fb1ca-1386-410b-aaaa-3e0ec724d461" (UID: "416fb1ca-1386-410b-aaaa-3e0ec724d461"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.528176 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/416fb1ca-1386-410b-aaaa-3e0ec724d461-scripts" (OuterVolumeSpecName: "scripts") pod "416fb1ca-1386-410b-aaaa-3e0ec724d461" (UID: "416fb1ca-1386-410b-aaaa-3e0ec724d461"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.607037 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/416fb1ca-1386-410b-aaaa-3e0ec724d461-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "416fb1ca-1386-410b-aaaa-3e0ec724d461" (UID: "416fb1ca-1386-410b-aaaa-3e0ec724d461"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.616114 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c34a743-b5df-4b29-847f-521f7086fa81-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1c34a743-b5df-4b29-847f-521f7086fa81\") " pod="openstack/glance-default-external-api-0" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.616182 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42ef52de-f02e-447a-8713-16ce12443117-config-data\") pod \"keystone-bootstrap-q5btg\" (UID: \"42ef52de-f02e-447a-8713-16ce12443117\") " pod="openstack/keystone-bootstrap-q5btg" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.616207 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c34a743-b5df-4b29-847f-521f7086fa81-logs\") pod \"glance-default-external-api-0\" (UID: \"1c34a743-b5df-4b29-847f-521f7086fa81\") " pod="openstack/glance-default-external-api-0" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.616231 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c34a743-b5df-4b29-847f-521f7086fa81-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1c34a743-b5df-4b29-847f-521f7086fa81\") " pod="openstack/glance-default-external-api-0" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.616292 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c34a743-b5df-4b29-847f-521f7086fa81-scripts\") pod \"glance-default-external-api-0\" (UID: \"1c34a743-b5df-4b29-847f-521f7086fa81\") " pod="openstack/glance-default-external-api-0" Oct 08 21:00:52 crc 
kubenswrapper[4669]: I1008 21:00:52.616314 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfsdm\" (UniqueName: \"kubernetes.io/projected/1c34a743-b5df-4b29-847f-521f7086fa81-kube-api-access-tfsdm\") pod \"glance-default-external-api-0\" (UID: \"1c34a743-b5df-4b29-847f-521f7086fa81\") " pod="openstack/glance-default-external-api-0" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.616343 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42ef52de-f02e-447a-8713-16ce12443117-scripts\") pod \"keystone-bootstrap-q5btg\" (UID: \"42ef52de-f02e-447a-8713-16ce12443117\") " pod="openstack/keystone-bootstrap-q5btg" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.616398 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/42ef52de-f02e-447a-8713-16ce12443117-fernet-keys\") pod \"keystone-bootstrap-q5btg\" (UID: \"42ef52de-f02e-447a-8713-16ce12443117\") " pod="openstack/keystone-bootstrap-q5btg" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.616442 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42ef52de-f02e-447a-8713-16ce12443117-combined-ca-bundle\") pod \"keystone-bootstrap-q5btg\" (UID: \"42ef52de-f02e-447a-8713-16ce12443117\") " pod="openstack/keystone-bootstrap-q5btg" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.616466 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c34a743-b5df-4b29-847f-521f7086fa81-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1c34a743-b5df-4b29-847f-521f7086fa81\") " pod="openstack/glance-default-external-api-0" Oct 08 21:00:52 crc kubenswrapper[4669]: 
I1008 21:00:52.616497 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c34a743-b5df-4b29-847f-521f7086fa81-config-data\") pod \"glance-default-external-api-0\" (UID: \"1c34a743-b5df-4b29-847f-521f7086fa81\") " pod="openstack/glance-default-external-api-0" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.616541 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgh8n\" (UniqueName: \"kubernetes.io/projected/42ef52de-f02e-447a-8713-16ce12443117-kube-api-access-bgh8n\") pod \"keystone-bootstrap-q5btg\" (UID: \"42ef52de-f02e-447a-8713-16ce12443117\") " pod="openstack/keystone-bootstrap-q5btg" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.616586 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/42ef52de-f02e-447a-8713-16ce12443117-credential-keys\") pod \"keystone-bootstrap-q5btg\" (UID: \"42ef52de-f02e-447a-8713-16ce12443117\") " pod="openstack/keystone-bootstrap-q5btg" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.616624 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"1c34a743-b5df-4b29-847f-521f7086fa81\") " pod="openstack/glance-default-external-api-0" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.616694 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/416fb1ca-1386-410b-aaaa-3e0ec724d461-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.616706 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/416fb1ca-1386-410b-aaaa-3e0ec724d461-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.616718 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z89xm\" (UniqueName: \"kubernetes.io/projected/416fb1ca-1386-410b-aaaa-3e0ec724d461-kube-api-access-z89xm\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.616741 4669 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.618303 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c34a743-b5df-4b29-847f-521f7086fa81-logs\") pod \"glance-default-external-api-0\" (UID: \"1c34a743-b5df-4b29-847f-521f7086fa81\") " pod="openstack/glance-default-external-api-0" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.618690 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c34a743-b5df-4b29-847f-521f7086fa81-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1c34a743-b5df-4b29-847f-521f7086fa81\") " pod="openstack/glance-default-external-api-0" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.624638 4669 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"1c34a743-b5df-4b29-847f-521f7086fa81\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.626117 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1c34a743-b5df-4b29-847f-521f7086fa81-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1c34a743-b5df-4b29-847f-521f7086fa81\") " pod="openstack/glance-default-external-api-0" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.634772 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c34a743-b5df-4b29-847f-521f7086fa81-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1c34a743-b5df-4b29-847f-521f7086fa81\") " pod="openstack/glance-default-external-api-0" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.635885 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c34a743-b5df-4b29-847f-521f7086fa81-config-data\") pod \"glance-default-external-api-0\" (UID: \"1c34a743-b5df-4b29-847f-521f7086fa81\") " pod="openstack/glance-default-external-api-0" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.650247 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c34a743-b5df-4b29-847f-521f7086fa81-scripts\") pod \"glance-default-external-api-0\" (UID: \"1c34a743-b5df-4b29-847f-521f7086fa81\") " pod="openstack/glance-default-external-api-0" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.667354 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfsdm\" (UniqueName: \"kubernetes.io/projected/1c34a743-b5df-4b29-847f-521f7086fa81-kube-api-access-tfsdm\") pod \"glance-default-external-api-0\" (UID: \"1c34a743-b5df-4b29-847f-521f7086fa81\") " pod="openstack/glance-default-external-api-0" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.678909 4669 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Oct 08 21:00:52 crc 
kubenswrapper[4669]: I1008 21:00:52.698833 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/416fb1ca-1386-410b-aaaa-3e0ec724d461-config-data" (OuterVolumeSpecName: "config-data") pod "416fb1ca-1386-410b-aaaa-3e0ec724d461" (UID: "416fb1ca-1386-410b-aaaa-3e0ec724d461"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.718410 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/42ef52de-f02e-447a-8713-16ce12443117-credential-keys\") pod \"keystone-bootstrap-q5btg\" (UID: \"42ef52de-f02e-447a-8713-16ce12443117\") " pod="openstack/keystone-bootstrap-q5btg" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.718708 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42ef52de-f02e-447a-8713-16ce12443117-config-data\") pod \"keystone-bootstrap-q5btg\" (UID: \"42ef52de-f02e-447a-8713-16ce12443117\") " pod="openstack/keystone-bootstrap-q5btg" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.718814 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42ef52de-f02e-447a-8713-16ce12443117-scripts\") pod \"keystone-bootstrap-q5btg\" (UID: \"42ef52de-f02e-447a-8713-16ce12443117\") " pod="openstack/keystone-bootstrap-q5btg" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.718895 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/42ef52de-f02e-447a-8713-16ce12443117-fernet-keys\") pod \"keystone-bootstrap-q5btg\" (UID: \"42ef52de-f02e-447a-8713-16ce12443117\") " pod="openstack/keystone-bootstrap-q5btg" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.718956 4669 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42ef52de-f02e-447a-8713-16ce12443117-combined-ca-bundle\") pod \"keystone-bootstrap-q5btg\" (UID: \"42ef52de-f02e-447a-8713-16ce12443117\") " pod="openstack/keystone-bootstrap-q5btg" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.719035 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgh8n\" (UniqueName: \"kubernetes.io/projected/42ef52de-f02e-447a-8713-16ce12443117-kube-api-access-bgh8n\") pod \"keystone-bootstrap-q5btg\" (UID: \"42ef52de-f02e-447a-8713-16ce12443117\") " pod="openstack/keystone-bootstrap-q5btg" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.719122 4669 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.719189 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/416fb1ca-1386-410b-aaaa-3e0ec724d461-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.723996 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42ef52de-f02e-447a-8713-16ce12443117-scripts\") pod \"keystone-bootstrap-q5btg\" (UID: \"42ef52de-f02e-447a-8713-16ce12443117\") " pod="openstack/keystone-bootstrap-q5btg" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.724559 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/42ef52de-f02e-447a-8713-16ce12443117-credential-keys\") pod \"keystone-bootstrap-q5btg\" (UID: \"42ef52de-f02e-447a-8713-16ce12443117\") " pod="openstack/keystone-bootstrap-q5btg" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.734678 4669 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42ef52de-f02e-447a-8713-16ce12443117-config-data\") pod \"keystone-bootstrap-q5btg\" (UID: \"42ef52de-f02e-447a-8713-16ce12443117\") " pod="openstack/keystone-bootstrap-q5btg" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.740101 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/42ef52de-f02e-447a-8713-16ce12443117-fernet-keys\") pod \"keystone-bootstrap-q5btg\" (UID: \"42ef52de-f02e-447a-8713-16ce12443117\") " pod="openstack/keystone-bootstrap-q5btg" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.742188 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42ef52de-f02e-447a-8713-16ce12443117-combined-ca-bundle\") pod \"keystone-bootstrap-q5btg\" (UID: \"42ef52de-f02e-447a-8713-16ce12443117\") " pod="openstack/keystone-bootstrap-q5btg" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.759834 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"1c34a743-b5df-4b29-847f-521f7086fa81\") " pod="openstack/glance-default-external-api-0" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.760698 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgh8n\" (UniqueName: \"kubernetes.io/projected/42ef52de-f02e-447a-8713-16ce12443117-kube-api-access-bgh8n\") pod \"keystone-bootstrap-q5btg\" (UID: \"42ef52de-f02e-447a-8713-16ce12443117\") " pod="openstack/keystone-bootstrap-q5btg" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.802884 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-q5btg" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.839512 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d68bc8f57-c6jv4" event={"ID":"49676f30-cc4b-4229-b194-f28688f6da28","Type":"ContainerStarted","Data":"cfdf2ed8ae2f86fd7b7b598ea89b436c053b080dc3b07570e2e9f9d93a1ce234"} Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.839630 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6d68bc8f57-c6jv4" podUID="49676f30-cc4b-4229-b194-f28688f6da28" containerName="horizon-log" containerID="cri-o://a9132b413c37fe9f6e3fcba934efebcb60325ce3c038d45f76074026a2db71f0" gracePeriod=30 Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.839758 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6d68bc8f57-c6jv4" podUID="49676f30-cc4b-4229-b194-f28688f6da28" containerName="horizon" containerID="cri-o://cfdf2ed8ae2f86fd7b7b598ea89b436c053b080dc3b07570e2e9f9d93a1ce234" gracePeriod=30 Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.842774 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55cd7b4cc7-2w7vd" event={"ID":"9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d","Type":"ContainerDied","Data":"299206eb36cdea4f2b2fa5076bb7a9d61ed0a8314e4091f7f6abcdbb7d57f2e3"} Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.842892 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-55cd7b4cc7-2w7vd" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.845157 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fc57f5668-z5dzm" event={"ID":"43e0f642-1a58-481b-8347-b4d29176ddc5","Type":"ContainerStarted","Data":"8726ebf6039950b2b7c2ea7d8004bccf32fe8b8e13482e764f7135b5ff25a02d"} Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.845198 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fc57f5668-z5dzm" event={"ID":"43e0f642-1a58-481b-8347-b4d29176ddc5","Type":"ContainerStarted","Data":"c62178a5180444b8a528a6f26f758d45e1c252fc6452c62dad7366b6eefe52ea"} Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.859315 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"416fb1ca-1386-410b-aaaa-3e0ec724d461","Type":"ContainerDied","Data":"5aabb969d569cdd3b4d065ae60447d269b35b26148e46ba0240d38f951a385b4"} Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.859597 4669 scope.go:117] "RemoveContainer" containerID="d0f0c289cf419d449fc31e84ea59543864b20a81443703c1a426746b965e2087" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.859778 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.867916 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6d68bc8f57-c6jv4" podStartSLOduration=4.449240408 podStartE2EDuration="15.86789569s" podCreationTimestamp="2025-10-08 21:00:37 +0000 UTC" firstStartedPulling="2025-10-08 21:00:39.766862701 +0000 UTC m=+959.459673374" lastFinishedPulling="2025-10-08 21:00:51.185517983 +0000 UTC m=+970.878328656" observedRunningTime="2025-10-08 21:00:52.864831435 +0000 UTC m=+972.557642128" watchObservedRunningTime="2025-10-08 21:00:52.86789569 +0000 UTC m=+972.560706363" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.876793 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6cd548c4f4-74w5s" event={"ID":"f67695c6-cc78-4e93-86e4-34b030405e0e","Type":"ContainerStarted","Data":"0b62a5077ce4a93242842e1e897e72bbe90ea93c59e2811a272696aad9a6ca8d"} Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.876838 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6cd548c4f4-74w5s" event={"ID":"f67695c6-cc78-4e93-86e4-34b030405e0e","Type":"ContainerStarted","Data":"c0ba3b7ee1bb2b135b87938968a5b52aa40dc067d0aa4c25b461634b322cef98"} Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.928409 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5fc57f5668-z5dzm" podStartSLOduration=8.928389378 podStartE2EDuration="8.928389378s" podCreationTimestamp="2025-10-08 21:00:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:00:52.90309268 +0000 UTC m=+972.595903353" watchObservedRunningTime="2025-10-08 21:00:52.928389378 +0000 UTC m=+972.621200051" Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.959709 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/horizon-55cd7b4cc7-2w7vd"] Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.972806 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-55cd7b4cc7-2w7vd"] Oct 08 21:00:52 crc kubenswrapper[4669]: I1008 21:00:52.982612 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6cd548c4f4-74w5s" podStartSLOduration=8.977753609 podStartE2EDuration="8.977753609s" podCreationTimestamp="2025-10-08 21:00:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:00:52.97271383 +0000 UTC m=+972.665524503" watchObservedRunningTime="2025-10-08 21:00:52.977753609 +0000 UTC m=+972.670564282" Oct 08 21:00:53 crc kubenswrapper[4669]: I1008 21:00:52.998786 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 21:00:53 crc kubenswrapper[4669]: I1008 21:00:53.002218 4669 scope.go:117] "RemoveContainer" containerID="a01369d88176acd40aeadc5ae3fac42075655456be3361fb7c10334bb342c44a" Oct 08 21:00:53 crc kubenswrapper[4669]: I1008 21:00:53.002492 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 21:00:53 crc kubenswrapper[4669]: I1008 21:00:53.014653 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 21:00:53 crc kubenswrapper[4669]: I1008 21:00:53.059521 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 21:00:53 crc kubenswrapper[4669]: I1008 21:00:53.061215 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 08 21:00:53 crc kubenswrapper[4669]: I1008 21:00:53.069611 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Oct 08 21:00:53 crc kubenswrapper[4669]: I1008 21:00:53.069840 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Oct 08 21:00:53 crc kubenswrapper[4669]: I1008 21:00:53.090654 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 08 21:00:53 crc kubenswrapper[4669]: I1008 21:00:53.128695 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7d7bc9d-0611-4434-86a2-7c39ab8a86ab-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c7d7bc9d-0611-4434-86a2-7c39ab8a86ab\") " pod="openstack/glance-default-internal-api-0"
Oct 08 21:00:53 crc kubenswrapper[4669]: I1008 21:00:53.128737 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7d7bc9d-0611-4434-86a2-7c39ab8a86ab-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c7d7bc9d-0611-4434-86a2-7c39ab8a86ab\") " pod="openstack/glance-default-internal-api-0"
Oct 08 21:00:53 crc kubenswrapper[4669]: I1008 21:00:53.128758 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"c7d7bc9d-0611-4434-86a2-7c39ab8a86ab\") " pod="openstack/glance-default-internal-api-0"
Oct 08 21:00:53 crc kubenswrapper[4669]: I1008 21:00:53.128781 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7d7bc9d-0611-4434-86a2-7c39ab8a86ab-logs\") pod \"glance-default-internal-api-0\" (UID: \"c7d7bc9d-0611-4434-86a2-7c39ab8a86ab\") " pod="openstack/glance-default-internal-api-0"
Oct 08 21:00:53 crc kubenswrapper[4669]: I1008 21:00:53.128809 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7d7bc9d-0611-4434-86a2-7c39ab8a86ab-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c7d7bc9d-0611-4434-86a2-7c39ab8a86ab\") " pod="openstack/glance-default-internal-api-0"
Oct 08 21:00:53 crc kubenswrapper[4669]: I1008 21:00:53.128862 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7d7bc9d-0611-4434-86a2-7c39ab8a86ab-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c7d7bc9d-0611-4434-86a2-7c39ab8a86ab\") " pod="openstack/glance-default-internal-api-0"
Oct 08 21:00:53 crc kubenswrapper[4669]: I1008 21:00:53.128888 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7d7bc9d-0611-4434-86a2-7c39ab8a86ab-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c7d7bc9d-0611-4434-86a2-7c39ab8a86ab\") " pod="openstack/glance-default-internal-api-0"
Oct 08 21:00:53 crc kubenswrapper[4669]: I1008 21:00:53.128920 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zxmq\" (UniqueName: \"kubernetes.io/projected/c7d7bc9d-0611-4434-86a2-7c39ab8a86ab-kube-api-access-5zxmq\") pod \"glance-default-internal-api-0\" (UID: \"c7d7bc9d-0611-4434-86a2-7c39ab8a86ab\") " pod="openstack/glance-default-internal-api-0"
Oct 08 21:00:53 crc kubenswrapper[4669]: I1008 21:00:53.236477 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7d7bc9d-0611-4434-86a2-7c39ab8a86ab-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c7d7bc9d-0611-4434-86a2-7c39ab8a86ab\") " pod="openstack/glance-default-internal-api-0"
Oct 08 21:00:53 crc kubenswrapper[4669]: I1008 21:00:53.236565 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7d7bc9d-0611-4434-86a2-7c39ab8a86ab-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c7d7bc9d-0611-4434-86a2-7c39ab8a86ab\") " pod="openstack/glance-default-internal-api-0"
Oct 08 21:00:53 crc kubenswrapper[4669]: I1008 21:00:53.236599 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"c7d7bc9d-0611-4434-86a2-7c39ab8a86ab\") " pod="openstack/glance-default-internal-api-0"
Oct 08 21:00:53 crc kubenswrapper[4669]: I1008 21:00:53.236625 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7d7bc9d-0611-4434-86a2-7c39ab8a86ab-logs\") pod \"glance-default-internal-api-0\" (UID: \"c7d7bc9d-0611-4434-86a2-7c39ab8a86ab\") " pod="openstack/glance-default-internal-api-0"
Oct 08 21:00:53 crc kubenswrapper[4669]: I1008 21:00:53.236661 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7d7bc9d-0611-4434-86a2-7c39ab8a86ab-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c7d7bc9d-0611-4434-86a2-7c39ab8a86ab\") " pod="openstack/glance-default-internal-api-0"
Oct 08 21:00:53 crc kubenswrapper[4669]: I1008 21:00:53.236734 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7d7bc9d-0611-4434-86a2-7c39ab8a86ab-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c7d7bc9d-0611-4434-86a2-7c39ab8a86ab\") " pod="openstack/glance-default-internal-api-0"
Oct 08 21:00:53 crc kubenswrapper[4669]: I1008 21:00:53.236767 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7d7bc9d-0611-4434-86a2-7c39ab8a86ab-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c7d7bc9d-0611-4434-86a2-7c39ab8a86ab\") " pod="openstack/glance-default-internal-api-0"
Oct 08 21:00:53 crc kubenswrapper[4669]: I1008 21:00:53.236812 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zxmq\" (UniqueName: \"kubernetes.io/projected/c7d7bc9d-0611-4434-86a2-7c39ab8a86ab-kube-api-access-5zxmq\") pod \"glance-default-internal-api-0\" (UID: \"c7d7bc9d-0611-4434-86a2-7c39ab8a86ab\") " pod="openstack/glance-default-internal-api-0"
Oct 08 21:00:53 crc kubenswrapper[4669]: I1008 21:00:53.240742 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7d7bc9d-0611-4434-86a2-7c39ab8a86ab-logs\") pod \"glance-default-internal-api-0\" (UID: \"c7d7bc9d-0611-4434-86a2-7c39ab8a86ab\") " pod="openstack/glance-default-internal-api-0"
Oct 08 21:00:53 crc kubenswrapper[4669]: I1008 21:00:53.241056 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7d7bc9d-0611-4434-86a2-7c39ab8a86ab-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c7d7bc9d-0611-4434-86a2-7c39ab8a86ab\") " pod="openstack/glance-default-internal-api-0"
Oct 08 21:00:53 crc kubenswrapper[4669]: I1008 21:00:53.242245 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7d7bc9d-0611-4434-86a2-7c39ab8a86ab-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c7d7bc9d-0611-4434-86a2-7c39ab8a86ab\") " pod="openstack/glance-default-internal-api-0"
Oct 08 21:00:53 crc kubenswrapper[4669]: I1008 21:00:53.244513 4669 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"c7d7bc9d-0611-4434-86a2-7c39ab8a86ab\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0"
Oct 08 21:00:53 crc kubenswrapper[4669]: I1008 21:00:53.248751 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7d7bc9d-0611-4434-86a2-7c39ab8a86ab-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c7d7bc9d-0611-4434-86a2-7c39ab8a86ab\") " pod="openstack/glance-default-internal-api-0"
Oct 08 21:00:53 crc kubenswrapper[4669]: I1008 21:00:53.249284 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7d7bc9d-0611-4434-86a2-7c39ab8a86ab-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c7d7bc9d-0611-4434-86a2-7c39ab8a86ab\") " pod="openstack/glance-default-internal-api-0"
Oct 08 21:00:53 crc kubenswrapper[4669]: I1008 21:00:53.249401 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7d7bc9d-0611-4434-86a2-7c39ab8a86ab-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c7d7bc9d-0611-4434-86a2-7c39ab8a86ab\") " pod="openstack/glance-default-internal-api-0"
Oct 08 21:00:53 crc kubenswrapper[4669]: I1008 21:00:53.311521 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zxmq\" (UniqueName: \"kubernetes.io/projected/c7d7bc9d-0611-4434-86a2-7c39ab8a86ab-kube-api-access-5zxmq\") pod \"glance-default-internal-api-0\" (UID: \"c7d7bc9d-0611-4434-86a2-7c39ab8a86ab\") " pod="openstack/glance-default-internal-api-0"
Oct 08 21:00:53 crc kubenswrapper[4669]: I1008 21:00:53.370473 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"c7d7bc9d-0611-4434-86a2-7c39ab8a86ab\") " pod="openstack/glance-default-internal-api-0"
Oct 08 21:00:53 crc kubenswrapper[4669]: W1008 21:00:53.370946 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42ef52de_f02e_447a_8713_16ce12443117.slice/crio-6b5f54fe2c0ed7c01a210402571e1fac72e196465ed3533d10cb2c2cd0262574 WatchSource:0}: Error finding container 6b5f54fe2c0ed7c01a210402571e1fac72e196465ed3533d10cb2c2cd0262574: Status 404 returned error can't find the container with id 6b5f54fe2c0ed7c01a210402571e1fac72e196465ed3533d10cb2c2cd0262574
Oct 08 21:00:53 crc kubenswrapper[4669]: I1008 21:00:53.371587 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="416fb1ca-1386-410b-aaaa-3e0ec724d461" path="/var/lib/kubelet/pods/416fb1ca-1386-410b-aaaa-3e0ec724d461/volumes"
Oct 08 21:00:53 crc kubenswrapper[4669]: I1008 21:00:53.374103 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79cb9326-1639-44ff-973d-adbce6c8098b" path="/var/lib/kubelet/pods/79cb9326-1639-44ff-973d-adbce6c8098b/volumes"
Oct 08 21:00:53 crc kubenswrapper[4669]: I1008 21:00:53.375154 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd" path="/var/lib/kubelet/pods/9b8a0c57-b2ab-40f6-b1d3-2f69446ad0cd/volumes"
Oct 08 21:00:53 crc kubenswrapper[4669]: I1008 21:00:53.377025 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d" path="/var/lib/kubelet/pods/9e3ead0b-4fdd-4e4d-8bd5-9e721bc1588d/volumes"
Oct 08 21:00:53 crc kubenswrapper[4669]: I1008 21:00:53.377795 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b94dc5a1-30ee-4ce3-807d-51f22cb32b5f" path="/var/lib/kubelet/pods/b94dc5a1-30ee-4ce3-807d-51f22cb32b5f/volumes"
Oct 08 21:00:53 crc kubenswrapper[4669]: I1008 21:00:53.378677 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e916b41d-d3e7-43b8-b550-8ad07a4cf147" path="/var/lib/kubelet/pods/e916b41d-d3e7-43b8-b550-8ad07a4cf147/volumes"
Oct 08 21:00:53 crc kubenswrapper[4669]: I1008 21:00:53.385999 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-q5btg"]
Oct 08 21:00:53 crc kubenswrapper[4669]: I1008 21:00:53.397946 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-82f4-account-create-z8lzz"
Oct 08 21:00:53 crc kubenswrapper[4669]: I1008 21:00:53.409285 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 08 21:00:53 crc kubenswrapper[4669]: I1008 21:00:53.546766 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fmz7\" (UniqueName: \"kubernetes.io/projected/af60e370-8287-408b-af8c-fc0d5c19e37d-kube-api-access-2fmz7\") pod \"af60e370-8287-408b-af8c-fc0d5c19e37d\" (UID: \"af60e370-8287-408b-af8c-fc0d5c19e37d\") "
Oct 08 21:00:53 crc kubenswrapper[4669]: I1008 21:00:53.554233 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af60e370-8287-408b-af8c-fc0d5c19e37d-kube-api-access-2fmz7" (OuterVolumeSpecName: "kube-api-access-2fmz7") pod "af60e370-8287-408b-af8c-fc0d5c19e37d" (UID: "af60e370-8287-408b-af8c-fc0d5c19e37d"). InnerVolumeSpecName "kube-api-access-2fmz7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 21:00:53 crc kubenswrapper[4669]: I1008 21:00:53.650866 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fmz7\" (UniqueName: \"kubernetes.io/projected/af60e370-8287-408b-af8c-fc0d5c19e37d-kube-api-access-2fmz7\") on node \"crc\" DevicePath \"\""
Oct 08 21:00:53 crc kubenswrapper[4669]: I1008 21:00:53.688699 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 08 21:00:53 crc kubenswrapper[4669]: I1008 21:00:53.903054 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-q5btg" event={"ID":"42ef52de-f02e-447a-8713-16ce12443117","Type":"ContainerStarted","Data":"8da2c877511a8683c6848823d437a4855d3eaa59ff371446d7dcbe07bb26d5cd"}
Oct 08 21:00:53 crc kubenswrapper[4669]: I1008 21:00:53.903105 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-q5btg" event={"ID":"42ef52de-f02e-447a-8713-16ce12443117","Type":"ContainerStarted","Data":"6b5f54fe2c0ed7c01a210402571e1fac72e196465ed3533d10cb2c2cd0262574"}
Oct 08 21:00:53 crc kubenswrapper[4669]: I1008 21:00:53.906917 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-82f4-account-create-z8lzz"
Oct 08 21:00:53 crc kubenswrapper[4669]: I1008 21:00:53.907022 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-82f4-account-create-z8lzz" event={"ID":"af60e370-8287-408b-af8c-fc0d5c19e37d","Type":"ContainerDied","Data":"4a662a6b3a1dbd50857abdd16528143f49691a06bc6b3ec8c07df54a98f5964b"}
Oct 08 21:00:53 crc kubenswrapper[4669]: I1008 21:00:53.907058 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a662a6b3a1dbd50857abdd16528143f49691a06bc6b3ec8c07df54a98f5964b"
Oct 08 21:00:53 crc kubenswrapper[4669]: I1008 21:00:53.928144 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-q5btg" podStartSLOduration=1.928128048 podStartE2EDuration="1.928128048s" podCreationTimestamp="2025-10-08 21:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:00:53.924990092 +0000 UTC m=+973.617800765" watchObservedRunningTime="2025-10-08 21:00:53.928128048 +0000 UTC m=+973.620938721"
Oct 08 21:00:54 crc kubenswrapper[4669]: I1008 21:00:54.020678 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-j5jf6"]
Oct 08 21:00:54 crc kubenswrapper[4669]: E1008 21:00:54.021399 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af60e370-8287-408b-af8c-fc0d5c19e37d" containerName="mariadb-account-create"
Oct 08 21:00:54 crc kubenswrapper[4669]: I1008 21:00:54.021424 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="af60e370-8287-408b-af8c-fc0d5c19e37d" containerName="mariadb-account-create"
Oct 08 21:00:54 crc kubenswrapper[4669]: I1008 21:00:54.021839 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="af60e370-8287-408b-af8c-fc0d5c19e37d" containerName="mariadb-account-create"
Oct 08 21:00:54 crc kubenswrapper[4669]: I1008 21:00:54.022801 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-j5jf6"
Oct 08 21:00:54 crc kubenswrapper[4669]: I1008 21:00:54.027399 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Oct 08 21:00:54 crc kubenswrapper[4669]: I1008 21:00:54.027895 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-tkkrj"
Oct 08 21:00:54 crc kubenswrapper[4669]: I1008 21:00:54.028720 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Oct 08 21:00:54 crc kubenswrapper[4669]: I1008 21:00:54.037334 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-rflnq"]
Oct 08 21:00:54 crc kubenswrapper[4669]: I1008 21:00:54.038831 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-rflnq"
Oct 08 21:00:54 crc kubenswrapper[4669]: I1008 21:00:54.051438 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-j5jf6"]
Oct 08 21:00:54 crc kubenswrapper[4669]: I1008 21:00:54.052150 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Oct 08 21:00:54 crc kubenswrapper[4669]: I1008 21:00:54.052177 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-fm2j5"
Oct 08 21:00:54 crc kubenswrapper[4669]: I1008 21:00:54.075040 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-rflnq"]
Oct 08 21:00:54 crc kubenswrapper[4669]: I1008 21:00:54.089660 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 08 21:00:54 crc kubenswrapper[4669]: I1008 21:00:54.163894 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14b56cd8-5692-4a65-b8ad-1e39bf253846-scripts\") pod \"cinder-db-sync-j5jf6\" (UID: \"14b56cd8-5692-4a65-b8ad-1e39bf253846\") " pod="openstack/cinder-db-sync-j5jf6"
Oct 08 21:00:54 crc kubenswrapper[4669]: I1008 21:00:54.164185 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbdb37fe-4acd-400c-aff5-db2a90f07a32-combined-ca-bundle\") pod \"barbican-db-sync-rflnq\" (UID: \"fbdb37fe-4acd-400c-aff5-db2a90f07a32\") " pod="openstack/barbican-db-sync-rflnq"
Oct 08 21:00:54 crc kubenswrapper[4669]: I1008 21:00:54.164218 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/14b56cd8-5692-4a65-b8ad-1e39bf253846-etc-machine-id\") pod \"cinder-db-sync-j5jf6\" (UID: \"14b56cd8-5692-4a65-b8ad-1e39bf253846\") " pod="openstack/cinder-db-sync-j5jf6"
Oct 08 21:00:54 crc kubenswrapper[4669]: I1008 21:00:54.164261 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/14b56cd8-5692-4a65-b8ad-1e39bf253846-db-sync-config-data\") pod \"cinder-db-sync-j5jf6\" (UID: \"14b56cd8-5692-4a65-b8ad-1e39bf253846\") " pod="openstack/cinder-db-sync-j5jf6"
Oct 08 21:00:54 crc kubenswrapper[4669]: I1008 21:00:54.164322 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9rdj\" (UniqueName: \"kubernetes.io/projected/fbdb37fe-4acd-400c-aff5-db2a90f07a32-kube-api-access-g9rdj\") pod \"barbican-db-sync-rflnq\" (UID: \"fbdb37fe-4acd-400c-aff5-db2a90f07a32\") " pod="openstack/barbican-db-sync-rflnq"
Oct 08 21:00:54 crc kubenswrapper[4669]: I1008 21:00:54.164343 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14b56cd8-5692-4a65-b8ad-1e39bf253846-config-data\") pod \"cinder-db-sync-j5jf6\" (UID: \"14b56cd8-5692-4a65-b8ad-1e39bf253846\") " pod="openstack/cinder-db-sync-j5jf6"
Oct 08 21:00:54 crc kubenswrapper[4669]: I1008 21:00:54.164377 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lskl8\" (UniqueName: \"kubernetes.io/projected/14b56cd8-5692-4a65-b8ad-1e39bf253846-kube-api-access-lskl8\") pod \"cinder-db-sync-j5jf6\" (UID: \"14b56cd8-5692-4a65-b8ad-1e39bf253846\") " pod="openstack/cinder-db-sync-j5jf6"
Oct 08 21:00:54 crc kubenswrapper[4669]: I1008 21:00:54.164394 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fbdb37fe-4acd-400c-aff5-db2a90f07a32-db-sync-config-data\") pod \"barbican-db-sync-rflnq\" (UID: \"fbdb37fe-4acd-400c-aff5-db2a90f07a32\") " pod="openstack/barbican-db-sync-rflnq"
Oct 08 21:00:54 crc kubenswrapper[4669]: I1008 21:00:54.164420 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14b56cd8-5692-4a65-b8ad-1e39bf253846-combined-ca-bundle\") pod \"cinder-db-sync-j5jf6\" (UID: \"14b56cd8-5692-4a65-b8ad-1e39bf253846\") " pod="openstack/cinder-db-sync-j5jf6"
Oct 08 21:00:54 crc kubenswrapper[4669]: I1008 21:00:54.265193 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14b56cd8-5692-4a65-b8ad-1e39bf253846-scripts\") pod \"cinder-db-sync-j5jf6\" (UID: \"14b56cd8-5692-4a65-b8ad-1e39bf253846\") " pod="openstack/cinder-db-sync-j5jf6"
Oct 08 21:00:54 crc kubenswrapper[4669]: I1008 21:00:54.265421 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbdb37fe-4acd-400c-aff5-db2a90f07a32-combined-ca-bundle\") pod \"barbican-db-sync-rflnq\" (UID: \"fbdb37fe-4acd-400c-aff5-db2a90f07a32\") " pod="openstack/barbican-db-sync-rflnq"
Oct 08 21:00:54 crc kubenswrapper[4669]: I1008 21:00:54.265507 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/14b56cd8-5692-4a65-b8ad-1e39bf253846-etc-machine-id\") pod \"cinder-db-sync-j5jf6\" (UID: \"14b56cd8-5692-4a65-b8ad-1e39bf253846\") " pod="openstack/cinder-db-sync-j5jf6"
Oct 08 21:00:54 crc kubenswrapper[4669]: I1008 21:00:54.265820 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/14b56cd8-5692-4a65-b8ad-1e39bf253846-db-sync-config-data\") pod \"cinder-db-sync-j5jf6\" (UID: \"14b56cd8-5692-4a65-b8ad-1e39bf253846\") " pod="openstack/cinder-db-sync-j5jf6"
Oct 08 21:00:54 crc kubenswrapper[4669]: I1008 21:00:54.265945 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9rdj\" (UniqueName: \"kubernetes.io/projected/fbdb37fe-4acd-400c-aff5-db2a90f07a32-kube-api-access-g9rdj\") pod \"barbican-db-sync-rflnq\" (UID: \"fbdb37fe-4acd-400c-aff5-db2a90f07a32\") " pod="openstack/barbican-db-sync-rflnq"
Oct 08 21:00:54 crc kubenswrapper[4669]: I1008 21:00:54.266262 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14b56cd8-5692-4a65-b8ad-1e39bf253846-config-data\") pod \"cinder-db-sync-j5jf6\" (UID: \"14b56cd8-5692-4a65-b8ad-1e39bf253846\") " pod="openstack/cinder-db-sync-j5jf6"
Oct 08 21:00:54 crc kubenswrapper[4669]: I1008 21:00:54.266388 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lskl8\" (UniqueName: \"kubernetes.io/projected/14b56cd8-5692-4a65-b8ad-1e39bf253846-kube-api-access-lskl8\") pod \"cinder-db-sync-j5jf6\" (UID: \"14b56cd8-5692-4a65-b8ad-1e39bf253846\") " pod="openstack/cinder-db-sync-j5jf6"
Oct 08 21:00:54 crc kubenswrapper[4669]: I1008 21:00:54.266479 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fbdb37fe-4acd-400c-aff5-db2a90f07a32-db-sync-config-data\") pod \"barbican-db-sync-rflnq\" (UID: \"fbdb37fe-4acd-400c-aff5-db2a90f07a32\") " pod="openstack/barbican-db-sync-rflnq"
Oct 08 21:00:54 crc kubenswrapper[4669]: I1008 21:00:54.266648 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14b56cd8-5692-4a65-b8ad-1e39bf253846-combined-ca-bundle\") pod \"cinder-db-sync-j5jf6\" (UID: \"14b56cd8-5692-4a65-b8ad-1e39bf253846\") " pod="openstack/cinder-db-sync-j5jf6"
Oct 08 21:00:54 crc kubenswrapper[4669]: I1008 21:00:54.266041 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/14b56cd8-5692-4a65-b8ad-1e39bf253846-etc-machine-id\") pod \"cinder-db-sync-j5jf6\" (UID: \"14b56cd8-5692-4a65-b8ad-1e39bf253846\") " pod="openstack/cinder-db-sync-j5jf6"
Oct 08 21:00:54 crc kubenswrapper[4669]: I1008 21:00:54.271355 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14b56cd8-5692-4a65-b8ad-1e39bf253846-combined-ca-bundle\") pod \"cinder-db-sync-j5jf6\" (UID: \"14b56cd8-5692-4a65-b8ad-1e39bf253846\") " pod="openstack/cinder-db-sync-j5jf6"
Oct 08 21:00:54 crc kubenswrapper[4669]: I1008 21:00:54.271419 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/14b56cd8-5692-4a65-b8ad-1e39bf253846-db-sync-config-data\") pod \"cinder-db-sync-j5jf6\" (UID: \"14b56cd8-5692-4a65-b8ad-1e39bf253846\") " pod="openstack/cinder-db-sync-j5jf6"
Oct 08 21:00:54 crc kubenswrapper[4669]: I1008 21:00:54.271501 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14b56cd8-5692-4a65-b8ad-1e39bf253846-scripts\") pod \"cinder-db-sync-j5jf6\" (UID: \"14b56cd8-5692-4a65-b8ad-1e39bf253846\") " pod="openstack/cinder-db-sync-j5jf6"
Oct 08 21:00:54 crc kubenswrapper[4669]: I1008 21:00:54.271913 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14b56cd8-5692-4a65-b8ad-1e39bf253846-config-data\") pod \"cinder-db-sync-j5jf6\" (UID: \"14b56cd8-5692-4a65-b8ad-1e39bf253846\") " pod="openstack/cinder-db-sync-j5jf6"
Oct 08 21:00:54 crc kubenswrapper[4669]: I1008 21:00:54.283920 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbdb37fe-4acd-400c-aff5-db2a90f07a32-combined-ca-bundle\") pod \"barbican-db-sync-rflnq\" (UID: \"fbdb37fe-4acd-400c-aff5-db2a90f07a32\") " pod="openstack/barbican-db-sync-rflnq"
Oct 08 21:00:54 crc kubenswrapper[4669]: I1008 21:00:54.284250 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fbdb37fe-4acd-400c-aff5-db2a90f07a32-db-sync-config-data\") pod \"barbican-db-sync-rflnq\" (UID: \"fbdb37fe-4acd-400c-aff5-db2a90f07a32\") " pod="openstack/barbican-db-sync-rflnq"
Oct 08 21:00:54 crc kubenswrapper[4669]: I1008 21:00:54.287351 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lskl8\" (UniqueName: \"kubernetes.io/projected/14b56cd8-5692-4a65-b8ad-1e39bf253846-kube-api-access-lskl8\") pod \"cinder-db-sync-j5jf6\" (UID: \"14b56cd8-5692-4a65-b8ad-1e39bf253846\") " pod="openstack/cinder-db-sync-j5jf6"
Oct 08 21:00:54 crc kubenswrapper[4669]: I1008 21:00:54.287838 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9rdj\" (UniqueName: \"kubernetes.io/projected/fbdb37fe-4acd-400c-aff5-db2a90f07a32-kube-api-access-g9rdj\") pod \"barbican-db-sync-rflnq\" (UID: \"fbdb37fe-4acd-400c-aff5-db2a90f07a32\") " pod="openstack/barbican-db-sync-rflnq"
Oct 08 21:00:54 crc kubenswrapper[4669]: I1008 21:00:54.371851 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-j5jf6"
Oct 08 21:00:54 crc kubenswrapper[4669]: I1008 21:00:54.381064 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-rflnq"
Oct 08 21:00:54 crc kubenswrapper[4669]: I1008 21:00:54.404858 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6cd548c4f4-74w5s"
Oct 08 21:00:54 crc kubenswrapper[4669]: I1008 21:00:54.405161 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6cd548c4f4-74w5s"
Oct 08 21:00:54 crc kubenswrapper[4669]: I1008 21:00:54.523004 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5fc57f5668-z5dzm"
Oct 08 21:00:54 crc kubenswrapper[4669]: I1008 21:00:54.523070 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5fc57f5668-z5dzm"
Oct 08 21:00:56 crc kubenswrapper[4669]: W1008 21:00:56.486895 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7d7bc9d_0611_4434_86a2_7c39ab8a86ab.slice/crio-03792e8096b3f386204bfeffdbc75d0b11d19f791dd94fe7fdf6c350c124144e WatchSource:0}: Error finding container 03792e8096b3f386204bfeffdbc75d0b11d19f791dd94fe7fdf6c350c124144e: Status 404 returned error can't find the container with id 03792e8096b3f386204bfeffdbc75d0b11d19f791dd94fe7fdf6c350c124144e
Oct 08 21:00:56 crc kubenswrapper[4669]: I1008 21:00:56.937736 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1c34a743-b5df-4b29-847f-521f7086fa81","Type":"ContainerStarted","Data":"9801dac8d0f73897d33b0c2b9f725769f2b58561e7b87c53152914b0524e9bbf"}
Oct 08 21:00:56 crc kubenswrapper[4669]: I1008 21:00:56.942565 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c7d7bc9d-0611-4434-86a2-7c39ab8a86ab","Type":"ContainerStarted","Data":"03792e8096b3f386204bfeffdbc75d0b11d19f791dd94fe7fdf6c350c124144e"}
Oct 08 21:00:57 crc kubenswrapper[4669]: I1008 21:00:57.069761 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-rflnq"]
Oct 08 21:00:57 crc kubenswrapper[4669]: I1008 21:00:57.145133 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-j5jf6"]
Oct 08 21:00:57 crc kubenswrapper[4669]: W1008 21:00:57.152990 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14b56cd8_5692_4a65_b8ad_1e39bf253846.slice/crio-c7d1cc9386ab3b92a4931c13ef8cae530e99ac0462fd182d311d2bf28a18131a WatchSource:0}: Error finding container c7d1cc9386ab3b92a4931c13ef8cae530e99ac0462fd182d311d2bf28a18131a: Status 404 returned error can't find the container with id c7d1cc9386ab3b92a4931c13ef8cae530e99ac0462fd182d311d2bf28a18131a
Oct 08 21:00:57 crc kubenswrapper[4669]: I1008 21:00:57.730711 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6d68bc8f57-c6jv4"
Oct 08 21:00:57 crc kubenswrapper[4669]: I1008 21:00:57.952628 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-rflnq" event={"ID":"fbdb37fe-4acd-400c-aff5-db2a90f07a32","Type":"ContainerStarted","Data":"4605bdfb92530e52c9993533cb023224eba5b49a1a6d75532e64e0f9a535f0e8"}
Oct 08 21:00:57 crc kubenswrapper[4669]: I1008 21:00:57.953632 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-j5jf6" event={"ID":"14b56cd8-5692-4a65-b8ad-1e39bf253846","Type":"ContainerStarted","Data":"c7d1cc9386ab3b92a4931c13ef8cae530e99ac0462fd182d311d2bf28a18131a"}
Oct 08 21:00:57 crc kubenswrapper[4669]: I1008 21:00:57.954843 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1c34a743-b5df-4b29-847f-521f7086fa81","Type":"ContainerStarted","Data":"41a0dd29617e88f9d18a558336ce519f0f858cb475265b82d7475c198ff3b493"}
Oct 08 21:00:57 crc kubenswrapper[4669]: I1008 21:00:57.956372 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c7d7bc9d-0611-4434-86a2-7c39ab8a86ab","Type":"ContainerStarted","Data":"bd10f33f850129cd398e2d48ffb557d3d17770c379af7dee219d67831f475921"}
Oct 08 21:00:58 crc kubenswrapper[4669]: I1008 21:00:58.975305 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1c34a743-b5df-4b29-847f-521f7086fa81","Type":"ContainerStarted","Data":"e2d48d5cc296d13b7707a3140b030582aa094b3374277399c2e138ec3fefa784"}
Oct 08 21:00:58 crc kubenswrapper[4669]: I1008 21:00:58.981354 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e462fc4e-635f-4e2e-88c0-43f1af0dc648","Type":"ContainerStarted","Data":"bd567b9cca2ae6188364d4b08396258f2960d08287b1bbe923d3d8e23056f3e8"}
Oct 08 21:00:58 crc kubenswrapper[4669]: I1008 21:00:58.987946 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c7d7bc9d-0611-4434-86a2-7c39ab8a86ab","Type":"ContainerStarted","Data":"786145395f9c0a079cd597228d501a90e1877725dffaadb6a783c5e45825e93e"}
Oct 08 21:00:59 crc kubenswrapper[4669]: I1008 21:00:59.010139 4669 generic.go:334] "Generic (PLEG): container finished" podID="42ef52de-f02e-447a-8713-16ce12443117" containerID="8da2c877511a8683c6848823d437a4855d3eaa59ff371446d7dcbe07bb26d5cd" exitCode=0
Oct 08 21:00:59 crc kubenswrapper[4669]: I1008 21:00:59.010200 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-q5btg" event={"ID":"42ef52de-f02e-447a-8713-16ce12443117","Type":"ContainerDied","Data":"8da2c877511a8683c6848823d437a4855d3eaa59ff371446d7dcbe07bb26d5cd"}
Oct 08 21:00:59 crc kubenswrapper[4669]: I1008 21:00:59.012626 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.012610189 podStartE2EDuration="7.012610189s" podCreationTimestamp="2025-10-08 21:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:00:59.001833124 +0000 UTC m=+978.694643807" watchObservedRunningTime="2025-10-08 21:00:59.012610189 +0000 UTC m=+978.705420862"
Oct 08 21:00:59 crc kubenswrapper[4669]: I1008 21:00:59.040953 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.040932513 podStartE2EDuration="7.040932513s" podCreationTimestamp="2025-10-08 21:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:00:59.032461392 +0000 UTC m=+978.725272075" watchObservedRunningTime="2025-10-08 21:00:59.040932513 +0000 UTC m=+978.733743186"
Oct 08 21:00:59 crc kubenswrapper[4669]: I1008 21:00:59.314368 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-vcvj2"]
Oct 08 21:00:59 crc kubenswrapper[4669]: I1008 21:00:59.315384 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-vcvj2"
Oct 08 21:00:59 crc kubenswrapper[4669]: I1008 21:00:59.322517 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-wljg5"
Oct 08 21:00:59 crc kubenswrapper[4669]: I1008 21:00:59.322769 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Oct 08 21:00:59 crc kubenswrapper[4669]: I1008 21:00:59.323103 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Oct 08 21:00:59 crc kubenswrapper[4669]: I1008 21:00:59.329663 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-vcvj2"]
Oct 08 21:00:59 crc kubenswrapper[4669]: I1008 21:00:59.465403 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8a9dde7a-05ec-4edb-b865-8e16680527a5-config\") pod \"neutron-db-sync-vcvj2\" (UID: \"8a9dde7a-05ec-4edb-b865-8e16680527a5\") " pod="openstack/neutron-db-sync-vcvj2"
Oct 08 21:00:59 crc kubenswrapper[4669]: I1008 21:00:59.465750 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a9dde7a-05ec-4edb-b865-8e16680527a5-combined-ca-bundle\") pod \"neutron-db-sync-vcvj2\" (UID: \"8a9dde7a-05ec-4edb-b865-8e16680527a5\") " pod="openstack/neutron-db-sync-vcvj2"
Oct 08 21:00:59 crc kubenswrapper[4669]: I1008 21:00:59.465779 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c68mf\" (UniqueName: \"kubernetes.io/projected/8a9dde7a-05ec-4edb-b865-8e16680527a5-kube-api-access-c68mf\") pod \"neutron-db-sync-vcvj2\" (UID: \"8a9dde7a-05ec-4edb-b865-8e16680527a5\") " pod="openstack/neutron-db-sync-vcvj2"
Oct 08 21:00:59 crc kubenswrapper[4669]: I1008 21:00:59.566951 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a9dde7a-05ec-4edb-b865-8e16680527a5-combined-ca-bundle\") pod \"neutron-db-sync-vcvj2\" (UID: \"8a9dde7a-05ec-4edb-b865-8e16680527a5\") " pod="openstack/neutron-db-sync-vcvj2"
Oct 08 21:00:59 crc kubenswrapper[4669]: I1008 21:00:59.566996 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c68mf\" (UniqueName: \"kubernetes.io/projected/8a9dde7a-05ec-4edb-b865-8e16680527a5-kube-api-access-c68mf\") pod \"neutron-db-sync-vcvj2\" (UID: \"8a9dde7a-05ec-4edb-b865-8e16680527a5\") " pod="openstack/neutron-db-sync-vcvj2"
Oct 08 21:00:59 crc kubenswrapper[4669]: I1008 21:00:59.567102 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8a9dde7a-05ec-4edb-b865-8e16680527a5-config\") pod \"neutron-db-sync-vcvj2\" (UID: \"8a9dde7a-05ec-4edb-b865-8e16680527a5\") " pod="openstack/neutron-db-sync-vcvj2"
Oct 08 21:00:59 crc kubenswrapper[4669]: I1008 21:00:59.573265 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8a9dde7a-05ec-4edb-b865-8e16680527a5-config\") pod \"neutron-db-sync-vcvj2\" (UID: \"8a9dde7a-05ec-4edb-b865-8e16680527a5\") " pod="openstack/neutron-db-sync-vcvj2"
Oct 08 21:00:59 crc kubenswrapper[4669]: I1008 21:00:59.585384 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a9dde7a-05ec-4edb-b865-8e16680527a5-combined-ca-bundle\") pod \"neutron-db-sync-vcvj2\" (UID: \"8a9dde7a-05ec-4edb-b865-8e16680527a5\") " pod="openstack/neutron-db-sync-vcvj2"
Oct 08 21:00:59 crc kubenswrapper[4669]: I1008 21:00:59.606222 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c68mf\" (UniqueName:
\"kubernetes.io/projected/8a9dde7a-05ec-4edb-b865-8e16680527a5-kube-api-access-c68mf\") pod \"neutron-db-sync-vcvj2\" (UID: \"8a9dde7a-05ec-4edb-b865-8e16680527a5\") " pod="openstack/neutron-db-sync-vcvj2" Oct 08 21:00:59 crc kubenswrapper[4669]: I1008 21:00:59.664775 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-vcvj2" Oct 08 21:01:00 crc kubenswrapper[4669]: I1008 21:01:00.178120 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-vcvj2"] Oct 08 21:01:00 crc kubenswrapper[4669]: I1008 21:01:00.351016 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-q5btg" Oct 08 21:01:00 crc kubenswrapper[4669]: I1008 21:01:00.486250 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgh8n\" (UniqueName: \"kubernetes.io/projected/42ef52de-f02e-447a-8713-16ce12443117-kube-api-access-bgh8n\") pod \"42ef52de-f02e-447a-8713-16ce12443117\" (UID: \"42ef52de-f02e-447a-8713-16ce12443117\") " Oct 08 21:01:00 crc kubenswrapper[4669]: I1008 21:01:00.486550 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42ef52de-f02e-447a-8713-16ce12443117-config-data\") pod \"42ef52de-f02e-447a-8713-16ce12443117\" (UID: \"42ef52de-f02e-447a-8713-16ce12443117\") " Oct 08 21:01:00 crc kubenswrapper[4669]: I1008 21:01:00.486589 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/42ef52de-f02e-447a-8713-16ce12443117-fernet-keys\") pod \"42ef52de-f02e-447a-8713-16ce12443117\" (UID: \"42ef52de-f02e-447a-8713-16ce12443117\") " Oct 08 21:01:00 crc kubenswrapper[4669]: I1008 21:01:00.486789 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/42ef52de-f02e-447a-8713-16ce12443117-scripts\") pod \"42ef52de-f02e-447a-8713-16ce12443117\" (UID: \"42ef52de-f02e-447a-8713-16ce12443117\") " Oct 08 21:01:00 crc kubenswrapper[4669]: I1008 21:01:00.486816 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42ef52de-f02e-447a-8713-16ce12443117-combined-ca-bundle\") pod \"42ef52de-f02e-447a-8713-16ce12443117\" (UID: \"42ef52de-f02e-447a-8713-16ce12443117\") " Oct 08 21:01:00 crc kubenswrapper[4669]: I1008 21:01:00.486847 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/42ef52de-f02e-447a-8713-16ce12443117-credential-keys\") pod \"42ef52de-f02e-447a-8713-16ce12443117\" (UID: \"42ef52de-f02e-447a-8713-16ce12443117\") " Oct 08 21:01:00 crc kubenswrapper[4669]: I1008 21:01:00.493253 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42ef52de-f02e-447a-8713-16ce12443117-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "42ef52de-f02e-447a-8713-16ce12443117" (UID: "42ef52de-f02e-447a-8713-16ce12443117"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:01:00 crc kubenswrapper[4669]: I1008 21:01:00.494487 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42ef52de-f02e-447a-8713-16ce12443117-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "42ef52de-f02e-447a-8713-16ce12443117" (UID: "42ef52de-f02e-447a-8713-16ce12443117"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:01:00 crc kubenswrapper[4669]: I1008 21:01:00.494737 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42ef52de-f02e-447a-8713-16ce12443117-kube-api-access-bgh8n" (OuterVolumeSpecName: "kube-api-access-bgh8n") pod "42ef52de-f02e-447a-8713-16ce12443117" (UID: "42ef52de-f02e-447a-8713-16ce12443117"). InnerVolumeSpecName "kube-api-access-bgh8n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:01:00 crc kubenswrapper[4669]: I1008 21:01:00.495015 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42ef52de-f02e-447a-8713-16ce12443117-scripts" (OuterVolumeSpecName: "scripts") pod "42ef52de-f02e-447a-8713-16ce12443117" (UID: "42ef52de-f02e-447a-8713-16ce12443117"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:01:00 crc kubenswrapper[4669]: I1008 21:01:00.524138 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42ef52de-f02e-447a-8713-16ce12443117-config-data" (OuterVolumeSpecName: "config-data") pod "42ef52de-f02e-447a-8713-16ce12443117" (UID: "42ef52de-f02e-447a-8713-16ce12443117"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:01:00 crc kubenswrapper[4669]: I1008 21:01:00.526710 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42ef52de-f02e-447a-8713-16ce12443117-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "42ef52de-f02e-447a-8713-16ce12443117" (UID: "42ef52de-f02e-447a-8713-16ce12443117"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:01:00 crc kubenswrapper[4669]: I1008 21:01:00.589061 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42ef52de-f02e-447a-8713-16ce12443117-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:00 crc kubenswrapper[4669]: I1008 21:01:00.589094 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42ef52de-f02e-447a-8713-16ce12443117-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:00 crc kubenswrapper[4669]: I1008 21:01:00.589106 4669 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/42ef52de-f02e-447a-8713-16ce12443117-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:00 crc kubenswrapper[4669]: I1008 21:01:00.589114 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgh8n\" (UniqueName: \"kubernetes.io/projected/42ef52de-f02e-447a-8713-16ce12443117-kube-api-access-bgh8n\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:00 crc kubenswrapper[4669]: I1008 21:01:00.589124 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42ef52de-f02e-447a-8713-16ce12443117-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:00 crc kubenswrapper[4669]: I1008 21:01:00.589132 4669 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/42ef52de-f02e-447a-8713-16ce12443117-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:01 crc kubenswrapper[4669]: I1008 21:01:01.038261 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-q5btg" event={"ID":"42ef52de-f02e-447a-8713-16ce12443117","Type":"ContainerDied","Data":"6b5f54fe2c0ed7c01a210402571e1fac72e196465ed3533d10cb2c2cd0262574"} Oct 08 21:01:01 crc kubenswrapper[4669]: I1008 
21:01:01.038324 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b5f54fe2c0ed7c01a210402571e1fac72e196465ed3533d10cb2c2cd0262574" Oct 08 21:01:01 crc kubenswrapper[4669]: I1008 21:01:01.038275 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-q5btg" Oct 08 21:01:01 crc kubenswrapper[4669]: I1008 21:01:01.040146 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vcvj2" event={"ID":"8a9dde7a-05ec-4edb-b865-8e16680527a5","Type":"ContainerStarted","Data":"e5232846acfe858005c26feae8a9f6c2a5d6c16e613e0f54525b1bb9b6aa170f"} Oct 08 21:01:01 crc kubenswrapper[4669]: I1008 21:01:01.040186 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vcvj2" event={"ID":"8a9dde7a-05ec-4edb-b865-8e16680527a5","Type":"ContainerStarted","Data":"fc4839056e5b4d028e18a35e529c5b032fe80a1761e989789461d715a5664eb0"} Oct 08 21:01:01 crc kubenswrapper[4669]: I1008 21:01:01.061085 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-vcvj2" podStartSLOduration=2.061065794 podStartE2EDuration="2.061065794s" podCreationTimestamp="2025-10-08 21:00:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:01:01.051848822 +0000 UTC m=+980.744659495" watchObservedRunningTime="2025-10-08 21:01:01.061065794 +0000 UTC m=+980.753876467" Oct 08 21:01:01 crc kubenswrapper[4669]: I1008 21:01:01.132410 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6877c668b7-ws6lk"] Oct 08 21:01:01 crc kubenswrapper[4669]: E1008 21:01:01.132813 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42ef52de-f02e-447a-8713-16ce12443117" containerName="keystone-bootstrap" Oct 08 21:01:01 crc kubenswrapper[4669]: I1008 21:01:01.132824 4669 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="42ef52de-f02e-447a-8713-16ce12443117" containerName="keystone-bootstrap" Oct 08 21:01:01 crc kubenswrapper[4669]: I1008 21:01:01.133021 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="42ef52de-f02e-447a-8713-16ce12443117" containerName="keystone-bootstrap" Oct 08 21:01:01 crc kubenswrapper[4669]: I1008 21:01:01.133597 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6877c668b7-ws6lk" Oct 08 21:01:01 crc kubenswrapper[4669]: I1008 21:01:01.142451 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 08 21:01:01 crc kubenswrapper[4669]: I1008 21:01:01.142759 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 08 21:01:01 crc kubenswrapper[4669]: I1008 21:01:01.143121 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 08 21:01:01 crc kubenswrapper[4669]: I1008 21:01:01.143234 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-llg6k" Oct 08 21:01:01 crc kubenswrapper[4669]: I1008 21:01:01.143339 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 08 21:01:01 crc kubenswrapper[4669]: I1008 21:01:01.143481 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 08 21:01:01 crc kubenswrapper[4669]: I1008 21:01:01.143886 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6877c668b7-ws6lk"] Oct 08 21:01:01 crc kubenswrapper[4669]: I1008 21:01:01.303647 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d35cc358-f26a-4e29-a61e-cf7e82c331a7-scripts\") pod \"keystone-6877c668b7-ws6lk\" (UID: \"d35cc358-f26a-4e29-a61e-cf7e82c331a7\") " 
pod="openstack/keystone-6877c668b7-ws6lk" Oct 08 21:01:01 crc kubenswrapper[4669]: I1008 21:01:01.303703 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4bq9\" (UniqueName: \"kubernetes.io/projected/d35cc358-f26a-4e29-a61e-cf7e82c331a7-kube-api-access-r4bq9\") pod \"keystone-6877c668b7-ws6lk\" (UID: \"d35cc358-f26a-4e29-a61e-cf7e82c331a7\") " pod="openstack/keystone-6877c668b7-ws6lk" Oct 08 21:01:01 crc kubenswrapper[4669]: I1008 21:01:01.303724 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d35cc358-f26a-4e29-a61e-cf7e82c331a7-credential-keys\") pod \"keystone-6877c668b7-ws6lk\" (UID: \"d35cc358-f26a-4e29-a61e-cf7e82c331a7\") " pod="openstack/keystone-6877c668b7-ws6lk" Oct 08 21:01:01 crc kubenswrapper[4669]: I1008 21:01:01.303739 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d35cc358-f26a-4e29-a61e-cf7e82c331a7-combined-ca-bundle\") pod \"keystone-6877c668b7-ws6lk\" (UID: \"d35cc358-f26a-4e29-a61e-cf7e82c331a7\") " pod="openstack/keystone-6877c668b7-ws6lk" Oct 08 21:01:01 crc kubenswrapper[4669]: I1008 21:01:01.303756 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d35cc358-f26a-4e29-a61e-cf7e82c331a7-public-tls-certs\") pod \"keystone-6877c668b7-ws6lk\" (UID: \"d35cc358-f26a-4e29-a61e-cf7e82c331a7\") " pod="openstack/keystone-6877c668b7-ws6lk" Oct 08 21:01:01 crc kubenswrapper[4669]: I1008 21:01:01.303793 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d35cc358-f26a-4e29-a61e-cf7e82c331a7-fernet-keys\") pod \"keystone-6877c668b7-ws6lk\" (UID: 
\"d35cc358-f26a-4e29-a61e-cf7e82c331a7\") " pod="openstack/keystone-6877c668b7-ws6lk" Oct 08 21:01:01 crc kubenswrapper[4669]: I1008 21:01:01.303881 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d35cc358-f26a-4e29-a61e-cf7e82c331a7-internal-tls-certs\") pod \"keystone-6877c668b7-ws6lk\" (UID: \"d35cc358-f26a-4e29-a61e-cf7e82c331a7\") " pod="openstack/keystone-6877c668b7-ws6lk" Oct 08 21:01:01 crc kubenswrapper[4669]: I1008 21:01:01.303916 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d35cc358-f26a-4e29-a61e-cf7e82c331a7-config-data\") pod \"keystone-6877c668b7-ws6lk\" (UID: \"d35cc358-f26a-4e29-a61e-cf7e82c331a7\") " pod="openstack/keystone-6877c668b7-ws6lk" Oct 08 21:01:01 crc kubenswrapper[4669]: I1008 21:01:01.404847 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4bq9\" (UniqueName: \"kubernetes.io/projected/d35cc358-f26a-4e29-a61e-cf7e82c331a7-kube-api-access-r4bq9\") pod \"keystone-6877c668b7-ws6lk\" (UID: \"d35cc358-f26a-4e29-a61e-cf7e82c331a7\") " pod="openstack/keystone-6877c668b7-ws6lk" Oct 08 21:01:01 crc kubenswrapper[4669]: I1008 21:01:01.405185 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d35cc358-f26a-4e29-a61e-cf7e82c331a7-credential-keys\") pod \"keystone-6877c668b7-ws6lk\" (UID: \"d35cc358-f26a-4e29-a61e-cf7e82c331a7\") " pod="openstack/keystone-6877c668b7-ws6lk" Oct 08 21:01:01 crc kubenswrapper[4669]: I1008 21:01:01.405205 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d35cc358-f26a-4e29-a61e-cf7e82c331a7-combined-ca-bundle\") pod \"keystone-6877c668b7-ws6lk\" (UID: 
\"d35cc358-f26a-4e29-a61e-cf7e82c331a7\") " pod="openstack/keystone-6877c668b7-ws6lk" Oct 08 21:01:01 crc kubenswrapper[4669]: I1008 21:01:01.405223 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d35cc358-f26a-4e29-a61e-cf7e82c331a7-public-tls-certs\") pod \"keystone-6877c668b7-ws6lk\" (UID: \"d35cc358-f26a-4e29-a61e-cf7e82c331a7\") " pod="openstack/keystone-6877c668b7-ws6lk" Oct 08 21:01:01 crc kubenswrapper[4669]: I1008 21:01:01.405270 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d35cc358-f26a-4e29-a61e-cf7e82c331a7-fernet-keys\") pod \"keystone-6877c668b7-ws6lk\" (UID: \"d35cc358-f26a-4e29-a61e-cf7e82c331a7\") " pod="openstack/keystone-6877c668b7-ws6lk" Oct 08 21:01:01 crc kubenswrapper[4669]: I1008 21:01:01.405336 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d35cc358-f26a-4e29-a61e-cf7e82c331a7-internal-tls-certs\") pod \"keystone-6877c668b7-ws6lk\" (UID: \"d35cc358-f26a-4e29-a61e-cf7e82c331a7\") " pod="openstack/keystone-6877c668b7-ws6lk" Oct 08 21:01:01 crc kubenswrapper[4669]: I1008 21:01:01.405361 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d35cc358-f26a-4e29-a61e-cf7e82c331a7-config-data\") pod \"keystone-6877c668b7-ws6lk\" (UID: \"d35cc358-f26a-4e29-a61e-cf7e82c331a7\") " pod="openstack/keystone-6877c668b7-ws6lk" Oct 08 21:01:01 crc kubenswrapper[4669]: I1008 21:01:01.405388 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d35cc358-f26a-4e29-a61e-cf7e82c331a7-scripts\") pod \"keystone-6877c668b7-ws6lk\" (UID: \"d35cc358-f26a-4e29-a61e-cf7e82c331a7\") " pod="openstack/keystone-6877c668b7-ws6lk" Oct 08 21:01:01 crc 
kubenswrapper[4669]: I1008 21:01:01.411131 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d35cc358-f26a-4e29-a61e-cf7e82c331a7-config-data\") pod \"keystone-6877c668b7-ws6lk\" (UID: \"d35cc358-f26a-4e29-a61e-cf7e82c331a7\") " pod="openstack/keystone-6877c668b7-ws6lk" Oct 08 21:01:01 crc kubenswrapper[4669]: I1008 21:01:01.413024 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d35cc358-f26a-4e29-a61e-cf7e82c331a7-credential-keys\") pod \"keystone-6877c668b7-ws6lk\" (UID: \"d35cc358-f26a-4e29-a61e-cf7e82c331a7\") " pod="openstack/keystone-6877c668b7-ws6lk" Oct 08 21:01:01 crc kubenswrapper[4669]: I1008 21:01:01.413245 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d35cc358-f26a-4e29-a61e-cf7e82c331a7-combined-ca-bundle\") pod \"keystone-6877c668b7-ws6lk\" (UID: \"d35cc358-f26a-4e29-a61e-cf7e82c331a7\") " pod="openstack/keystone-6877c668b7-ws6lk" Oct 08 21:01:01 crc kubenswrapper[4669]: I1008 21:01:01.413548 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d35cc358-f26a-4e29-a61e-cf7e82c331a7-fernet-keys\") pod \"keystone-6877c668b7-ws6lk\" (UID: \"d35cc358-f26a-4e29-a61e-cf7e82c331a7\") " pod="openstack/keystone-6877c668b7-ws6lk" Oct 08 21:01:01 crc kubenswrapper[4669]: I1008 21:01:01.413663 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d35cc358-f26a-4e29-a61e-cf7e82c331a7-scripts\") pod \"keystone-6877c668b7-ws6lk\" (UID: \"d35cc358-f26a-4e29-a61e-cf7e82c331a7\") " pod="openstack/keystone-6877c668b7-ws6lk" Oct 08 21:01:01 crc kubenswrapper[4669]: I1008 21:01:01.414245 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d35cc358-f26a-4e29-a61e-cf7e82c331a7-internal-tls-certs\") pod \"keystone-6877c668b7-ws6lk\" (UID: \"d35cc358-f26a-4e29-a61e-cf7e82c331a7\") " pod="openstack/keystone-6877c668b7-ws6lk" Oct 08 21:01:01 crc kubenswrapper[4669]: I1008 21:01:01.418890 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d35cc358-f26a-4e29-a61e-cf7e82c331a7-public-tls-certs\") pod \"keystone-6877c668b7-ws6lk\" (UID: \"d35cc358-f26a-4e29-a61e-cf7e82c331a7\") " pod="openstack/keystone-6877c668b7-ws6lk" Oct 08 21:01:01 crc kubenswrapper[4669]: I1008 21:01:01.427064 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4bq9\" (UniqueName: \"kubernetes.io/projected/d35cc358-f26a-4e29-a61e-cf7e82c331a7-kube-api-access-r4bq9\") pod \"keystone-6877c668b7-ws6lk\" (UID: \"d35cc358-f26a-4e29-a61e-cf7e82c331a7\") " pod="openstack/keystone-6877c668b7-ws6lk" Oct 08 21:01:01 crc kubenswrapper[4669]: I1008 21:01:01.452096 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6877c668b7-ws6lk" Oct 08 21:01:03 crc kubenswrapper[4669]: I1008 21:01:03.003801 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 08 21:01:03 crc kubenswrapper[4669]: I1008 21:01:03.004092 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 08 21:01:03 crc kubenswrapper[4669]: I1008 21:01:03.046757 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 08 21:01:03 crc kubenswrapper[4669]: I1008 21:01:03.070927 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 08 21:01:03 crc kubenswrapper[4669]: I1008 21:01:03.081943 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 08 21:01:03 crc kubenswrapper[4669]: I1008 21:01:03.410231 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 08 21:01:03 crc kubenswrapper[4669]: I1008 21:01:03.410286 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 08 21:01:03 crc kubenswrapper[4669]: I1008 21:01:03.465659 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 08 21:01:03 crc kubenswrapper[4669]: I1008 21:01:03.493936 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 08 21:01:04 crc kubenswrapper[4669]: I1008 21:01:04.078395 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 08 21:01:04 crc kubenswrapper[4669]: I1008 21:01:04.078853 4669 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 08 21:01:04 crc kubenswrapper[4669]: I1008 21:01:04.078872 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 08 21:01:04 crc kubenswrapper[4669]: I1008 21:01:04.407134 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6cd548c4f4-74w5s" podUID="f67695c6-cc78-4e93-86e4-34b030405e0e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Oct 08 21:01:04 crc kubenswrapper[4669]: I1008 21:01:04.528136 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5fc57f5668-z5dzm" podUID="43e0f642-1a58-481b-8347-b4d29176ddc5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused" Oct 08 21:01:05 crc kubenswrapper[4669]: I1008 21:01:05.086498 4669 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 08 21:01:05 crc kubenswrapper[4669]: I1008 21:01:05.487746 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 08 21:01:05 crc kubenswrapper[4669]: I1008 21:01:05.487813 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 08 21:01:06 crc kubenswrapper[4669]: I1008 21:01:06.642099 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 08 21:01:06 crc kubenswrapper[4669]: I1008 21:01:06.642418 4669 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 08 21:01:06 crc kubenswrapper[4669]: I1008 21:01:06.647273 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/glance-default-internal-api-0" Oct 08 21:01:16 crc kubenswrapper[4669]: I1008 21:01:16.210778 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6cd548c4f4-74w5s" Oct 08 21:01:16 crc kubenswrapper[4669]: I1008 21:01:16.319465 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5fc57f5668-z5dzm" Oct 08 21:01:17 crc kubenswrapper[4669]: E1008 21:01:17.505306 4669 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Oct 08 21:01:17 crc kubenswrapper[4669]: E1008 21:01:17.505738 4669 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountP
ropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lskl8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-j5jf6_openstack(14b56cd8-5692-4a65-b8ad-1e39bf253846): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 08 21:01:17 crc kubenswrapper[4669]: E1008 21:01:17.507188 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-j5jf6" podUID="14b56cd8-5692-4a65-b8ad-1e39bf253846" Oct 08 21:01:17 crc kubenswrapper[4669]: I1008 21:01:17.952422 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5fc57f5668-z5dzm" Oct 08 21:01:18 crc kubenswrapper[4669]: I1008 21:01:18.009198 4669 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/horizon-6cd548c4f4-74w5s"] Oct 08 21:01:18 crc kubenswrapper[4669]: I1008 21:01:18.009478 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6cd548c4f4-74w5s" podUID="f67695c6-cc78-4e93-86e4-34b030405e0e" containerName="horizon-log" containerID="cri-o://c0ba3b7ee1bb2b135b87938968a5b52aa40dc067d0aa4c25b461634b322cef98" gracePeriod=30 Oct 08 21:01:18 crc kubenswrapper[4669]: I1008 21:01:18.009947 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6cd548c4f4-74w5s" podUID="f67695c6-cc78-4e93-86e4-34b030405e0e" containerName="horizon" containerID="cri-o://0b62a5077ce4a93242842e1e897e72bbe90ea93c59e2811a272696aad9a6ca8d" gracePeriod=30 Oct 08 21:01:18 crc kubenswrapper[4669]: I1008 21:01:18.022552 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6cd548c4f4-74w5s" podUID="f67695c6-cc78-4e93-86e4-34b030405e0e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": EOF" Oct 08 21:01:18 crc kubenswrapper[4669]: E1008 21:01:18.205126 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-j5jf6" podUID="14b56cd8-5692-4a65-b8ad-1e39bf253846" Oct 08 21:01:18 crc kubenswrapper[4669]: E1008 21:01:18.809991 4669 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/sg-core:latest" Oct 08 21:01:18 crc kubenswrapper[4669]: E1008 21:01:18.810129 4669 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:sg-core,Image:quay.io/openstack-k8s-operators/sg-core:latest,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:sg-core-conf-yaml,ReadOnly:false,MountPath:/etc/sg-core.conf.yaml,SubPath:sg-core.conf.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m4gdb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(e462fc4e-635f-4e2e-88c0-43f1af0dc648): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 08 21:01:19 crc kubenswrapper[4669]: I1008 21:01:19.210353 4669 generic.go:334] "Generic (PLEG): container finished" podID="8a9dde7a-05ec-4edb-b865-8e16680527a5" containerID="e5232846acfe858005c26feae8a9f6c2a5d6c16e613e0f54525b1bb9b6aa170f" exitCode=0 Oct 08 21:01:19 crc kubenswrapper[4669]: I1008 21:01:19.210429 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vcvj2" event={"ID":"8a9dde7a-05ec-4edb-b865-8e16680527a5","Type":"ContainerDied","Data":"e5232846acfe858005c26feae8a9f6c2a5d6c16e613e0f54525b1bb9b6aa170f"} Oct 08 21:01:19 
crc kubenswrapper[4669]: I1008 21:01:19.212581 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-p2bl6" event={"ID":"469707ec-e817-4d42-b406-6595799f6036","Type":"ContainerStarted","Data":"e51526a2c5f0cb4ae683f0b94b5da21317009d601e2a29c50f9241e6aa8afb0a"} Oct 08 21:01:19 crc kubenswrapper[4669]: I1008 21:01:19.217205 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-rflnq" event={"ID":"fbdb37fe-4acd-400c-aff5-db2a90f07a32","Type":"ContainerStarted","Data":"ce92f63cb28d86f30c343c6e01a5e1156e3b99401b7c98c55a397f77bf1ee8be"} Oct 08 21:01:19 crc kubenswrapper[4669]: I1008 21:01:19.248633 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-p2bl6" podStartSLOduration=1.783261429 podStartE2EDuration="47.248617107s" podCreationTimestamp="2025-10-08 21:00:32 +0000 UTC" firstStartedPulling="2025-10-08 21:00:33.380958164 +0000 UTC m=+953.073768837" lastFinishedPulling="2025-10-08 21:01:18.846313822 +0000 UTC m=+998.539124515" observedRunningTime="2025-10-08 21:01:19.247924398 +0000 UTC m=+998.940735071" watchObservedRunningTime="2025-10-08 21:01:19.248617107 +0000 UTC m=+998.941427780" Oct 08 21:01:19 crc kubenswrapper[4669]: I1008 21:01:19.267447 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6877c668b7-ws6lk"] Oct 08 21:01:19 crc kubenswrapper[4669]: I1008 21:01:19.270280 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-rflnq" podStartSLOduration=4.556494543 podStartE2EDuration="26.270228908s" podCreationTimestamp="2025-10-08 21:00:53 +0000 UTC" firstStartedPulling="2025-10-08 21:00:57.078082187 +0000 UTC m=+976.770892860" lastFinishedPulling="2025-10-08 21:01:18.791816552 +0000 UTC m=+998.484627225" observedRunningTime="2025-10-08 21:01:19.262802834 +0000 UTC m=+998.955613517" watchObservedRunningTime="2025-10-08 21:01:19.270228908 +0000 UTC m=+998.963039571" 
Oct 08 21:01:20 crc kubenswrapper[4669]: I1008 21:01:20.231091 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6877c668b7-ws6lk" event={"ID":"d35cc358-f26a-4e29-a61e-cf7e82c331a7","Type":"ContainerStarted","Data":"f42e8f1429c0f3fdd777201d427b080b3a4950fd9a72a49b9105c84789d0042b"} Oct 08 21:01:20 crc kubenswrapper[4669]: I1008 21:01:20.231151 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6877c668b7-ws6lk" event={"ID":"d35cc358-f26a-4e29-a61e-cf7e82c331a7","Type":"ContainerStarted","Data":"29109bd730f0185c3a94c21457f57071f95f4f9d3ab69cdd1ae2ba405598e109"} Oct 08 21:01:20 crc kubenswrapper[4669]: I1008 21:01:20.232518 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-6877c668b7-ws6lk" Oct 08 21:01:20 crc kubenswrapper[4669]: I1008 21:01:20.292392 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6877c668b7-ws6lk" podStartSLOduration=19.292365903 podStartE2EDuration="19.292365903s" podCreationTimestamp="2025-10-08 21:01:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:01:20.268989445 +0000 UTC m=+999.961800148" watchObservedRunningTime="2025-10-08 21:01:20.292365903 +0000 UTC m=+999.985176586" Oct 08 21:01:20 crc kubenswrapper[4669]: I1008 21:01:20.583256 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-vcvj2" Oct 08 21:01:20 crc kubenswrapper[4669]: I1008 21:01:20.701935 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8a9dde7a-05ec-4edb-b865-8e16680527a5-config\") pod \"8a9dde7a-05ec-4edb-b865-8e16680527a5\" (UID: \"8a9dde7a-05ec-4edb-b865-8e16680527a5\") " Oct 08 21:01:20 crc kubenswrapper[4669]: I1008 21:01:20.702026 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a9dde7a-05ec-4edb-b865-8e16680527a5-combined-ca-bundle\") pod \"8a9dde7a-05ec-4edb-b865-8e16680527a5\" (UID: \"8a9dde7a-05ec-4edb-b865-8e16680527a5\") " Oct 08 21:01:20 crc kubenswrapper[4669]: I1008 21:01:20.702140 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c68mf\" (UniqueName: \"kubernetes.io/projected/8a9dde7a-05ec-4edb-b865-8e16680527a5-kube-api-access-c68mf\") pod \"8a9dde7a-05ec-4edb-b865-8e16680527a5\" (UID: \"8a9dde7a-05ec-4edb-b865-8e16680527a5\") " Oct 08 21:01:20 crc kubenswrapper[4669]: I1008 21:01:20.708139 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a9dde7a-05ec-4edb-b865-8e16680527a5-kube-api-access-c68mf" (OuterVolumeSpecName: "kube-api-access-c68mf") pod "8a9dde7a-05ec-4edb-b865-8e16680527a5" (UID: "8a9dde7a-05ec-4edb-b865-8e16680527a5"). InnerVolumeSpecName "kube-api-access-c68mf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:01:20 crc kubenswrapper[4669]: I1008 21:01:20.732158 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a9dde7a-05ec-4edb-b865-8e16680527a5-config" (OuterVolumeSpecName: "config") pod "8a9dde7a-05ec-4edb-b865-8e16680527a5" (UID: "8a9dde7a-05ec-4edb-b865-8e16680527a5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:01:20 crc kubenswrapper[4669]: I1008 21:01:20.737805 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a9dde7a-05ec-4edb-b865-8e16680527a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a9dde7a-05ec-4edb-b865-8e16680527a5" (UID: "8a9dde7a-05ec-4edb-b865-8e16680527a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:01:20 crc kubenswrapper[4669]: I1008 21:01:20.804970 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a9dde7a-05ec-4edb-b865-8e16680527a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:20 crc kubenswrapper[4669]: I1008 21:01:20.805042 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c68mf\" (UniqueName: \"kubernetes.io/projected/8a9dde7a-05ec-4edb-b865-8e16680527a5-kube-api-access-c68mf\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:20 crc kubenswrapper[4669]: I1008 21:01:20.805075 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/8a9dde7a-05ec-4edb-b865-8e16680527a5-config\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:21 crc kubenswrapper[4669]: I1008 21:01:21.244867 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-vcvj2" Oct 08 21:01:21 crc kubenswrapper[4669]: I1008 21:01:21.244899 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vcvj2" event={"ID":"8a9dde7a-05ec-4edb-b865-8e16680527a5","Type":"ContainerDied","Data":"fc4839056e5b4d028e18a35e529c5b032fe80a1761e989789461d715a5664eb0"} Oct 08 21:01:21 crc kubenswrapper[4669]: I1008 21:01:21.244962 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc4839056e5b4d028e18a35e529c5b032fe80a1761e989789461d715a5664eb0" Oct 08 21:01:21 crc kubenswrapper[4669]: I1008 21:01:21.465687 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6cd548c4f4-74w5s" podUID="f67695c6-cc78-4e93-86e4-34b030405e0e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:32798->10.217.0.147:8443: read: connection reset by peer" Oct 08 21:01:21 crc kubenswrapper[4669]: I1008 21:01:21.476095 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-845z8"] Oct 08 21:01:21 crc kubenswrapper[4669]: E1008 21:01:21.476440 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a9dde7a-05ec-4edb-b865-8e16680527a5" containerName="neutron-db-sync" Oct 08 21:01:21 crc kubenswrapper[4669]: I1008 21:01:21.476456 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a9dde7a-05ec-4edb-b865-8e16680527a5" containerName="neutron-db-sync" Oct 08 21:01:21 crc kubenswrapper[4669]: I1008 21:01:21.482921 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a9dde7a-05ec-4edb-b865-8e16680527a5" containerName="neutron-db-sync" Oct 08 21:01:21 crc kubenswrapper[4669]: I1008 21:01:21.483952 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-845z8" Oct 08 21:01:21 crc kubenswrapper[4669]: I1008 21:01:21.502484 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-845z8"] Oct 08 21:01:21 crc kubenswrapper[4669]: I1008 21:01:21.607386 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-f8986bb9b-rq2pj"] Oct 08 21:01:21 crc kubenswrapper[4669]: I1008 21:01:21.609025 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f8986bb9b-rq2pj" Oct 08 21:01:21 crc kubenswrapper[4669]: I1008 21:01:21.611937 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 08 21:01:21 crc kubenswrapper[4669]: I1008 21:01:21.612206 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 08 21:01:21 crc kubenswrapper[4669]: I1008 21:01:21.612699 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-wljg5" Oct 08 21:01:21 crc kubenswrapper[4669]: I1008 21:01:21.613052 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 08 21:01:21 crc kubenswrapper[4669]: I1008 21:01:21.618366 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f8986bb9b-rq2pj"] Oct 08 21:01:21 crc kubenswrapper[4669]: I1008 21:01:21.647627 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/67dd40b0-efa4-47ba-814f-57dcb053a2d9-httpd-config\") pod \"neutron-f8986bb9b-rq2pj\" (UID: \"67dd40b0-efa4-47ba-814f-57dcb053a2d9\") " pod="openstack/neutron-f8986bb9b-rq2pj" Oct 08 21:01:21 crc kubenswrapper[4669]: I1008 21:01:21.647707 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/0f0950ec-bab3-4ed9-bd56-f4f6f13c002b-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-845z8\" (UID: \"0f0950ec-bab3-4ed9-bd56-f4f6f13c002b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-845z8" Oct 08 21:01:21 crc kubenswrapper[4669]: I1008 21:01:21.647799 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srfzh\" (UniqueName: \"kubernetes.io/projected/0f0950ec-bab3-4ed9-bd56-f4f6f13c002b-kube-api-access-srfzh\") pod \"dnsmasq-dns-5ccc5c4795-845z8\" (UID: \"0f0950ec-bab3-4ed9-bd56-f4f6f13c002b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-845z8" Oct 08 21:01:21 crc kubenswrapper[4669]: I1008 21:01:21.648655 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f0950ec-bab3-4ed9-bd56-f4f6f13c002b-config\") pod \"dnsmasq-dns-5ccc5c4795-845z8\" (UID: \"0f0950ec-bab3-4ed9-bd56-f4f6f13c002b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-845z8" Oct 08 21:01:21 crc kubenswrapper[4669]: I1008 21:01:21.648685 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67dd40b0-efa4-47ba-814f-57dcb053a2d9-combined-ca-bundle\") pod \"neutron-f8986bb9b-rq2pj\" (UID: \"67dd40b0-efa4-47ba-814f-57dcb053a2d9\") " pod="openstack/neutron-f8986bb9b-rq2pj" Oct 08 21:01:21 crc kubenswrapper[4669]: I1008 21:01:21.648727 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/67dd40b0-efa4-47ba-814f-57dcb053a2d9-ovndb-tls-certs\") pod \"neutron-f8986bb9b-rq2pj\" (UID: \"67dd40b0-efa4-47ba-814f-57dcb053a2d9\") " pod="openstack/neutron-f8986bb9b-rq2pj" Oct 08 21:01:21 crc kubenswrapper[4669]: I1008 21:01:21.648762 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0f0950ec-bab3-4ed9-bd56-f4f6f13c002b-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-845z8\" (UID: \"0f0950ec-bab3-4ed9-bd56-f4f6f13c002b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-845z8" Oct 08 21:01:21 crc kubenswrapper[4669]: I1008 21:01:21.648818 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/67dd40b0-efa4-47ba-814f-57dcb053a2d9-config\") pod \"neutron-f8986bb9b-rq2pj\" (UID: \"67dd40b0-efa4-47ba-814f-57dcb053a2d9\") " pod="openstack/neutron-f8986bb9b-rq2pj" Oct 08 21:01:21 crc kubenswrapper[4669]: I1008 21:01:21.648999 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4brl\" (UniqueName: \"kubernetes.io/projected/67dd40b0-efa4-47ba-814f-57dcb053a2d9-kube-api-access-b4brl\") pod \"neutron-f8986bb9b-rq2pj\" (UID: \"67dd40b0-efa4-47ba-814f-57dcb053a2d9\") " pod="openstack/neutron-f8986bb9b-rq2pj" Oct 08 21:01:21 crc kubenswrapper[4669]: I1008 21:01:21.649032 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f0950ec-bab3-4ed9-bd56-f4f6f13c002b-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-845z8\" (UID: \"0f0950ec-bab3-4ed9-bd56-f4f6f13c002b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-845z8" Oct 08 21:01:21 crc kubenswrapper[4669]: I1008 21:01:21.649049 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f0950ec-bab3-4ed9-bd56-f4f6f13c002b-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-845z8\" (UID: \"0f0950ec-bab3-4ed9-bd56-f4f6f13c002b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-845z8" Oct 08 21:01:21 crc kubenswrapper[4669]: I1008 21:01:21.749795 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f0950ec-bab3-4ed9-bd56-f4f6f13c002b-config\") pod \"dnsmasq-dns-5ccc5c4795-845z8\" (UID: \"0f0950ec-bab3-4ed9-bd56-f4f6f13c002b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-845z8" Oct 08 21:01:21 crc kubenswrapper[4669]: I1008 21:01:21.749834 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67dd40b0-efa4-47ba-814f-57dcb053a2d9-combined-ca-bundle\") pod \"neutron-f8986bb9b-rq2pj\" (UID: \"67dd40b0-efa4-47ba-814f-57dcb053a2d9\") " pod="openstack/neutron-f8986bb9b-rq2pj" Oct 08 21:01:21 crc kubenswrapper[4669]: I1008 21:01:21.749867 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/67dd40b0-efa4-47ba-814f-57dcb053a2d9-ovndb-tls-certs\") pod \"neutron-f8986bb9b-rq2pj\" (UID: \"67dd40b0-efa4-47ba-814f-57dcb053a2d9\") " pod="openstack/neutron-f8986bb9b-rq2pj" Oct 08 21:01:21 crc kubenswrapper[4669]: I1008 21:01:21.749889 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0f0950ec-bab3-4ed9-bd56-f4f6f13c002b-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-845z8\" (UID: \"0f0950ec-bab3-4ed9-bd56-f4f6f13c002b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-845z8" Oct 08 21:01:21 crc kubenswrapper[4669]: I1008 21:01:21.749930 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/67dd40b0-efa4-47ba-814f-57dcb053a2d9-config\") pod \"neutron-f8986bb9b-rq2pj\" (UID: \"67dd40b0-efa4-47ba-814f-57dcb053a2d9\") " pod="openstack/neutron-f8986bb9b-rq2pj" Oct 08 21:01:21 crc kubenswrapper[4669]: I1008 21:01:21.749952 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4brl\" (UniqueName: 
\"kubernetes.io/projected/67dd40b0-efa4-47ba-814f-57dcb053a2d9-kube-api-access-b4brl\") pod \"neutron-f8986bb9b-rq2pj\" (UID: \"67dd40b0-efa4-47ba-814f-57dcb053a2d9\") " pod="openstack/neutron-f8986bb9b-rq2pj" Oct 08 21:01:21 crc kubenswrapper[4669]: I1008 21:01:21.749973 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f0950ec-bab3-4ed9-bd56-f4f6f13c002b-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-845z8\" (UID: \"0f0950ec-bab3-4ed9-bd56-f4f6f13c002b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-845z8" Oct 08 21:01:21 crc kubenswrapper[4669]: I1008 21:01:21.749991 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f0950ec-bab3-4ed9-bd56-f4f6f13c002b-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-845z8\" (UID: \"0f0950ec-bab3-4ed9-bd56-f4f6f13c002b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-845z8" Oct 08 21:01:21 crc kubenswrapper[4669]: I1008 21:01:21.750011 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/67dd40b0-efa4-47ba-814f-57dcb053a2d9-httpd-config\") pod \"neutron-f8986bb9b-rq2pj\" (UID: \"67dd40b0-efa4-47ba-814f-57dcb053a2d9\") " pod="openstack/neutron-f8986bb9b-rq2pj" Oct 08 21:01:21 crc kubenswrapper[4669]: I1008 21:01:21.750026 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f0950ec-bab3-4ed9-bd56-f4f6f13c002b-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-845z8\" (UID: \"0f0950ec-bab3-4ed9-bd56-f4f6f13c002b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-845z8" Oct 08 21:01:21 crc kubenswrapper[4669]: I1008 21:01:21.750056 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srfzh\" (UniqueName: 
\"kubernetes.io/projected/0f0950ec-bab3-4ed9-bd56-f4f6f13c002b-kube-api-access-srfzh\") pod \"dnsmasq-dns-5ccc5c4795-845z8\" (UID: \"0f0950ec-bab3-4ed9-bd56-f4f6f13c002b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-845z8" Oct 08 21:01:21 crc kubenswrapper[4669]: I1008 21:01:21.752016 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f0950ec-bab3-4ed9-bd56-f4f6f13c002b-config\") pod \"dnsmasq-dns-5ccc5c4795-845z8\" (UID: \"0f0950ec-bab3-4ed9-bd56-f4f6f13c002b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-845z8" Oct 08 21:01:21 crc kubenswrapper[4669]: I1008 21:01:21.752164 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0f0950ec-bab3-4ed9-bd56-f4f6f13c002b-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-845z8\" (UID: \"0f0950ec-bab3-4ed9-bd56-f4f6f13c002b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-845z8" Oct 08 21:01:21 crc kubenswrapper[4669]: I1008 21:01:21.752409 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f0950ec-bab3-4ed9-bd56-f4f6f13c002b-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-845z8\" (UID: \"0f0950ec-bab3-4ed9-bd56-f4f6f13c002b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-845z8" Oct 08 21:01:21 crc kubenswrapper[4669]: I1008 21:01:21.752565 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f0950ec-bab3-4ed9-bd56-f4f6f13c002b-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-845z8\" (UID: \"0f0950ec-bab3-4ed9-bd56-f4f6f13c002b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-845z8" Oct 08 21:01:21 crc kubenswrapper[4669]: I1008 21:01:21.752583 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f0950ec-bab3-4ed9-bd56-f4f6f13c002b-ovsdbserver-nb\") pod 
\"dnsmasq-dns-5ccc5c4795-845z8\" (UID: \"0f0950ec-bab3-4ed9-bd56-f4f6f13c002b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-845z8" Oct 08 21:01:21 crc kubenswrapper[4669]: I1008 21:01:21.756071 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/67dd40b0-efa4-47ba-814f-57dcb053a2d9-ovndb-tls-certs\") pod \"neutron-f8986bb9b-rq2pj\" (UID: \"67dd40b0-efa4-47ba-814f-57dcb053a2d9\") " pod="openstack/neutron-f8986bb9b-rq2pj" Oct 08 21:01:21 crc kubenswrapper[4669]: I1008 21:01:21.756538 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/67dd40b0-efa4-47ba-814f-57dcb053a2d9-httpd-config\") pod \"neutron-f8986bb9b-rq2pj\" (UID: \"67dd40b0-efa4-47ba-814f-57dcb053a2d9\") " pod="openstack/neutron-f8986bb9b-rq2pj" Oct 08 21:01:21 crc kubenswrapper[4669]: I1008 21:01:21.760562 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67dd40b0-efa4-47ba-814f-57dcb053a2d9-combined-ca-bundle\") pod \"neutron-f8986bb9b-rq2pj\" (UID: \"67dd40b0-efa4-47ba-814f-57dcb053a2d9\") " pod="openstack/neutron-f8986bb9b-rq2pj" Oct 08 21:01:21 crc kubenswrapper[4669]: I1008 21:01:21.766366 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/67dd40b0-efa4-47ba-814f-57dcb053a2d9-config\") pod \"neutron-f8986bb9b-rq2pj\" (UID: \"67dd40b0-efa4-47ba-814f-57dcb053a2d9\") " pod="openstack/neutron-f8986bb9b-rq2pj" Oct 08 21:01:21 crc kubenswrapper[4669]: I1008 21:01:21.768250 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srfzh\" (UniqueName: \"kubernetes.io/projected/0f0950ec-bab3-4ed9-bd56-f4f6f13c002b-kube-api-access-srfzh\") pod \"dnsmasq-dns-5ccc5c4795-845z8\" (UID: \"0f0950ec-bab3-4ed9-bd56-f4f6f13c002b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-845z8" Oct 08 
21:01:21 crc kubenswrapper[4669]: I1008 21:01:21.770249 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4brl\" (UniqueName: \"kubernetes.io/projected/67dd40b0-efa4-47ba-814f-57dcb053a2d9-kube-api-access-b4brl\") pod \"neutron-f8986bb9b-rq2pj\" (UID: \"67dd40b0-efa4-47ba-814f-57dcb053a2d9\") " pod="openstack/neutron-f8986bb9b-rq2pj" Oct 08 21:01:21 crc kubenswrapper[4669]: I1008 21:01:21.908287 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-845z8" Oct 08 21:01:21 crc kubenswrapper[4669]: I1008 21:01:21.941762 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f8986bb9b-rq2pj" Oct 08 21:01:22 crc kubenswrapper[4669]: I1008 21:01:22.263687 4669 generic.go:334] "Generic (PLEG): container finished" podID="f67695c6-cc78-4e93-86e4-34b030405e0e" containerID="0b62a5077ce4a93242842e1e897e72bbe90ea93c59e2811a272696aad9a6ca8d" exitCode=0 Oct 08 21:01:22 crc kubenswrapper[4669]: I1008 21:01:22.263788 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6cd548c4f4-74w5s" event={"ID":"f67695c6-cc78-4e93-86e4-34b030405e0e","Type":"ContainerDied","Data":"0b62a5077ce4a93242842e1e897e72bbe90ea93c59e2811a272696aad9a6ca8d"} Oct 08 21:01:22 crc kubenswrapper[4669]: I1008 21:01:22.268590 4669 generic.go:334] "Generic (PLEG): container finished" podID="469707ec-e817-4d42-b406-6595799f6036" containerID="e51526a2c5f0cb4ae683f0b94b5da21317009d601e2a29c50f9241e6aa8afb0a" exitCode=0 Oct 08 21:01:22 crc kubenswrapper[4669]: I1008 21:01:22.268638 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-p2bl6" event={"ID":"469707ec-e817-4d42-b406-6595799f6036","Type":"ContainerDied","Data":"e51526a2c5f0cb4ae683f0b94b5da21317009d601e2a29c50f9241e6aa8afb0a"} Oct 08 21:01:22 crc kubenswrapper[4669]: W1008 21:01:22.397627 4669 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f0950ec_bab3_4ed9_bd56_f4f6f13c002b.slice/crio-d1b794ddf004edc05f95b8a017eac230133e5d7b7ecb489c96cf66f12a66f772 WatchSource:0}: Error finding container d1b794ddf004edc05f95b8a017eac230133e5d7b7ecb489c96cf66f12a66f772: Status 404 returned error can't find the container with id d1b794ddf004edc05f95b8a017eac230133e5d7b7ecb489c96cf66f12a66f772 Oct 08 21:01:22 crc kubenswrapper[4669]: I1008 21:01:22.398903 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-845z8"] Oct 08 21:01:22 crc kubenswrapper[4669]: I1008 21:01:22.809014 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f8986bb9b-rq2pj"] Oct 08 21:01:22 crc kubenswrapper[4669]: W1008 21:01:22.817642 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67dd40b0_efa4_47ba_814f_57dcb053a2d9.slice/crio-6e0752c1d7e9a538bd416d28882ae5902d4f75fd2fe0fb0d1fa6efddf055c00c WatchSource:0}: Error finding container 6e0752c1d7e9a538bd416d28882ae5902d4f75fd2fe0fb0d1fa6efddf055c00c: Status 404 returned error can't find the container with id 6e0752c1d7e9a538bd416d28882ae5902d4f75fd2fe0fb0d1fa6efddf055c00c Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.251036 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6d68bc8f57-c6jv4" Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.313014 4669 generic.go:334] "Generic (PLEG): container finished" podID="0f0950ec-bab3-4ed9-bd56-f4f6f13c002b" containerID="9e66346f86d815b3c8ee4e3aceecb89cb9f9af567e1a0167f4aa1ee4a42aa9b8" exitCode=0 Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.313073 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-845z8" event={"ID":"0f0950ec-bab3-4ed9-bd56-f4f6f13c002b","Type":"ContainerDied","Data":"9e66346f86d815b3c8ee4e3aceecb89cb9f9af567e1a0167f4aa1ee4a42aa9b8"} Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.313098 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-845z8" event={"ID":"0f0950ec-bab3-4ed9-bd56-f4f6f13c002b","Type":"ContainerStarted","Data":"d1b794ddf004edc05f95b8a017eac230133e5d7b7ecb489c96cf66f12a66f772"} Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.321105 4669 generic.go:334] "Generic (PLEG): container finished" podID="49676f30-cc4b-4229-b194-f28688f6da28" containerID="cfdf2ed8ae2f86fd7b7b598ea89b436c053b080dc3b07570e2e9f9d93a1ce234" exitCode=137 Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.321129 4669 generic.go:334] "Generic (PLEG): container finished" podID="49676f30-cc4b-4229-b194-f28688f6da28" containerID="a9132b413c37fe9f6e3fcba934efebcb60325ce3c038d45f76074026a2db71f0" exitCode=137 Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.321185 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d68bc8f57-c6jv4" event={"ID":"49676f30-cc4b-4229-b194-f28688f6da28","Type":"ContainerDied","Data":"cfdf2ed8ae2f86fd7b7b598ea89b436c053b080dc3b07570e2e9f9d93a1ce234"} Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.321234 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d68bc8f57-c6jv4" 
event={"ID":"49676f30-cc4b-4229-b194-f28688f6da28","Type":"ContainerDied","Data":"a9132b413c37fe9f6e3fcba934efebcb60325ce3c038d45f76074026a2db71f0"} Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.321249 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d68bc8f57-c6jv4" event={"ID":"49676f30-cc4b-4229-b194-f28688f6da28","Type":"ContainerDied","Data":"303a41634ffa2f8ee63d0298844f10defdbafcd62092a5573811d4e62796c1b6"} Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.321265 4669 scope.go:117] "RemoveContainer" containerID="cfdf2ed8ae2f86fd7b7b598ea89b436c053b080dc3b07570e2e9f9d93a1ce234" Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.321421 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6d68bc8f57-c6jv4" Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.338981 4669 generic.go:334] "Generic (PLEG): container finished" podID="fbdb37fe-4acd-400c-aff5-db2a90f07a32" containerID="ce92f63cb28d86f30c343c6e01a5e1156e3b99401b7c98c55a397f77bf1ee8be" exitCode=0 Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.361378 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-rflnq" event={"ID":"fbdb37fe-4acd-400c-aff5-db2a90f07a32","Type":"ContainerDied","Data":"ce92f63cb28d86f30c343c6e01a5e1156e3b99401b7c98c55a397f77bf1ee8be"} Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.361434 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f8986bb9b-rq2pj" event={"ID":"67dd40b0-efa4-47ba-814f-57dcb053a2d9","Type":"ContainerStarted","Data":"b3cd31585701575c0b07e7772001ea41a665b9d37746d7836cf752dae1d7890a"} Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.361480 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f8986bb9b-rq2pj" 
event={"ID":"67dd40b0-efa4-47ba-814f-57dcb053a2d9","Type":"ContainerStarted","Data":"6e0752c1d7e9a538bd416d28882ae5902d4f75fd2fe0fb0d1fa6efddf055c00c"} Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.385269 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/49676f30-cc4b-4229-b194-f28688f6da28-scripts\") pod \"49676f30-cc4b-4229-b194-f28688f6da28\" (UID: \"49676f30-cc4b-4229-b194-f28688f6da28\") " Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.385309 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/49676f30-cc4b-4229-b194-f28688f6da28-horizon-secret-key\") pod \"49676f30-cc4b-4229-b194-f28688f6da28\" (UID: \"49676f30-cc4b-4229-b194-f28688f6da28\") " Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.385466 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49676f30-cc4b-4229-b194-f28688f6da28-logs\") pod \"49676f30-cc4b-4229-b194-f28688f6da28\" (UID: \"49676f30-cc4b-4229-b194-f28688f6da28\") " Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.385483 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpnbd\" (UniqueName: \"kubernetes.io/projected/49676f30-cc4b-4229-b194-f28688f6da28-kube-api-access-kpnbd\") pod \"49676f30-cc4b-4229-b194-f28688f6da28\" (UID: \"49676f30-cc4b-4229-b194-f28688f6da28\") " Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.385502 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/49676f30-cc4b-4229-b194-f28688f6da28-config-data\") pod \"49676f30-cc4b-4229-b194-f28688f6da28\" (UID: \"49676f30-cc4b-4229-b194-f28688f6da28\") " Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.386210 4669 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49676f30-cc4b-4229-b194-f28688f6da28-logs" (OuterVolumeSpecName: "logs") pod "49676f30-cc4b-4229-b194-f28688f6da28" (UID: "49676f30-cc4b-4229-b194-f28688f6da28"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.404984 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49676f30-cc4b-4229-b194-f28688f6da28-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "49676f30-cc4b-4229-b194-f28688f6da28" (UID: "49676f30-cc4b-4229-b194-f28688f6da28"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.407085 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49676f30-cc4b-4229-b194-f28688f6da28-scripts" (OuterVolumeSpecName: "scripts") pod "49676f30-cc4b-4229-b194-f28688f6da28" (UID: "49676f30-cc4b-4229-b194-f28688f6da28"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.413081 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49676f30-cc4b-4229-b194-f28688f6da28-kube-api-access-kpnbd" (OuterVolumeSpecName: "kube-api-access-kpnbd") pod "49676f30-cc4b-4229-b194-f28688f6da28" (UID: "49676f30-cc4b-4229-b194-f28688f6da28"). InnerVolumeSpecName "kube-api-access-kpnbd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.418147 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49676f30-cc4b-4229-b194-f28688f6da28-config-data" (OuterVolumeSpecName: "config-data") pod "49676f30-cc4b-4229-b194-f28688f6da28" (UID: "49676f30-cc4b-4229-b194-f28688f6da28"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.488796 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/49676f30-cc4b-4229-b194-f28688f6da28-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.489118 4669 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/49676f30-cc4b-4229-b194-f28688f6da28-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.489131 4669 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49676f30-cc4b-4229-b194-f28688f6da28-logs\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.489140 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpnbd\" (UniqueName: \"kubernetes.io/projected/49676f30-cc4b-4229-b194-f28688f6da28-kube-api-access-kpnbd\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.489148 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/49676f30-cc4b-4229-b194-f28688f6da28-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.493761 4669 scope.go:117] "RemoveContainer" containerID="a9132b413c37fe9f6e3fcba934efebcb60325ce3c038d45f76074026a2db71f0" Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.528026 4669 scope.go:117] "RemoveContainer" containerID="cfdf2ed8ae2f86fd7b7b598ea89b436c053b080dc3b07570e2e9f9d93a1ce234" Oct 08 21:01:23 crc kubenswrapper[4669]: E1008 21:01:23.528499 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"cfdf2ed8ae2f86fd7b7b598ea89b436c053b080dc3b07570e2e9f9d93a1ce234\": container with ID starting with cfdf2ed8ae2f86fd7b7b598ea89b436c053b080dc3b07570e2e9f9d93a1ce234 not found: ID does not exist" containerID="cfdf2ed8ae2f86fd7b7b598ea89b436c053b080dc3b07570e2e9f9d93a1ce234" Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.528543 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfdf2ed8ae2f86fd7b7b598ea89b436c053b080dc3b07570e2e9f9d93a1ce234"} err="failed to get container status \"cfdf2ed8ae2f86fd7b7b598ea89b436c053b080dc3b07570e2e9f9d93a1ce234\": rpc error: code = NotFound desc = could not find container \"cfdf2ed8ae2f86fd7b7b598ea89b436c053b080dc3b07570e2e9f9d93a1ce234\": container with ID starting with cfdf2ed8ae2f86fd7b7b598ea89b436c053b080dc3b07570e2e9f9d93a1ce234 not found: ID does not exist" Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.528565 4669 scope.go:117] "RemoveContainer" containerID="a9132b413c37fe9f6e3fcba934efebcb60325ce3c038d45f76074026a2db71f0" Oct 08 21:01:23 crc kubenswrapper[4669]: E1008 21:01:23.528788 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9132b413c37fe9f6e3fcba934efebcb60325ce3c038d45f76074026a2db71f0\": container with ID starting with a9132b413c37fe9f6e3fcba934efebcb60325ce3c038d45f76074026a2db71f0 not found: ID does not exist" containerID="a9132b413c37fe9f6e3fcba934efebcb60325ce3c038d45f76074026a2db71f0" Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.528837 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9132b413c37fe9f6e3fcba934efebcb60325ce3c038d45f76074026a2db71f0"} err="failed to get container status \"a9132b413c37fe9f6e3fcba934efebcb60325ce3c038d45f76074026a2db71f0\": rpc error: code = NotFound desc = could not find container \"a9132b413c37fe9f6e3fcba934efebcb60325ce3c038d45f76074026a2db71f0\": container with ID 
starting with a9132b413c37fe9f6e3fcba934efebcb60325ce3c038d45f76074026a2db71f0 not found: ID does not exist" Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.528872 4669 scope.go:117] "RemoveContainer" containerID="cfdf2ed8ae2f86fd7b7b598ea89b436c053b080dc3b07570e2e9f9d93a1ce234" Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.529227 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfdf2ed8ae2f86fd7b7b598ea89b436c053b080dc3b07570e2e9f9d93a1ce234"} err="failed to get container status \"cfdf2ed8ae2f86fd7b7b598ea89b436c053b080dc3b07570e2e9f9d93a1ce234\": rpc error: code = NotFound desc = could not find container \"cfdf2ed8ae2f86fd7b7b598ea89b436c053b080dc3b07570e2e9f9d93a1ce234\": container with ID starting with cfdf2ed8ae2f86fd7b7b598ea89b436c053b080dc3b07570e2e9f9d93a1ce234 not found: ID does not exist" Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.529255 4669 scope.go:117] "RemoveContainer" containerID="a9132b413c37fe9f6e3fcba934efebcb60325ce3c038d45f76074026a2db71f0" Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.529491 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9132b413c37fe9f6e3fcba934efebcb60325ce3c038d45f76074026a2db71f0"} err="failed to get container status \"a9132b413c37fe9f6e3fcba934efebcb60325ce3c038d45f76074026a2db71f0\": rpc error: code = NotFound desc = could not find container \"a9132b413c37fe9f6e3fcba934efebcb60325ce3c038d45f76074026a2db71f0\": container with ID starting with a9132b413c37fe9f6e3fcba934efebcb60325ce3c038d45f76074026a2db71f0 not found: ID does not exist" Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.721041 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-p2bl6" Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.723504 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6d68bc8f57-c6jv4"] Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.732946 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6d68bc8f57-c6jv4"] Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.895694 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/469707ec-e817-4d42-b406-6595799f6036-config-data\") pod \"469707ec-e817-4d42-b406-6595799f6036\" (UID: \"469707ec-e817-4d42-b406-6595799f6036\") " Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.896013 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/469707ec-e817-4d42-b406-6595799f6036-logs\") pod \"469707ec-e817-4d42-b406-6595799f6036\" (UID: \"469707ec-e817-4d42-b406-6595799f6036\") " Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.896060 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/469707ec-e817-4d42-b406-6595799f6036-combined-ca-bundle\") pod \"469707ec-e817-4d42-b406-6595799f6036\" (UID: \"469707ec-e817-4d42-b406-6595799f6036\") " Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.896106 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvjx2\" (UniqueName: \"kubernetes.io/projected/469707ec-e817-4d42-b406-6595799f6036-kube-api-access-cvjx2\") pod \"469707ec-e817-4d42-b406-6595799f6036\" (UID: \"469707ec-e817-4d42-b406-6595799f6036\") " Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.896149 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/469707ec-e817-4d42-b406-6595799f6036-scripts\") pod \"469707ec-e817-4d42-b406-6595799f6036\" (UID: \"469707ec-e817-4d42-b406-6595799f6036\") " Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.900810 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/469707ec-e817-4d42-b406-6595799f6036-logs" (OuterVolumeSpecName: "logs") pod "469707ec-e817-4d42-b406-6595799f6036" (UID: "469707ec-e817-4d42-b406-6595799f6036"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.905513 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/469707ec-e817-4d42-b406-6595799f6036-scripts" (OuterVolumeSpecName: "scripts") pod "469707ec-e817-4d42-b406-6595799f6036" (UID: "469707ec-e817-4d42-b406-6595799f6036"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.908752 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/469707ec-e817-4d42-b406-6595799f6036-kube-api-access-cvjx2" (OuterVolumeSpecName: "kube-api-access-cvjx2") pod "469707ec-e817-4d42-b406-6595799f6036" (UID: "469707ec-e817-4d42-b406-6595799f6036"). InnerVolumeSpecName "kube-api-access-cvjx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.925509 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/469707ec-e817-4d42-b406-6595799f6036-config-data" (OuterVolumeSpecName: "config-data") pod "469707ec-e817-4d42-b406-6595799f6036" (UID: "469707ec-e817-4d42-b406-6595799f6036"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.925821 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/469707ec-e817-4d42-b406-6595799f6036-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "469707ec-e817-4d42-b406-6595799f6036" (UID: "469707ec-e817-4d42-b406-6595799f6036"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.998204 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/469707ec-e817-4d42-b406-6595799f6036-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.998236 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvjx2\" (UniqueName: \"kubernetes.io/projected/469707ec-e817-4d42-b406-6595799f6036-kube-api-access-cvjx2\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.998259 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/469707ec-e817-4d42-b406-6595799f6036-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.998269 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/469707ec-e817-4d42-b406-6595799f6036-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:23 crc kubenswrapper[4669]: I1008 21:01:23.998277 4669 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/469707ec-e817-4d42-b406-6595799f6036-logs\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.094171 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6f968c55b5-frgnz"] Oct 08 21:01:24 crc 
kubenswrapper[4669]: E1008 21:01:24.094494 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49676f30-cc4b-4229-b194-f28688f6da28" containerName="horizon" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.094510 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="49676f30-cc4b-4229-b194-f28688f6da28" containerName="horizon" Oct 08 21:01:24 crc kubenswrapper[4669]: E1008 21:01:24.094520 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49676f30-cc4b-4229-b194-f28688f6da28" containerName="horizon-log" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.094541 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="49676f30-cc4b-4229-b194-f28688f6da28" containerName="horizon-log" Oct 08 21:01:24 crc kubenswrapper[4669]: E1008 21:01:24.094575 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="469707ec-e817-4d42-b406-6595799f6036" containerName="placement-db-sync" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.094582 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="469707ec-e817-4d42-b406-6595799f6036" containerName="placement-db-sync" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.094741 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="49676f30-cc4b-4229-b194-f28688f6da28" containerName="horizon-log" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.094760 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="49676f30-cc4b-4229-b194-f28688f6da28" containerName="horizon" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.094770 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="469707ec-e817-4d42-b406-6595799f6036" containerName="placement-db-sync" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.095614 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6f968c55b5-frgnz" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.098943 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.099130 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.099966 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1e273869-5ed1-48c3-af8f-d4d2df61c9e7-config\") pod \"neutron-6f968c55b5-frgnz\" (UID: \"1e273869-5ed1-48c3-af8f-d4d2df61c9e7\") " pod="openstack/neutron-6f968c55b5-frgnz" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.100005 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e273869-5ed1-48c3-af8f-d4d2df61c9e7-combined-ca-bundle\") pod \"neutron-6f968c55b5-frgnz\" (UID: \"1e273869-5ed1-48c3-af8f-d4d2df61c9e7\") " pod="openstack/neutron-6f968c55b5-frgnz" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.100056 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e273869-5ed1-48c3-af8f-d4d2df61c9e7-ovndb-tls-certs\") pod \"neutron-6f968c55b5-frgnz\" (UID: \"1e273869-5ed1-48c3-af8f-d4d2df61c9e7\") " pod="openstack/neutron-6f968c55b5-frgnz" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.100079 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e273869-5ed1-48c3-af8f-d4d2df61c9e7-public-tls-certs\") pod \"neutron-6f968c55b5-frgnz\" (UID: \"1e273869-5ed1-48c3-af8f-d4d2df61c9e7\") " pod="openstack/neutron-6f968c55b5-frgnz" Oct 08 
21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.100096 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e273869-5ed1-48c3-af8f-d4d2df61c9e7-internal-tls-certs\") pod \"neutron-6f968c55b5-frgnz\" (UID: \"1e273869-5ed1-48c3-af8f-d4d2df61c9e7\") " pod="openstack/neutron-6f968c55b5-frgnz" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.100152 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cvz4\" (UniqueName: \"kubernetes.io/projected/1e273869-5ed1-48c3-af8f-d4d2df61c9e7-kube-api-access-5cvz4\") pod \"neutron-6f968c55b5-frgnz\" (UID: \"1e273869-5ed1-48c3-af8f-d4d2df61c9e7\") " pod="openstack/neutron-6f968c55b5-frgnz" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.100172 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1e273869-5ed1-48c3-af8f-d4d2df61c9e7-httpd-config\") pod \"neutron-6f968c55b5-frgnz\" (UID: \"1e273869-5ed1-48c3-af8f-d4d2df61c9e7\") " pod="openstack/neutron-6f968c55b5-frgnz" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.111002 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6f968c55b5-frgnz"] Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.205035 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e273869-5ed1-48c3-af8f-d4d2df61c9e7-ovndb-tls-certs\") pod \"neutron-6f968c55b5-frgnz\" (UID: \"1e273869-5ed1-48c3-af8f-d4d2df61c9e7\") " pod="openstack/neutron-6f968c55b5-frgnz" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.205110 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1e273869-5ed1-48c3-af8f-d4d2df61c9e7-public-tls-certs\") pod \"neutron-6f968c55b5-frgnz\" (UID: \"1e273869-5ed1-48c3-af8f-d4d2df61c9e7\") " pod="openstack/neutron-6f968c55b5-frgnz" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.205139 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e273869-5ed1-48c3-af8f-d4d2df61c9e7-internal-tls-certs\") pod \"neutron-6f968c55b5-frgnz\" (UID: \"1e273869-5ed1-48c3-af8f-d4d2df61c9e7\") " pod="openstack/neutron-6f968c55b5-frgnz" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.205776 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cvz4\" (UniqueName: \"kubernetes.io/projected/1e273869-5ed1-48c3-af8f-d4d2df61c9e7-kube-api-access-5cvz4\") pod \"neutron-6f968c55b5-frgnz\" (UID: \"1e273869-5ed1-48c3-af8f-d4d2df61c9e7\") " pod="openstack/neutron-6f968c55b5-frgnz" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.205840 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1e273869-5ed1-48c3-af8f-d4d2df61c9e7-httpd-config\") pod \"neutron-6f968c55b5-frgnz\" (UID: \"1e273869-5ed1-48c3-af8f-d4d2df61c9e7\") " pod="openstack/neutron-6f968c55b5-frgnz" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.205995 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1e273869-5ed1-48c3-af8f-d4d2df61c9e7-config\") pod \"neutron-6f968c55b5-frgnz\" (UID: \"1e273869-5ed1-48c3-af8f-d4d2df61c9e7\") " pod="openstack/neutron-6f968c55b5-frgnz" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.206031 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e273869-5ed1-48c3-af8f-d4d2df61c9e7-combined-ca-bundle\") pod 
\"neutron-6f968c55b5-frgnz\" (UID: \"1e273869-5ed1-48c3-af8f-d4d2df61c9e7\") " pod="openstack/neutron-6f968c55b5-frgnz" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.209550 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e273869-5ed1-48c3-af8f-d4d2df61c9e7-internal-tls-certs\") pod \"neutron-6f968c55b5-frgnz\" (UID: \"1e273869-5ed1-48c3-af8f-d4d2df61c9e7\") " pod="openstack/neutron-6f968c55b5-frgnz" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.209579 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e273869-5ed1-48c3-af8f-d4d2df61c9e7-ovndb-tls-certs\") pod \"neutron-6f968c55b5-frgnz\" (UID: \"1e273869-5ed1-48c3-af8f-d4d2df61c9e7\") " pod="openstack/neutron-6f968c55b5-frgnz" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.209593 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e273869-5ed1-48c3-af8f-d4d2df61c9e7-public-tls-certs\") pod \"neutron-6f968c55b5-frgnz\" (UID: \"1e273869-5ed1-48c3-af8f-d4d2df61c9e7\") " pod="openstack/neutron-6f968c55b5-frgnz" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.212841 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e273869-5ed1-48c3-af8f-d4d2df61c9e7-combined-ca-bundle\") pod \"neutron-6f968c55b5-frgnz\" (UID: \"1e273869-5ed1-48c3-af8f-d4d2df61c9e7\") " pod="openstack/neutron-6f968c55b5-frgnz" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.214198 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1e273869-5ed1-48c3-af8f-d4d2df61c9e7-httpd-config\") pod \"neutron-6f968c55b5-frgnz\" (UID: \"1e273869-5ed1-48c3-af8f-d4d2df61c9e7\") " pod="openstack/neutron-6f968c55b5-frgnz" Oct 08 
21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.216221 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1e273869-5ed1-48c3-af8f-d4d2df61c9e7-config\") pod \"neutron-6f968c55b5-frgnz\" (UID: \"1e273869-5ed1-48c3-af8f-d4d2df61c9e7\") " pod="openstack/neutron-6f968c55b5-frgnz" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.224676 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cvz4\" (UniqueName: \"kubernetes.io/projected/1e273869-5ed1-48c3-af8f-d4d2df61c9e7-kube-api-access-5cvz4\") pod \"neutron-6f968c55b5-frgnz\" (UID: \"1e273869-5ed1-48c3-af8f-d4d2df61c9e7\") " pod="openstack/neutron-6f968c55b5-frgnz" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.357810 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f8986bb9b-rq2pj" event={"ID":"67dd40b0-efa4-47ba-814f-57dcb053a2d9","Type":"ContainerStarted","Data":"d512918ad4f324d7044a601ca483a03d5044d9fc3339738de53935b6e63d4f7f"} Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.357938 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-f8986bb9b-rq2pj" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.360611 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-845z8" event={"ID":"0f0950ec-bab3-4ed9-bd56-f4f6f13c002b","Type":"ContainerStarted","Data":"0bf883ef6e476d8227ba0e40182826d221469fd4369d347f29aec8430c5f7d94"} Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.360752 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc5c4795-845z8" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.362478 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-p2bl6" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.362475 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-p2bl6" event={"ID":"469707ec-e817-4d42-b406-6595799f6036","Type":"ContainerDied","Data":"641ec32f5da7e530873f54f6d5fccc9f78786edfcd2a87dea594f240eb796da8"} Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.362543 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="641ec32f5da7e530873f54f6d5fccc9f78786edfcd2a87dea594f240eb796da8" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.380431 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-f8986bb9b-rq2pj" podStartSLOduration=3.380386112 podStartE2EDuration="3.380386112s" podCreationTimestamp="2025-10-08 21:01:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:01:24.372587359 +0000 UTC m=+1004.065398032" watchObservedRunningTime="2025-10-08 21:01:24.380386112 +0000 UTC m=+1004.073196785" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.398364 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc5c4795-845z8" podStartSLOduration=3.398346794 podStartE2EDuration="3.398346794s" podCreationTimestamp="2025-10-08 21:01:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:01:24.395079124 +0000 UTC m=+1004.087889807" watchObservedRunningTime="2025-10-08 21:01:24.398346794 +0000 UTC m=+1004.091157467" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.408307 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6cd548c4f4-74w5s" podUID="f67695c6-cc78-4e93-86e4-34b030405e0e" containerName="horizon" probeResult="failure" output="Get 
\"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.425744 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6f968c55b5-frgnz" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.454836 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6f47799b5d-n27h7"] Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.456553 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6f47799b5d-n27h7" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.460306 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.460710 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.461387 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-f4x62" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.461599 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.461614 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.470854 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6f47799b5d-n27h7"] Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.614579 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9076c9d7-726e-4b80-80af-a78887da72d1-combined-ca-bundle\") pod \"placement-6f47799b5d-n27h7\" (UID: 
\"9076c9d7-726e-4b80-80af-a78887da72d1\") " pod="openstack/placement-6f47799b5d-n27h7" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.614955 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9076c9d7-726e-4b80-80af-a78887da72d1-config-data\") pod \"placement-6f47799b5d-n27h7\" (UID: \"9076c9d7-726e-4b80-80af-a78887da72d1\") " pod="openstack/placement-6f47799b5d-n27h7" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.614999 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9076c9d7-726e-4b80-80af-a78887da72d1-logs\") pod \"placement-6f47799b5d-n27h7\" (UID: \"9076c9d7-726e-4b80-80af-a78887da72d1\") " pod="openstack/placement-6f47799b5d-n27h7" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.615024 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwk7q\" (UniqueName: \"kubernetes.io/projected/9076c9d7-726e-4b80-80af-a78887da72d1-kube-api-access-zwk7q\") pod \"placement-6f47799b5d-n27h7\" (UID: \"9076c9d7-726e-4b80-80af-a78887da72d1\") " pod="openstack/placement-6f47799b5d-n27h7" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.615059 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9076c9d7-726e-4b80-80af-a78887da72d1-public-tls-certs\") pod \"placement-6f47799b5d-n27h7\" (UID: \"9076c9d7-726e-4b80-80af-a78887da72d1\") " pod="openstack/placement-6f47799b5d-n27h7" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.615134 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9076c9d7-726e-4b80-80af-a78887da72d1-scripts\") pod \"placement-6f47799b5d-n27h7\" (UID: 
\"9076c9d7-726e-4b80-80af-a78887da72d1\") " pod="openstack/placement-6f47799b5d-n27h7" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.615231 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9076c9d7-726e-4b80-80af-a78887da72d1-internal-tls-certs\") pod \"placement-6f47799b5d-n27h7\" (UID: \"9076c9d7-726e-4b80-80af-a78887da72d1\") " pod="openstack/placement-6f47799b5d-n27h7" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.722588 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9076c9d7-726e-4b80-80af-a78887da72d1-combined-ca-bundle\") pod \"placement-6f47799b5d-n27h7\" (UID: \"9076c9d7-726e-4b80-80af-a78887da72d1\") " pod="openstack/placement-6f47799b5d-n27h7" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.722634 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9076c9d7-726e-4b80-80af-a78887da72d1-config-data\") pod \"placement-6f47799b5d-n27h7\" (UID: \"9076c9d7-726e-4b80-80af-a78887da72d1\") " pod="openstack/placement-6f47799b5d-n27h7" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.722656 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9076c9d7-726e-4b80-80af-a78887da72d1-logs\") pod \"placement-6f47799b5d-n27h7\" (UID: \"9076c9d7-726e-4b80-80af-a78887da72d1\") " pod="openstack/placement-6f47799b5d-n27h7" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.722672 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwk7q\" (UniqueName: \"kubernetes.io/projected/9076c9d7-726e-4b80-80af-a78887da72d1-kube-api-access-zwk7q\") pod \"placement-6f47799b5d-n27h7\" (UID: \"9076c9d7-726e-4b80-80af-a78887da72d1\") " 
pod="openstack/placement-6f47799b5d-n27h7" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.722694 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9076c9d7-726e-4b80-80af-a78887da72d1-public-tls-certs\") pod \"placement-6f47799b5d-n27h7\" (UID: \"9076c9d7-726e-4b80-80af-a78887da72d1\") " pod="openstack/placement-6f47799b5d-n27h7" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.722731 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9076c9d7-726e-4b80-80af-a78887da72d1-scripts\") pod \"placement-6f47799b5d-n27h7\" (UID: \"9076c9d7-726e-4b80-80af-a78887da72d1\") " pod="openstack/placement-6f47799b5d-n27h7" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.722793 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9076c9d7-726e-4b80-80af-a78887da72d1-internal-tls-certs\") pod \"placement-6f47799b5d-n27h7\" (UID: \"9076c9d7-726e-4b80-80af-a78887da72d1\") " pod="openstack/placement-6f47799b5d-n27h7" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.723418 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9076c9d7-726e-4b80-80af-a78887da72d1-logs\") pod \"placement-6f47799b5d-n27h7\" (UID: \"9076c9d7-726e-4b80-80af-a78887da72d1\") " pod="openstack/placement-6f47799b5d-n27h7" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.729362 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9076c9d7-726e-4b80-80af-a78887da72d1-internal-tls-certs\") pod \"placement-6f47799b5d-n27h7\" (UID: \"9076c9d7-726e-4b80-80af-a78887da72d1\") " pod="openstack/placement-6f47799b5d-n27h7" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.729500 4669 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9076c9d7-726e-4b80-80af-a78887da72d1-combined-ca-bundle\") pod \"placement-6f47799b5d-n27h7\" (UID: \"9076c9d7-726e-4b80-80af-a78887da72d1\") " pod="openstack/placement-6f47799b5d-n27h7" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.735983 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9076c9d7-726e-4b80-80af-a78887da72d1-scripts\") pod \"placement-6f47799b5d-n27h7\" (UID: \"9076c9d7-726e-4b80-80af-a78887da72d1\") " pod="openstack/placement-6f47799b5d-n27h7" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.745080 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9076c9d7-726e-4b80-80af-a78887da72d1-config-data\") pod \"placement-6f47799b5d-n27h7\" (UID: \"9076c9d7-726e-4b80-80af-a78887da72d1\") " pod="openstack/placement-6f47799b5d-n27h7" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.745286 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9076c9d7-726e-4b80-80af-a78887da72d1-public-tls-certs\") pod \"placement-6f47799b5d-n27h7\" (UID: \"9076c9d7-726e-4b80-80af-a78887da72d1\") " pod="openstack/placement-6f47799b5d-n27h7" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.813494 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwk7q\" (UniqueName: \"kubernetes.io/projected/9076c9d7-726e-4b80-80af-a78887da72d1-kube-api-access-zwk7q\") pod \"placement-6f47799b5d-n27h7\" (UID: \"9076c9d7-726e-4b80-80af-a78887da72d1\") " pod="openstack/placement-6f47799b5d-n27h7" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.878952 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-rflnq" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.927945 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbdb37fe-4acd-400c-aff5-db2a90f07a32-combined-ca-bundle\") pod \"fbdb37fe-4acd-400c-aff5-db2a90f07a32\" (UID: \"fbdb37fe-4acd-400c-aff5-db2a90f07a32\") " Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.928371 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9rdj\" (UniqueName: \"kubernetes.io/projected/fbdb37fe-4acd-400c-aff5-db2a90f07a32-kube-api-access-g9rdj\") pod \"fbdb37fe-4acd-400c-aff5-db2a90f07a32\" (UID: \"fbdb37fe-4acd-400c-aff5-db2a90f07a32\") " Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.928438 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fbdb37fe-4acd-400c-aff5-db2a90f07a32-db-sync-config-data\") pod \"fbdb37fe-4acd-400c-aff5-db2a90f07a32\" (UID: \"fbdb37fe-4acd-400c-aff5-db2a90f07a32\") " Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.936159 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbdb37fe-4acd-400c-aff5-db2a90f07a32-kube-api-access-g9rdj" (OuterVolumeSpecName: "kube-api-access-g9rdj") pod "fbdb37fe-4acd-400c-aff5-db2a90f07a32" (UID: "fbdb37fe-4acd-400c-aff5-db2a90f07a32"). InnerVolumeSpecName "kube-api-access-g9rdj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.939829 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbdb37fe-4acd-400c-aff5-db2a90f07a32-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "fbdb37fe-4acd-400c-aff5-db2a90f07a32" (UID: "fbdb37fe-4acd-400c-aff5-db2a90f07a32"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:01:24 crc kubenswrapper[4669]: I1008 21:01:24.984972 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbdb37fe-4acd-400c-aff5-db2a90f07a32-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fbdb37fe-4acd-400c-aff5-db2a90f07a32" (UID: "fbdb37fe-4acd-400c-aff5-db2a90f07a32"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.031321 4669 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fbdb37fe-4acd-400c-aff5-db2a90f07a32-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.031351 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbdb37fe-4acd-400c-aff5-db2a90f07a32-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.031361 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9rdj\" (UniqueName: \"kubernetes.io/projected/fbdb37fe-4acd-400c-aff5-db2a90f07a32-kube-api-access-g9rdj\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.083796 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6f47799b5d-n27h7" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.206034 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6f968c55b5-frgnz"] Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.393628 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49676f30-cc4b-4229-b194-f28688f6da28" path="/var/lib/kubelet/pods/49676f30-cc4b-4229-b194-f28688f6da28/volumes" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.405903 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-rflnq" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.415629 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-rflnq" event={"ID":"fbdb37fe-4acd-400c-aff5-db2a90f07a32","Type":"ContainerDied","Data":"4605bdfb92530e52c9993533cb023224eba5b49a1a6d75532e64e0f9a535f0e8"} Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.415687 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4605bdfb92530e52c9993533cb023224eba5b49a1a6d75532e64e0f9a535f0e8" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.545789 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-8596cf69cc-5lmdk"] Oct 08 21:01:25 crc kubenswrapper[4669]: E1008 21:01:25.546257 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbdb37fe-4acd-400c-aff5-db2a90f07a32" containerName="barbican-db-sync" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.546282 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbdb37fe-4acd-400c-aff5-db2a90f07a32" containerName="barbican-db-sync" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.546557 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbdb37fe-4acd-400c-aff5-db2a90f07a32" containerName="barbican-db-sync" Oct 08 21:01:25 crc kubenswrapper[4669]: 
I1008 21:01:25.547725 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-8596cf69cc-5lmdk" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.551063 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-fm2j5" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.551338 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.551449 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.572353 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-69d49dbff8-2tq27"] Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.577147 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-69d49dbff8-2tq27" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.581171 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.589186 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-8596cf69cc-5lmdk"] Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.626590 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-69d49dbff8-2tq27"] Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.646795 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae3a1ea5-4d35-4e2a-b87b-e393bf16b90c-logs\") pod \"barbican-worker-8596cf69cc-5lmdk\" (UID: \"ae3a1ea5-4d35-4e2a-b87b-e393bf16b90c\") " pod="openstack/barbican-worker-8596cf69cc-5lmdk" Oct 08 21:01:25 crc 
kubenswrapper[4669]: I1008 21:01:25.646860 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae3a1ea5-4d35-4e2a-b87b-e393bf16b90c-combined-ca-bundle\") pod \"barbican-worker-8596cf69cc-5lmdk\" (UID: \"ae3a1ea5-4d35-4e2a-b87b-e393bf16b90c\") " pod="openstack/barbican-worker-8596cf69cc-5lmdk" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.646911 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjvjn\" (UniqueName: \"kubernetes.io/projected/ae3a1ea5-4d35-4e2a-b87b-e393bf16b90c-kube-api-access-bjvjn\") pod \"barbican-worker-8596cf69cc-5lmdk\" (UID: \"ae3a1ea5-4d35-4e2a-b87b-e393bf16b90c\") " pod="openstack/barbican-worker-8596cf69cc-5lmdk" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.646935 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1994a50-3452-488f-b364-1b1377cfd62d-logs\") pod \"barbican-keystone-listener-69d49dbff8-2tq27\" (UID: \"b1994a50-3452-488f-b364-1b1377cfd62d\") " pod="openstack/barbican-keystone-listener-69d49dbff8-2tq27" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.646975 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1994a50-3452-488f-b364-1b1377cfd62d-combined-ca-bundle\") pod \"barbican-keystone-listener-69d49dbff8-2tq27\" (UID: \"b1994a50-3452-488f-b364-1b1377cfd62d\") " pod="openstack/barbican-keystone-listener-69d49dbff8-2tq27" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.646996 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4x7k\" (UniqueName: \"kubernetes.io/projected/b1994a50-3452-488f-b364-1b1377cfd62d-kube-api-access-g4x7k\") pod 
\"barbican-keystone-listener-69d49dbff8-2tq27\" (UID: \"b1994a50-3452-488f-b364-1b1377cfd62d\") " pod="openstack/barbican-keystone-listener-69d49dbff8-2tq27" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.647031 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1994a50-3452-488f-b364-1b1377cfd62d-config-data\") pod \"barbican-keystone-listener-69d49dbff8-2tq27\" (UID: \"b1994a50-3452-488f-b364-1b1377cfd62d\") " pod="openstack/barbican-keystone-listener-69d49dbff8-2tq27" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.647054 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae3a1ea5-4d35-4e2a-b87b-e393bf16b90c-config-data\") pod \"barbican-worker-8596cf69cc-5lmdk\" (UID: \"ae3a1ea5-4d35-4e2a-b87b-e393bf16b90c\") " pod="openstack/barbican-worker-8596cf69cc-5lmdk" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.647072 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1994a50-3452-488f-b364-1b1377cfd62d-config-data-custom\") pod \"barbican-keystone-listener-69d49dbff8-2tq27\" (UID: \"b1994a50-3452-488f-b364-1b1377cfd62d\") " pod="openstack/barbican-keystone-listener-69d49dbff8-2tq27" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.647091 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ae3a1ea5-4d35-4e2a-b87b-e393bf16b90c-config-data-custom\") pod \"barbican-worker-8596cf69cc-5lmdk\" (UID: \"ae3a1ea5-4d35-4e2a-b87b-e393bf16b90c\") " pod="openstack/barbican-worker-8596cf69cc-5lmdk" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.707826 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-5ccc5c4795-845z8"] Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.723306 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-tw2fj"] Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.724924 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-tw2fj" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.742782 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-tw2fj"] Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.748750 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1994a50-3452-488f-b364-1b1377cfd62d-combined-ca-bundle\") pod \"barbican-keystone-listener-69d49dbff8-2tq27\" (UID: \"b1994a50-3452-488f-b364-1b1377cfd62d\") " pod="openstack/barbican-keystone-listener-69d49dbff8-2tq27" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.748799 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6cc8710-2436-467e-9f6f-17e820c82294-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-tw2fj\" (UID: \"a6cc8710-2436-467e-9f6f-17e820c82294\") " pod="openstack/dnsmasq-dns-688c87cc99-tw2fj" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.748826 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4x7k\" (UniqueName: \"kubernetes.io/projected/b1994a50-3452-488f-b364-1b1377cfd62d-kube-api-access-g4x7k\") pod \"barbican-keystone-listener-69d49dbff8-2tq27\" (UID: \"b1994a50-3452-488f-b364-1b1377cfd62d\") " pod="openstack/barbican-keystone-listener-69d49dbff8-2tq27" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.748856 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/a6cc8710-2436-467e-9f6f-17e820c82294-dns-svc\") pod \"dnsmasq-dns-688c87cc99-tw2fj\" (UID: \"a6cc8710-2436-467e-9f6f-17e820c82294\") " pod="openstack/dnsmasq-dns-688c87cc99-tw2fj" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.748876 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1994a50-3452-488f-b364-1b1377cfd62d-config-data\") pod \"barbican-keystone-listener-69d49dbff8-2tq27\" (UID: \"b1994a50-3452-488f-b364-1b1377cfd62d\") " pod="openstack/barbican-keystone-listener-69d49dbff8-2tq27" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.748900 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae3a1ea5-4d35-4e2a-b87b-e393bf16b90c-config-data\") pod \"barbican-worker-8596cf69cc-5lmdk\" (UID: \"ae3a1ea5-4d35-4e2a-b87b-e393bf16b90c\") " pod="openstack/barbican-worker-8596cf69cc-5lmdk" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.748917 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1994a50-3452-488f-b364-1b1377cfd62d-config-data-custom\") pod \"barbican-keystone-listener-69d49dbff8-2tq27\" (UID: \"b1994a50-3452-488f-b364-1b1377cfd62d\") " pod="openstack/barbican-keystone-listener-69d49dbff8-2tq27" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.748932 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ae3a1ea5-4d35-4e2a-b87b-e393bf16b90c-config-data-custom\") pod \"barbican-worker-8596cf69cc-5lmdk\" (UID: \"ae3a1ea5-4d35-4e2a-b87b-e393bf16b90c\") " pod="openstack/barbican-worker-8596cf69cc-5lmdk" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.748958 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/ae3a1ea5-4d35-4e2a-b87b-e393bf16b90c-logs\") pod \"barbican-worker-8596cf69cc-5lmdk\" (UID: \"ae3a1ea5-4d35-4e2a-b87b-e393bf16b90c\") " pod="openstack/barbican-worker-8596cf69cc-5lmdk" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.748975 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6cc8710-2436-467e-9f6f-17e820c82294-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-tw2fj\" (UID: \"a6cc8710-2436-467e-9f6f-17e820c82294\") " pod="openstack/dnsmasq-dns-688c87cc99-tw2fj" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.748998 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6cc8710-2436-467e-9f6f-17e820c82294-config\") pod \"dnsmasq-dns-688c87cc99-tw2fj\" (UID: \"a6cc8710-2436-467e-9f6f-17e820c82294\") " pod="openstack/dnsmasq-dns-688c87cc99-tw2fj" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.749023 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae3a1ea5-4d35-4e2a-b87b-e393bf16b90c-combined-ca-bundle\") pod \"barbican-worker-8596cf69cc-5lmdk\" (UID: \"ae3a1ea5-4d35-4e2a-b87b-e393bf16b90c\") " pod="openstack/barbican-worker-8596cf69cc-5lmdk" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.749082 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjvjn\" (UniqueName: \"kubernetes.io/projected/ae3a1ea5-4d35-4e2a-b87b-e393bf16b90c-kube-api-access-bjvjn\") pod \"barbican-worker-8596cf69cc-5lmdk\" (UID: \"ae3a1ea5-4d35-4e2a-b87b-e393bf16b90c\") " pod="openstack/barbican-worker-8596cf69cc-5lmdk" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.749104 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/b1994a50-3452-488f-b364-1b1377cfd62d-logs\") pod \"barbican-keystone-listener-69d49dbff8-2tq27\" (UID: \"b1994a50-3452-488f-b364-1b1377cfd62d\") " pod="openstack/barbican-keystone-listener-69d49dbff8-2tq27" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.749123 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68xz8\" (UniqueName: \"kubernetes.io/projected/a6cc8710-2436-467e-9f6f-17e820c82294-kube-api-access-68xz8\") pod \"dnsmasq-dns-688c87cc99-tw2fj\" (UID: \"a6cc8710-2436-467e-9f6f-17e820c82294\") " pod="openstack/dnsmasq-dns-688c87cc99-tw2fj" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.749138 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a6cc8710-2436-467e-9f6f-17e820c82294-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-tw2fj\" (UID: \"a6cc8710-2436-467e-9f6f-17e820c82294\") " pod="openstack/dnsmasq-dns-688c87cc99-tw2fj" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.753853 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae3a1ea5-4d35-4e2a-b87b-e393bf16b90c-logs\") pod \"barbican-worker-8596cf69cc-5lmdk\" (UID: \"ae3a1ea5-4d35-4e2a-b87b-e393bf16b90c\") " pod="openstack/barbican-worker-8596cf69cc-5lmdk" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.754180 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1994a50-3452-488f-b364-1b1377cfd62d-logs\") pod \"barbican-keystone-listener-69d49dbff8-2tq27\" (UID: \"b1994a50-3452-488f-b364-1b1377cfd62d\") " pod="openstack/barbican-keystone-listener-69d49dbff8-2tq27" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.757260 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1994a50-3452-488f-b364-1b1377cfd62d-combined-ca-bundle\") pod \"barbican-keystone-listener-69d49dbff8-2tq27\" (UID: \"b1994a50-3452-488f-b364-1b1377cfd62d\") " pod="openstack/barbican-keystone-listener-69d49dbff8-2tq27" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.757671 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ae3a1ea5-4d35-4e2a-b87b-e393bf16b90c-config-data-custom\") pod \"barbican-worker-8596cf69cc-5lmdk\" (UID: \"ae3a1ea5-4d35-4e2a-b87b-e393bf16b90c\") " pod="openstack/barbican-worker-8596cf69cc-5lmdk" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.765363 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-cbf777bfb-dfzsv"] Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.766819 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-cbf777bfb-dfzsv" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.770807 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.772716 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae3a1ea5-4d35-4e2a-b87b-e393bf16b90c-combined-ca-bundle\") pod \"barbican-worker-8596cf69cc-5lmdk\" (UID: \"ae3a1ea5-4d35-4e2a-b87b-e393bf16b90c\") " pod="openstack/barbican-worker-8596cf69cc-5lmdk" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.774030 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1994a50-3452-488f-b364-1b1377cfd62d-config-data\") pod \"barbican-keystone-listener-69d49dbff8-2tq27\" (UID: \"b1994a50-3452-488f-b364-1b1377cfd62d\") " pod="openstack/barbican-keystone-listener-69d49dbff8-2tq27" Oct 08 21:01:25 
crc kubenswrapper[4669]: I1008 21:01:25.790920 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-cbf777bfb-dfzsv"] Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.791438 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4x7k\" (UniqueName: \"kubernetes.io/projected/b1994a50-3452-488f-b364-1b1377cfd62d-kube-api-access-g4x7k\") pod \"barbican-keystone-listener-69d49dbff8-2tq27\" (UID: \"b1994a50-3452-488f-b364-1b1377cfd62d\") " pod="openstack/barbican-keystone-listener-69d49dbff8-2tq27" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.792963 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae3a1ea5-4d35-4e2a-b87b-e393bf16b90c-config-data\") pod \"barbican-worker-8596cf69cc-5lmdk\" (UID: \"ae3a1ea5-4d35-4e2a-b87b-e393bf16b90c\") " pod="openstack/barbican-worker-8596cf69cc-5lmdk" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.795936 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjvjn\" (UniqueName: \"kubernetes.io/projected/ae3a1ea5-4d35-4e2a-b87b-e393bf16b90c-kube-api-access-bjvjn\") pod \"barbican-worker-8596cf69cc-5lmdk\" (UID: \"ae3a1ea5-4d35-4e2a-b87b-e393bf16b90c\") " pod="openstack/barbican-worker-8596cf69cc-5lmdk" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.829647 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1994a50-3452-488f-b364-1b1377cfd62d-config-data-custom\") pod \"barbican-keystone-listener-69d49dbff8-2tq27\" (UID: \"b1994a50-3452-488f-b364-1b1377cfd62d\") " pod="openstack/barbican-keystone-listener-69d49dbff8-2tq27" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.849788 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/a6cc8710-2436-467e-9f6f-17e820c82294-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-tw2fj\" (UID: \"a6cc8710-2436-467e-9f6f-17e820c82294\") " pod="openstack/dnsmasq-dns-688c87cc99-tw2fj" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.849833 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6cc8710-2436-467e-9f6f-17e820c82294-config\") pod \"dnsmasq-dns-688c87cc99-tw2fj\" (UID: \"a6cc8710-2436-467e-9f6f-17e820c82294\") " pod="openstack/dnsmasq-dns-688c87cc99-tw2fj" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.849872 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/26b17f76-0da4-4221-958d-24182a0a2c90-config-data-custom\") pod \"barbican-api-cbf777bfb-dfzsv\" (UID: \"26b17f76-0da4-4221-958d-24182a0a2c90\") " pod="openstack/barbican-api-cbf777bfb-dfzsv" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.849905 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26b17f76-0da4-4221-958d-24182a0a2c90-logs\") pod \"barbican-api-cbf777bfb-dfzsv\" (UID: \"26b17f76-0da4-4221-958d-24182a0a2c90\") " pod="openstack/barbican-api-cbf777bfb-dfzsv" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.849935 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68xz8\" (UniqueName: \"kubernetes.io/projected/a6cc8710-2436-467e-9f6f-17e820c82294-kube-api-access-68xz8\") pod \"dnsmasq-dns-688c87cc99-tw2fj\" (UID: \"a6cc8710-2436-467e-9f6f-17e820c82294\") " pod="openstack/dnsmasq-dns-688c87cc99-tw2fj" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.849951 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/a6cc8710-2436-467e-9f6f-17e820c82294-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-tw2fj\" (UID: \"a6cc8710-2436-467e-9f6f-17e820c82294\") " pod="openstack/dnsmasq-dns-688c87cc99-tw2fj" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.849990 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6cc8710-2436-467e-9f6f-17e820c82294-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-tw2fj\" (UID: \"a6cc8710-2436-467e-9f6f-17e820c82294\") " pod="openstack/dnsmasq-dns-688c87cc99-tw2fj" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.850008 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26b17f76-0da4-4221-958d-24182a0a2c90-combined-ca-bundle\") pod \"barbican-api-cbf777bfb-dfzsv\" (UID: \"26b17f76-0da4-4221-958d-24182a0a2c90\") " pod="openstack/barbican-api-cbf777bfb-dfzsv" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.850025 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gkzn\" (UniqueName: \"kubernetes.io/projected/26b17f76-0da4-4221-958d-24182a0a2c90-kube-api-access-9gkzn\") pod \"barbican-api-cbf777bfb-dfzsv\" (UID: \"26b17f76-0da4-4221-958d-24182a0a2c90\") " pod="openstack/barbican-api-cbf777bfb-dfzsv" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.850040 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26b17f76-0da4-4221-958d-24182a0a2c90-config-data\") pod \"barbican-api-cbf777bfb-dfzsv\" (UID: \"26b17f76-0da4-4221-958d-24182a0a2c90\") " pod="openstack/barbican-api-cbf777bfb-dfzsv" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.850063 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/a6cc8710-2436-467e-9f6f-17e820c82294-dns-svc\") pod \"dnsmasq-dns-688c87cc99-tw2fj\" (UID: \"a6cc8710-2436-467e-9f6f-17e820c82294\") " pod="openstack/dnsmasq-dns-688c87cc99-tw2fj" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.851553 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6cc8710-2436-467e-9f6f-17e820c82294-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-tw2fj\" (UID: \"a6cc8710-2436-467e-9f6f-17e820c82294\") " pod="openstack/dnsmasq-dns-688c87cc99-tw2fj" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.852615 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6cc8710-2436-467e-9f6f-17e820c82294-config\") pod \"dnsmasq-dns-688c87cc99-tw2fj\" (UID: \"a6cc8710-2436-467e-9f6f-17e820c82294\") " pod="openstack/dnsmasq-dns-688c87cc99-tw2fj" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.852928 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a6cc8710-2436-467e-9f6f-17e820c82294-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-tw2fj\" (UID: \"a6cc8710-2436-467e-9f6f-17e820c82294\") " pod="openstack/dnsmasq-dns-688c87cc99-tw2fj" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.853427 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6cc8710-2436-467e-9f6f-17e820c82294-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-tw2fj\" (UID: \"a6cc8710-2436-467e-9f6f-17e820c82294\") " pod="openstack/dnsmasq-dns-688c87cc99-tw2fj" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.853769 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6cc8710-2436-467e-9f6f-17e820c82294-dns-svc\") pod 
\"dnsmasq-dns-688c87cc99-tw2fj\" (UID: \"a6cc8710-2436-467e-9f6f-17e820c82294\") " pod="openstack/dnsmasq-dns-688c87cc99-tw2fj" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.886349 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68xz8\" (UniqueName: \"kubernetes.io/projected/a6cc8710-2436-467e-9f6f-17e820c82294-kube-api-access-68xz8\") pod \"dnsmasq-dns-688c87cc99-tw2fj\" (UID: \"a6cc8710-2436-467e-9f6f-17e820c82294\") " pod="openstack/dnsmasq-dns-688c87cc99-tw2fj" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.924277 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-tw2fj" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.937055 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-8596cf69cc-5lmdk" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.950965 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26b17f76-0da4-4221-958d-24182a0a2c90-logs\") pod \"barbican-api-cbf777bfb-dfzsv\" (UID: \"26b17f76-0da4-4221-958d-24182a0a2c90\") " pod="openstack/barbican-api-cbf777bfb-dfzsv" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.951066 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26b17f76-0da4-4221-958d-24182a0a2c90-combined-ca-bundle\") pod \"barbican-api-cbf777bfb-dfzsv\" (UID: \"26b17f76-0da4-4221-958d-24182a0a2c90\") " pod="openstack/barbican-api-cbf777bfb-dfzsv" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.951090 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gkzn\" (UniqueName: \"kubernetes.io/projected/26b17f76-0da4-4221-958d-24182a0a2c90-kube-api-access-9gkzn\") pod \"barbican-api-cbf777bfb-dfzsv\" (UID: 
\"26b17f76-0da4-4221-958d-24182a0a2c90\") " pod="openstack/barbican-api-cbf777bfb-dfzsv" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.951104 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26b17f76-0da4-4221-958d-24182a0a2c90-config-data\") pod \"barbican-api-cbf777bfb-dfzsv\" (UID: \"26b17f76-0da4-4221-958d-24182a0a2c90\") " pod="openstack/barbican-api-cbf777bfb-dfzsv" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.951201 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/26b17f76-0da4-4221-958d-24182a0a2c90-config-data-custom\") pod \"barbican-api-cbf777bfb-dfzsv\" (UID: \"26b17f76-0da4-4221-958d-24182a0a2c90\") " pod="openstack/barbican-api-cbf777bfb-dfzsv" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.954497 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26b17f76-0da4-4221-958d-24182a0a2c90-logs\") pod \"barbican-api-cbf777bfb-dfzsv\" (UID: \"26b17f76-0da4-4221-958d-24182a0a2c90\") " pod="openstack/barbican-api-cbf777bfb-dfzsv" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.956449 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/26b17f76-0da4-4221-958d-24182a0a2c90-config-data-custom\") pod \"barbican-api-cbf777bfb-dfzsv\" (UID: \"26b17f76-0da4-4221-958d-24182a0a2c90\") " pod="openstack/barbican-api-cbf777bfb-dfzsv" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.958297 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-69d49dbff8-2tq27" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.970930 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26b17f76-0da4-4221-958d-24182a0a2c90-combined-ca-bundle\") pod \"barbican-api-cbf777bfb-dfzsv\" (UID: \"26b17f76-0da4-4221-958d-24182a0a2c90\") " pod="openstack/barbican-api-cbf777bfb-dfzsv" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.981319 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gkzn\" (UniqueName: \"kubernetes.io/projected/26b17f76-0da4-4221-958d-24182a0a2c90-kube-api-access-9gkzn\") pod \"barbican-api-cbf777bfb-dfzsv\" (UID: \"26b17f76-0da4-4221-958d-24182a0a2c90\") " pod="openstack/barbican-api-cbf777bfb-dfzsv" Oct 08 21:01:25 crc kubenswrapper[4669]: I1008 21:01:25.982837 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26b17f76-0da4-4221-958d-24182a0a2c90-config-data\") pod \"barbican-api-cbf777bfb-dfzsv\" (UID: \"26b17f76-0da4-4221-958d-24182a0a2c90\") " pod="openstack/barbican-api-cbf777bfb-dfzsv" Oct 08 21:01:26 crc kubenswrapper[4669]: I1008 21:01:26.243115 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-cbf777bfb-dfzsv" Oct 08 21:01:26 crc kubenswrapper[4669]: I1008 21:01:26.413300 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc5c4795-845z8" podUID="0f0950ec-bab3-4ed9-bd56-f4f6f13c002b" containerName="dnsmasq-dns" containerID="cri-o://0bf883ef6e476d8227ba0e40182826d221469fd4369d347f29aec8430c5f7d94" gracePeriod=10 Oct 08 21:01:27 crc kubenswrapper[4669]: I1008 21:01:27.425935 4669 generic.go:334] "Generic (PLEG): container finished" podID="0f0950ec-bab3-4ed9-bd56-f4f6f13c002b" containerID="0bf883ef6e476d8227ba0e40182826d221469fd4369d347f29aec8430c5f7d94" exitCode=0 Oct 08 21:01:27 crc kubenswrapper[4669]: I1008 21:01:27.426120 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-845z8" event={"ID":"0f0950ec-bab3-4ed9-bd56-f4f6f13c002b","Type":"ContainerDied","Data":"0bf883ef6e476d8227ba0e40182826d221469fd4369d347f29aec8430c5f7d94"} Oct 08 21:01:28 crc kubenswrapper[4669]: I1008 21:01:28.161041 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5fbf49878d-2r5dt"] Oct 08 21:01:28 crc kubenswrapper[4669]: I1008 21:01:28.164057 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5fbf49878d-2r5dt" Oct 08 21:01:28 crc kubenswrapper[4669]: I1008 21:01:28.175566 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5fbf49878d-2r5dt"] Oct 08 21:01:28 crc kubenswrapper[4669]: I1008 21:01:28.180345 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 08 21:01:28 crc kubenswrapper[4669]: I1008 21:01:28.180592 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 08 21:01:28 crc kubenswrapper[4669]: I1008 21:01:28.297545 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6311b31e-a85f-4bc0-9c1a-254c1650ef17-combined-ca-bundle\") pod \"barbican-api-5fbf49878d-2r5dt\" (UID: \"6311b31e-a85f-4bc0-9c1a-254c1650ef17\") " pod="openstack/barbican-api-5fbf49878d-2r5dt" Oct 08 21:01:28 crc kubenswrapper[4669]: I1008 21:01:28.297597 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj5rr\" (UniqueName: \"kubernetes.io/projected/6311b31e-a85f-4bc0-9c1a-254c1650ef17-kube-api-access-lj5rr\") pod \"barbican-api-5fbf49878d-2r5dt\" (UID: \"6311b31e-a85f-4bc0-9c1a-254c1650ef17\") " pod="openstack/barbican-api-5fbf49878d-2r5dt" Oct 08 21:01:28 crc kubenswrapper[4669]: I1008 21:01:28.297620 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6311b31e-a85f-4bc0-9c1a-254c1650ef17-logs\") pod \"barbican-api-5fbf49878d-2r5dt\" (UID: \"6311b31e-a85f-4bc0-9c1a-254c1650ef17\") " pod="openstack/barbican-api-5fbf49878d-2r5dt" Oct 08 21:01:28 crc kubenswrapper[4669]: I1008 21:01:28.297674 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/6311b31e-a85f-4bc0-9c1a-254c1650ef17-public-tls-certs\") pod \"barbican-api-5fbf49878d-2r5dt\" (UID: \"6311b31e-a85f-4bc0-9c1a-254c1650ef17\") " pod="openstack/barbican-api-5fbf49878d-2r5dt" Oct 08 21:01:28 crc kubenswrapper[4669]: I1008 21:01:28.297705 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6311b31e-a85f-4bc0-9c1a-254c1650ef17-config-data\") pod \"barbican-api-5fbf49878d-2r5dt\" (UID: \"6311b31e-a85f-4bc0-9c1a-254c1650ef17\") " pod="openstack/barbican-api-5fbf49878d-2r5dt" Oct 08 21:01:28 crc kubenswrapper[4669]: I1008 21:01:28.297724 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6311b31e-a85f-4bc0-9c1a-254c1650ef17-config-data-custom\") pod \"barbican-api-5fbf49878d-2r5dt\" (UID: \"6311b31e-a85f-4bc0-9c1a-254c1650ef17\") " pod="openstack/barbican-api-5fbf49878d-2r5dt" Oct 08 21:01:28 crc kubenswrapper[4669]: I1008 21:01:28.297752 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6311b31e-a85f-4bc0-9c1a-254c1650ef17-internal-tls-certs\") pod \"barbican-api-5fbf49878d-2r5dt\" (UID: \"6311b31e-a85f-4bc0-9c1a-254c1650ef17\") " pod="openstack/barbican-api-5fbf49878d-2r5dt" Oct 08 21:01:28 crc kubenswrapper[4669]: I1008 21:01:28.399300 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6311b31e-a85f-4bc0-9c1a-254c1650ef17-config-data\") pod \"barbican-api-5fbf49878d-2r5dt\" (UID: \"6311b31e-a85f-4bc0-9c1a-254c1650ef17\") " pod="openstack/barbican-api-5fbf49878d-2r5dt" Oct 08 21:01:28 crc kubenswrapper[4669]: I1008 21:01:28.399350 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6311b31e-a85f-4bc0-9c1a-254c1650ef17-config-data-custom\") pod \"barbican-api-5fbf49878d-2r5dt\" (UID: \"6311b31e-a85f-4bc0-9c1a-254c1650ef17\") " pod="openstack/barbican-api-5fbf49878d-2r5dt" Oct 08 21:01:28 crc kubenswrapper[4669]: I1008 21:01:28.399382 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6311b31e-a85f-4bc0-9c1a-254c1650ef17-internal-tls-certs\") pod \"barbican-api-5fbf49878d-2r5dt\" (UID: \"6311b31e-a85f-4bc0-9c1a-254c1650ef17\") " pod="openstack/barbican-api-5fbf49878d-2r5dt" Oct 08 21:01:28 crc kubenswrapper[4669]: I1008 21:01:28.399454 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6311b31e-a85f-4bc0-9c1a-254c1650ef17-combined-ca-bundle\") pod \"barbican-api-5fbf49878d-2r5dt\" (UID: \"6311b31e-a85f-4bc0-9c1a-254c1650ef17\") " pod="openstack/barbican-api-5fbf49878d-2r5dt" Oct 08 21:01:28 crc kubenswrapper[4669]: I1008 21:01:28.399477 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lj5rr\" (UniqueName: \"kubernetes.io/projected/6311b31e-a85f-4bc0-9c1a-254c1650ef17-kube-api-access-lj5rr\") pod \"barbican-api-5fbf49878d-2r5dt\" (UID: \"6311b31e-a85f-4bc0-9c1a-254c1650ef17\") " pod="openstack/barbican-api-5fbf49878d-2r5dt" Oct 08 21:01:28 crc kubenswrapper[4669]: I1008 21:01:28.399495 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6311b31e-a85f-4bc0-9c1a-254c1650ef17-logs\") pod \"barbican-api-5fbf49878d-2r5dt\" (UID: \"6311b31e-a85f-4bc0-9c1a-254c1650ef17\") " pod="openstack/barbican-api-5fbf49878d-2r5dt" Oct 08 21:01:28 crc kubenswrapper[4669]: I1008 21:01:28.399557 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6311b31e-a85f-4bc0-9c1a-254c1650ef17-public-tls-certs\") pod \"barbican-api-5fbf49878d-2r5dt\" (UID: \"6311b31e-a85f-4bc0-9c1a-254c1650ef17\") " pod="openstack/barbican-api-5fbf49878d-2r5dt" Oct 08 21:01:28 crc kubenswrapper[4669]: I1008 21:01:28.402668 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6311b31e-a85f-4bc0-9c1a-254c1650ef17-logs\") pod \"barbican-api-5fbf49878d-2r5dt\" (UID: \"6311b31e-a85f-4bc0-9c1a-254c1650ef17\") " pod="openstack/barbican-api-5fbf49878d-2r5dt" Oct 08 21:01:28 crc kubenswrapper[4669]: I1008 21:01:28.405575 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6311b31e-a85f-4bc0-9c1a-254c1650ef17-public-tls-certs\") pod \"barbican-api-5fbf49878d-2r5dt\" (UID: \"6311b31e-a85f-4bc0-9c1a-254c1650ef17\") " pod="openstack/barbican-api-5fbf49878d-2r5dt" Oct 08 21:01:28 crc kubenswrapper[4669]: I1008 21:01:28.406415 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6311b31e-a85f-4bc0-9c1a-254c1650ef17-internal-tls-certs\") pod \"barbican-api-5fbf49878d-2r5dt\" (UID: \"6311b31e-a85f-4bc0-9c1a-254c1650ef17\") " pod="openstack/barbican-api-5fbf49878d-2r5dt" Oct 08 21:01:28 crc kubenswrapper[4669]: I1008 21:01:28.406675 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6311b31e-a85f-4bc0-9c1a-254c1650ef17-combined-ca-bundle\") pod \"barbican-api-5fbf49878d-2r5dt\" (UID: \"6311b31e-a85f-4bc0-9c1a-254c1650ef17\") " pod="openstack/barbican-api-5fbf49878d-2r5dt" Oct 08 21:01:28 crc kubenswrapper[4669]: I1008 21:01:28.406967 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6311b31e-a85f-4bc0-9c1a-254c1650ef17-config-data-custom\") pod 
\"barbican-api-5fbf49878d-2r5dt\" (UID: \"6311b31e-a85f-4bc0-9c1a-254c1650ef17\") " pod="openstack/barbican-api-5fbf49878d-2r5dt" Oct 08 21:01:28 crc kubenswrapper[4669]: I1008 21:01:28.407299 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6311b31e-a85f-4bc0-9c1a-254c1650ef17-config-data\") pod \"barbican-api-5fbf49878d-2r5dt\" (UID: \"6311b31e-a85f-4bc0-9c1a-254c1650ef17\") " pod="openstack/barbican-api-5fbf49878d-2r5dt" Oct 08 21:01:28 crc kubenswrapper[4669]: I1008 21:01:28.424378 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj5rr\" (UniqueName: \"kubernetes.io/projected/6311b31e-a85f-4bc0-9c1a-254c1650ef17-kube-api-access-lj5rr\") pod \"barbican-api-5fbf49878d-2r5dt\" (UID: \"6311b31e-a85f-4bc0-9c1a-254c1650ef17\") " pod="openstack/barbican-api-5fbf49878d-2r5dt" Oct 08 21:01:28 crc kubenswrapper[4669]: I1008 21:01:28.485813 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5fbf49878d-2r5dt" Oct 08 21:01:29 crc kubenswrapper[4669]: W1008 21:01:29.868354 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e273869_5ed1_48c3_af8f_d4d2df61c9e7.slice/crio-128b844a68c8749f9495f7e4129bf6aeeb4b335f75513c01779cc86aa1e6491a WatchSource:0}: Error finding container 128b844a68c8749f9495f7e4129bf6aeeb4b335f75513c01779cc86aa1e6491a: Status 404 returned error can't find the container with id 128b844a68c8749f9495f7e4129bf6aeeb4b335f75513c01779cc86aa1e6491a Oct 08 21:01:30 crc kubenswrapper[4669]: I1008 21:01:30.460286 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f968c55b5-frgnz" event={"ID":"1e273869-5ed1-48c3-af8f-d4d2df61c9e7","Type":"ContainerStarted","Data":"128b844a68c8749f9495f7e4129bf6aeeb4b335f75513c01779cc86aa1e6491a"} Oct 08 21:01:30 crc kubenswrapper[4669]: I1008 21:01:30.635575 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-845z8" Oct 08 21:01:30 crc kubenswrapper[4669]: I1008 21:01:30.746245 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f0950ec-bab3-4ed9-bd56-f4f6f13c002b-ovsdbserver-nb\") pod \"0f0950ec-bab3-4ed9-bd56-f4f6f13c002b\" (UID: \"0f0950ec-bab3-4ed9-bd56-f4f6f13c002b\") " Oct 08 21:01:30 crc kubenswrapper[4669]: I1008 21:01:30.746380 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f0950ec-bab3-4ed9-bd56-f4f6f13c002b-dns-svc\") pod \"0f0950ec-bab3-4ed9-bd56-f4f6f13c002b\" (UID: \"0f0950ec-bab3-4ed9-bd56-f4f6f13c002b\") " Oct 08 21:01:30 crc kubenswrapper[4669]: I1008 21:01:30.746399 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0f0950ec-bab3-4ed9-bd56-f4f6f13c002b-dns-swift-storage-0\") pod \"0f0950ec-bab3-4ed9-bd56-f4f6f13c002b\" (UID: \"0f0950ec-bab3-4ed9-bd56-f4f6f13c002b\") " Oct 08 21:01:30 crc kubenswrapper[4669]: I1008 21:01:30.746431 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f0950ec-bab3-4ed9-bd56-f4f6f13c002b-config\") pod \"0f0950ec-bab3-4ed9-bd56-f4f6f13c002b\" (UID: \"0f0950ec-bab3-4ed9-bd56-f4f6f13c002b\") " Oct 08 21:01:30 crc kubenswrapper[4669]: I1008 21:01:30.746477 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srfzh\" (UniqueName: \"kubernetes.io/projected/0f0950ec-bab3-4ed9-bd56-f4f6f13c002b-kube-api-access-srfzh\") pod \"0f0950ec-bab3-4ed9-bd56-f4f6f13c002b\" (UID: \"0f0950ec-bab3-4ed9-bd56-f4f6f13c002b\") " Oct 08 21:01:30 crc kubenswrapper[4669]: I1008 21:01:30.746640 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/0f0950ec-bab3-4ed9-bd56-f4f6f13c002b-ovsdbserver-sb\") pod \"0f0950ec-bab3-4ed9-bd56-f4f6f13c002b\" (UID: \"0f0950ec-bab3-4ed9-bd56-f4f6f13c002b\") " Oct 08 21:01:30 crc kubenswrapper[4669]: I1008 21:01:30.771818 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f0950ec-bab3-4ed9-bd56-f4f6f13c002b-kube-api-access-srfzh" (OuterVolumeSpecName: "kube-api-access-srfzh") pod "0f0950ec-bab3-4ed9-bd56-f4f6f13c002b" (UID: "0f0950ec-bab3-4ed9-bd56-f4f6f13c002b"). InnerVolumeSpecName "kube-api-access-srfzh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:01:30 crc kubenswrapper[4669]: I1008 21:01:30.852457 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srfzh\" (UniqueName: \"kubernetes.io/projected/0f0950ec-bab3-4ed9-bd56-f4f6f13c002b-kube-api-access-srfzh\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:30 crc kubenswrapper[4669]: I1008 21:01:30.858915 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f0950ec-bab3-4ed9-bd56-f4f6f13c002b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0f0950ec-bab3-4ed9-bd56-f4f6f13c002b" (UID: "0f0950ec-bab3-4ed9-bd56-f4f6f13c002b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:01:30 crc kubenswrapper[4669]: I1008 21:01:30.865914 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f0950ec-bab3-4ed9-bd56-f4f6f13c002b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0f0950ec-bab3-4ed9-bd56-f4f6f13c002b" (UID: "0f0950ec-bab3-4ed9-bd56-f4f6f13c002b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:01:30 crc kubenswrapper[4669]: I1008 21:01:30.871696 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f0950ec-bab3-4ed9-bd56-f4f6f13c002b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0f0950ec-bab3-4ed9-bd56-f4f6f13c002b" (UID: "0f0950ec-bab3-4ed9-bd56-f4f6f13c002b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:01:30 crc kubenswrapper[4669]: I1008 21:01:30.874965 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f0950ec-bab3-4ed9-bd56-f4f6f13c002b-config" (OuterVolumeSpecName: "config") pod "0f0950ec-bab3-4ed9-bd56-f4f6f13c002b" (UID: "0f0950ec-bab3-4ed9-bd56-f4f6f13c002b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:01:30 crc kubenswrapper[4669]: I1008 21:01:30.882941 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f0950ec-bab3-4ed9-bd56-f4f6f13c002b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0f0950ec-bab3-4ed9-bd56-f4f6f13c002b" (UID: "0f0950ec-bab3-4ed9-bd56-f4f6f13c002b"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:01:30 crc kubenswrapper[4669]: I1008 21:01:30.957113 4669 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f0950ec-bab3-4ed9-bd56-f4f6f13c002b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:30 crc kubenswrapper[4669]: I1008 21:01:30.957158 4669 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f0950ec-bab3-4ed9-bd56-f4f6f13c002b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:30 crc kubenswrapper[4669]: I1008 21:01:30.957171 4669 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f0950ec-bab3-4ed9-bd56-f4f6f13c002b-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:30 crc kubenswrapper[4669]: I1008 21:01:30.957182 4669 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0f0950ec-bab3-4ed9-bd56-f4f6f13c002b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:30 crc kubenswrapper[4669]: I1008 21:01:30.957195 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f0950ec-bab3-4ed9-bd56-f4f6f13c002b-config\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:30 crc kubenswrapper[4669]: I1008 21:01:30.966152 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5fbf49878d-2r5dt"] Oct 08 21:01:31 crc kubenswrapper[4669]: I1008 21:01:31.070587 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-69d49dbff8-2tq27"] Oct 08 21:01:31 crc kubenswrapper[4669]: W1008 21:01:31.075413 4669 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1994a50_3452_488f_b364_1b1377cfd62d.slice/crio-cf64205e5a1630fe4ceac9ee4c9769b1a26a433d1b476950026ad2e0c752ef8e WatchSource:0}: Error finding container cf64205e5a1630fe4ceac9ee4c9769b1a26a433d1b476950026ad2e0c752ef8e: Status 404 returned error can't find the container with id cf64205e5a1630fe4ceac9ee4c9769b1a26a433d1b476950026ad2e0c752ef8e Oct 08 21:01:31 crc kubenswrapper[4669]: I1008 21:01:31.239450 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6f47799b5d-n27h7"] Oct 08 21:01:31 crc kubenswrapper[4669]: W1008 21:01:31.250462 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9076c9d7_726e_4b80_80af_a78887da72d1.slice/crio-9d26445908a733b8796effab46cd520d9e93225535a53fef21e8e0743b50e078 WatchSource:0}: Error finding container 9d26445908a733b8796effab46cd520d9e93225535a53fef21e8e0743b50e078: Status 404 returned error can't find the container with id 9d26445908a733b8796effab46cd520d9e93225535a53fef21e8e0743b50e078 Oct 08 21:01:31 crc kubenswrapper[4669]: I1008 21:01:31.257495 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-cbf777bfb-dfzsv"] Oct 08 21:01:31 crc kubenswrapper[4669]: I1008 21:01:31.280231 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-tw2fj"] Oct 08 21:01:31 crc kubenswrapper[4669]: I1008 21:01:31.393993 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-8596cf69cc-5lmdk"] Oct 08 21:01:31 crc kubenswrapper[4669]: I1008 21:01:31.476670 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5fbf49878d-2r5dt" event={"ID":"6311b31e-a85f-4bc0-9c1a-254c1650ef17","Type":"ContainerStarted","Data":"fa407f0b67a45d34d653ff0a64fbd55ca7ff2e205f478cbd06007f9f9e3081ed"} Oct 08 21:01:31 crc kubenswrapper[4669]: I1008 21:01:31.478959 4669 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-845z8" Oct 08 21:01:31 crc kubenswrapper[4669]: I1008 21:01:31.478962 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-845z8" event={"ID":"0f0950ec-bab3-4ed9-bd56-f4f6f13c002b","Type":"ContainerDied","Data":"d1b794ddf004edc05f95b8a017eac230133e5d7b7ecb489c96cf66f12a66f772"} Oct 08 21:01:31 crc kubenswrapper[4669]: I1008 21:01:31.479008 4669 scope.go:117] "RemoveContainer" containerID="0bf883ef6e476d8227ba0e40182826d221469fd4369d347f29aec8430c5f7d94" Oct 08 21:01:31 crc kubenswrapper[4669]: I1008 21:01:31.481421 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6f47799b5d-n27h7" event={"ID":"9076c9d7-726e-4b80-80af-a78887da72d1","Type":"ContainerStarted","Data":"9d26445908a733b8796effab46cd520d9e93225535a53fef21e8e0743b50e078"} Oct 08 21:01:31 crc kubenswrapper[4669]: I1008 21:01:31.485344 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-69d49dbff8-2tq27" event={"ID":"b1994a50-3452-488f-b364-1b1377cfd62d","Type":"ContainerStarted","Data":"cf64205e5a1630fe4ceac9ee4c9769b1a26a433d1b476950026ad2e0c752ef8e"} Oct 08 21:01:31 crc kubenswrapper[4669]: I1008 21:01:31.487499 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-tw2fj" event={"ID":"a6cc8710-2436-467e-9f6f-17e820c82294","Type":"ContainerStarted","Data":"8dc5f4ba48c4da59d0243f9d77642f40b77682af891f53903e93841e13c9159f"} Oct 08 21:01:31 crc kubenswrapper[4669]: I1008 21:01:31.493145 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-8596cf69cc-5lmdk" event={"ID":"ae3a1ea5-4d35-4e2a-b87b-e393bf16b90c","Type":"ContainerStarted","Data":"be584f5d9f7694038c080246a7a9796e3e10b16e24768da6d12f8ee66bf09eec"} Oct 08 21:01:31 crc kubenswrapper[4669]: I1008 21:01:31.504255 4669 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-845z8"] Oct 08 21:01:31 crc kubenswrapper[4669]: I1008 21:01:31.504293 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-cbf777bfb-dfzsv" event={"ID":"26b17f76-0da4-4221-958d-24182a0a2c90","Type":"ContainerStarted","Data":"daaa681aed44456281ed048b666ced493d804c57ed743b32c31ed79aef000aef"} Oct 08 21:01:31 crc kubenswrapper[4669]: I1008 21:01:31.512945 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-845z8"] Oct 08 21:01:31 crc kubenswrapper[4669]: I1008 21:01:31.769236 4669 scope.go:117] "RemoveContainer" containerID="9e66346f86d815b3c8ee4e3aceecb89cb9f9af567e1a0167f4aa1ee4a42aa9b8" Oct 08 21:01:32 crc kubenswrapper[4669]: E1008 21:01:32.423587 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"sg-core\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="e462fc4e-635f-4e2e-88c0-43f1af0dc648" Oct 08 21:01:32 crc kubenswrapper[4669]: I1008 21:01:32.513757 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6f47799b5d-n27h7" event={"ID":"9076c9d7-726e-4b80-80af-a78887da72d1","Type":"ContainerStarted","Data":"5d45bde85bda756a4561f3b2eb9c945785c8908d95c41c497c8129b2b5f38933"} Oct 08 21:01:32 crc kubenswrapper[4669]: I1008 21:01:32.518904 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e462fc4e-635f-4e2e-88c0-43f1af0dc648","Type":"ContainerStarted","Data":"0db3b5067db586946886641110909d2028625030aa760e3a85f383ea6914ca4f"} Oct 08 21:01:32 crc kubenswrapper[4669]: I1008 21:01:32.519017 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e462fc4e-635f-4e2e-88c0-43f1af0dc648" containerName="ceilometer-central-agent" 
containerID="cri-o://9e1af400bb2d905081af982aff3b1dab1a685cee69b050f31bd2640ce602cba3" gracePeriod=30 Oct 08 21:01:32 crc kubenswrapper[4669]: I1008 21:01:32.519180 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e462fc4e-635f-4e2e-88c0-43f1af0dc648" containerName="ceilometer-notification-agent" containerID="cri-o://bd567b9cca2ae6188364d4b08396258f2960d08287b1bbe923d3d8e23056f3e8" gracePeriod=30 Oct 08 21:01:32 crc kubenswrapper[4669]: I1008 21:01:32.519180 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e462fc4e-635f-4e2e-88c0-43f1af0dc648" containerName="proxy-httpd" containerID="cri-o://0db3b5067db586946886641110909d2028625030aa760e3a85f383ea6914ca4f" gracePeriod=30 Oct 08 21:01:32 crc kubenswrapper[4669]: I1008 21:01:32.519322 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 08 21:01:32 crc kubenswrapper[4669]: I1008 21:01:32.521871 4669 generic.go:334] "Generic (PLEG): container finished" podID="a6cc8710-2436-467e-9f6f-17e820c82294" containerID="b7a1b0545405540bb1fefa0d9059a376649b356272f8525f747c8cf001e8905d" exitCode=0 Oct 08 21:01:32 crc kubenswrapper[4669]: I1008 21:01:32.521959 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-tw2fj" event={"ID":"a6cc8710-2436-467e-9f6f-17e820c82294","Type":"ContainerDied","Data":"b7a1b0545405540bb1fefa0d9059a376649b356272f8525f747c8cf001e8905d"} Oct 08 21:01:32 crc kubenswrapper[4669]: I1008 21:01:32.544863 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-cbf777bfb-dfzsv" event={"ID":"26b17f76-0da4-4221-958d-24182a0a2c90","Type":"ContainerStarted","Data":"5ed90ff07ca0cbb8cb7e0a52f587329ae662c7ff6c64eb0858afd03a09872efe"} Oct 08 21:01:32 crc kubenswrapper[4669]: I1008 21:01:32.549822 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-6f968c55b5-frgnz" event={"ID":"1e273869-5ed1-48c3-af8f-d4d2df61c9e7","Type":"ContainerStarted","Data":"bf44bdcd2238eb1feee04023a666530bbe0853d539ee359a66f1e3cb156bf97c"} Oct 08 21:01:32 crc kubenswrapper[4669]: I1008 21:01:32.550871 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5fbf49878d-2r5dt" event={"ID":"6311b31e-a85f-4bc0-9c1a-254c1650ef17","Type":"ContainerStarted","Data":"97e04152b88d07f8365e96b5de25392c95c5b98247ef85c474ae6f3ebf067a30"} Oct 08 21:01:33 crc kubenswrapper[4669]: I1008 21:01:33.345583 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f0950ec-bab3-4ed9-bd56-f4f6f13c002b" path="/var/lib/kubelet/pods/0f0950ec-bab3-4ed9-bd56-f4f6f13c002b/volumes" Oct 08 21:01:33 crc kubenswrapper[4669]: I1008 21:01:33.563013 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f968c55b5-frgnz" event={"ID":"1e273869-5ed1-48c3-af8f-d4d2df61c9e7","Type":"ContainerStarted","Data":"7ddb9e464525776465a3d8df3063b9d95c717b5b59f3c213f559fea21f01cdce"} Oct 08 21:01:33 crc kubenswrapper[4669]: I1008 21:01:33.563493 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6f968c55b5-frgnz" Oct 08 21:01:33 crc kubenswrapper[4669]: I1008 21:01:33.566765 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5fbf49878d-2r5dt" event={"ID":"6311b31e-a85f-4bc0-9c1a-254c1650ef17","Type":"ContainerStarted","Data":"0db196bdf07fceb885026cb0afdf0dbc83de8737612b586bf554276355a6165c"} Oct 08 21:01:33 crc kubenswrapper[4669]: I1008 21:01:33.566902 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5fbf49878d-2r5dt" Oct 08 21:01:33 crc kubenswrapper[4669]: I1008 21:01:33.568977 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6f47799b5d-n27h7" 
event={"ID":"9076c9d7-726e-4b80-80af-a78887da72d1","Type":"ContainerStarted","Data":"723ac7b3c6f36b4455b6c478cb236e851d262fbf74d5f9d1192cdcca80394226"} Oct 08 21:01:33 crc kubenswrapper[4669]: I1008 21:01:33.569067 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6f47799b5d-n27h7" Oct 08 21:01:33 crc kubenswrapper[4669]: I1008 21:01:33.569088 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6f47799b5d-n27h7" Oct 08 21:01:33 crc kubenswrapper[4669]: I1008 21:01:33.571155 4669 generic.go:334] "Generic (PLEG): container finished" podID="e462fc4e-635f-4e2e-88c0-43f1af0dc648" containerID="0db3b5067db586946886641110909d2028625030aa760e3a85f383ea6914ca4f" exitCode=0 Oct 08 21:01:33 crc kubenswrapper[4669]: I1008 21:01:33.571174 4669 generic.go:334] "Generic (PLEG): container finished" podID="e462fc4e-635f-4e2e-88c0-43f1af0dc648" containerID="9e1af400bb2d905081af982aff3b1dab1a685cee69b050f31bd2640ce602cba3" exitCode=0 Oct 08 21:01:33 crc kubenswrapper[4669]: I1008 21:01:33.571227 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e462fc4e-635f-4e2e-88c0-43f1af0dc648","Type":"ContainerDied","Data":"0db3b5067db586946886641110909d2028625030aa760e3a85f383ea6914ca4f"} Oct 08 21:01:33 crc kubenswrapper[4669]: I1008 21:01:33.571261 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e462fc4e-635f-4e2e-88c0-43f1af0dc648","Type":"ContainerDied","Data":"9e1af400bb2d905081af982aff3b1dab1a685cee69b050f31bd2640ce602cba3"} Oct 08 21:01:33 crc kubenswrapper[4669]: I1008 21:01:33.573316 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-tw2fj" event={"ID":"a6cc8710-2436-467e-9f6f-17e820c82294","Type":"ContainerStarted","Data":"ca71ec32857884eda0a0506814c7f0aff98dfabf9cde0620138134fb472c6436"} Oct 08 21:01:33 crc kubenswrapper[4669]: I1008 21:01:33.573372 4669 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-688c87cc99-tw2fj" Oct 08 21:01:33 crc kubenswrapper[4669]: I1008 21:01:33.576185 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-cbf777bfb-dfzsv" event={"ID":"26b17f76-0da4-4221-958d-24182a0a2c90","Type":"ContainerStarted","Data":"56ecef6ad5139c116cad9c6bdf5db2370548e0ef8f931839daccfc0062b3b6a0"} Oct 08 21:01:33 crc kubenswrapper[4669]: I1008 21:01:33.576271 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-cbf777bfb-dfzsv" Oct 08 21:01:33 crc kubenswrapper[4669]: I1008 21:01:33.576306 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-cbf777bfb-dfzsv" Oct 08 21:01:33 crc kubenswrapper[4669]: I1008 21:01:33.581858 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-6877c668b7-ws6lk" Oct 08 21:01:33 crc kubenswrapper[4669]: I1008 21:01:33.585189 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6f968c55b5-frgnz" podStartSLOduration=9.585168763 podStartE2EDuration="9.585168763s" podCreationTimestamp="2025-10-08 21:01:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:01:33.583439385 +0000 UTC m=+1013.276250058" watchObservedRunningTime="2025-10-08 21:01:33.585168763 +0000 UTC m=+1013.277979436" Oct 08 21:01:33 crc kubenswrapper[4669]: I1008 21:01:33.625625 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6f47799b5d-n27h7" podStartSLOduration=9.625604228 podStartE2EDuration="9.625604228s" podCreationTimestamp="2025-10-08 21:01:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:01:33.621775113 +0000 UTC 
m=+1013.314585786" watchObservedRunningTime="2025-10-08 21:01:33.625604228 +0000 UTC m=+1013.318414921" Oct 08 21:01:33 crc kubenswrapper[4669]: I1008 21:01:33.648472 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-688c87cc99-tw2fj" podStartSLOduration=8.648449593 podStartE2EDuration="8.648449593s" podCreationTimestamp="2025-10-08 21:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:01:33.642817349 +0000 UTC m=+1013.335628032" watchObservedRunningTime="2025-10-08 21:01:33.648449593 +0000 UTC m=+1013.341260276" Oct 08 21:01:33 crc kubenswrapper[4669]: I1008 21:01:33.676493 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5fbf49878d-2r5dt" podStartSLOduration=5.676474599 podStartE2EDuration="5.676474599s" podCreationTimestamp="2025-10-08 21:01:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:01:33.666201128 +0000 UTC m=+1013.359011801" watchObservedRunningTime="2025-10-08 21:01:33.676474599 +0000 UTC m=+1013.369285272" Oct 08 21:01:33 crc kubenswrapper[4669]: I1008 21:01:33.689261 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-cbf777bfb-dfzsv" podStartSLOduration=8.689245737 podStartE2EDuration="8.689245737s" podCreationTimestamp="2025-10-08 21:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:01:33.687311454 +0000 UTC m=+1013.380122117" watchObservedRunningTime="2025-10-08 21:01:33.689245737 +0000 UTC m=+1013.382056410" Oct 08 21:01:34 crc kubenswrapper[4669]: I1008 21:01:34.405877 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6cd548c4f4-74w5s" 
podUID="f67695c6-cc78-4e93-86e4-34b030405e0e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Oct 08 21:01:34 crc kubenswrapper[4669]: I1008 21:01:34.592154 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-j5jf6" event={"ID":"14b56cd8-5692-4a65-b8ad-1e39bf253846","Type":"ContainerStarted","Data":"eb1864d51f9a83b3a0594f07487e8a54b8d75be746e794b1b6931c856df5fbb7"} Oct 08 21:01:34 crc kubenswrapper[4669]: I1008 21:01:34.593199 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5fbf49878d-2r5dt" Oct 08 21:01:34 crc kubenswrapper[4669]: I1008 21:01:34.611459 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-j5jf6" podStartSLOduration=6.025015838 podStartE2EDuration="41.611443841s" podCreationTimestamp="2025-10-08 21:00:53 +0000 UTC" firstStartedPulling="2025-10-08 21:00:57.167601234 +0000 UTC m=+976.860411907" lastFinishedPulling="2025-10-08 21:01:32.754029227 +0000 UTC m=+1012.446839910" observedRunningTime="2025-10-08 21:01:34.611197784 +0000 UTC m=+1014.304008477" watchObservedRunningTime="2025-10-08 21:01:34.611443841 +0000 UTC m=+1014.304254514" Oct 08 21:01:35 crc kubenswrapper[4669]: I1008 21:01:35.624681 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-69d49dbff8-2tq27" event={"ID":"b1994a50-3452-488f-b364-1b1377cfd62d","Type":"ContainerStarted","Data":"019c0c17a1f3cd095ed95031eea22cebbb2d7cd3197ecba3e1ab4af5e49f82bd"} Oct 08 21:01:35 crc kubenswrapper[4669]: I1008 21:01:35.625053 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-69d49dbff8-2tq27" event={"ID":"b1994a50-3452-488f-b364-1b1377cfd62d","Type":"ContainerStarted","Data":"6b7024fa80ecfbd117d8b5fd5465c88b9f56bfd5378a0c225a82ec638d047baf"} Oct 
08 21:01:35 crc kubenswrapper[4669]: I1008 21:01:35.631138 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-8596cf69cc-5lmdk" event={"ID":"ae3a1ea5-4d35-4e2a-b87b-e393bf16b90c","Type":"ContainerStarted","Data":"fac005d0a1a8ecf8156ed8cd8088aed31fa2cfa2cfb467cd4204583241941ee4"} Oct 08 21:01:35 crc kubenswrapper[4669]: I1008 21:01:35.631194 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-8596cf69cc-5lmdk" event={"ID":"ae3a1ea5-4d35-4e2a-b87b-e393bf16b90c","Type":"ContainerStarted","Data":"a1797742166a22dc72cb7ace71d00bd1986234df092989939513477aabd8d0e1"} Oct 08 21:01:35 crc kubenswrapper[4669]: I1008 21:01:35.659302 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-8596cf69cc-5lmdk" podStartSLOduration=7.066167738 podStartE2EDuration="10.659280329s" podCreationTimestamp="2025-10-08 21:01:25 +0000 UTC" firstStartedPulling="2025-10-08 21:01:31.410962931 +0000 UTC m=+1011.103773604" lastFinishedPulling="2025-10-08 21:01:35.004075532 +0000 UTC m=+1014.696886195" observedRunningTime="2025-10-08 21:01:35.653984724 +0000 UTC m=+1015.346795407" watchObservedRunningTime="2025-10-08 21:01:35.659280329 +0000 UTC m=+1015.352091002" Oct 08 21:01:36 crc kubenswrapper[4669]: I1008 21:01:36.660171 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-69d49dbff8-2tq27" podStartSLOduration=7.746821451 podStartE2EDuration="11.660147954s" podCreationTimestamp="2025-10-08 21:01:25 +0000 UTC" firstStartedPulling="2025-10-08 21:01:31.084895179 +0000 UTC m=+1010.777705852" lastFinishedPulling="2025-10-08 21:01:34.998221682 +0000 UTC m=+1014.691032355" observedRunningTime="2025-10-08 21:01:36.65893278 +0000 UTC m=+1016.351743453" watchObservedRunningTime="2025-10-08 21:01:36.660147954 +0000 UTC m=+1016.352958627" Oct 08 21:01:37 crc kubenswrapper[4669]: I1008 21:01:37.813331 4669 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/swift-proxy-5f9445f759-bx7xs"] Oct 08 21:01:37 crc kubenswrapper[4669]: E1008 21:01:37.813701 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f0950ec-bab3-4ed9-bd56-f4f6f13c002b" containerName="dnsmasq-dns" Oct 08 21:01:37 crc kubenswrapper[4669]: I1008 21:01:37.813715 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f0950ec-bab3-4ed9-bd56-f4f6f13c002b" containerName="dnsmasq-dns" Oct 08 21:01:37 crc kubenswrapper[4669]: E1008 21:01:37.813741 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f0950ec-bab3-4ed9-bd56-f4f6f13c002b" containerName="init" Oct 08 21:01:37 crc kubenswrapper[4669]: I1008 21:01:37.813747 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f0950ec-bab3-4ed9-bd56-f4f6f13c002b" containerName="init" Oct 08 21:01:37 crc kubenswrapper[4669]: I1008 21:01:37.813941 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f0950ec-bab3-4ed9-bd56-f4f6f13c002b" containerName="dnsmasq-dns" Oct 08 21:01:37 crc kubenswrapper[4669]: I1008 21:01:37.815038 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-5f9445f759-bx7xs" Oct 08 21:01:37 crc kubenswrapper[4669]: I1008 21:01:37.816898 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 08 21:01:37 crc kubenswrapper[4669]: I1008 21:01:37.817156 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Oct 08 21:01:37 crc kubenswrapper[4669]: I1008 21:01:37.818412 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Oct 08 21:01:37 crc kubenswrapper[4669]: I1008 21:01:37.872754 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5f9445f759-bx7xs"] Oct 08 21:01:38 crc kubenswrapper[4669]: I1008 21:01:38.007026 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/895cb1ba-c212-4908-82e9-f5042a50686f-etc-swift\") pod \"swift-proxy-5f9445f759-bx7xs\" (UID: \"895cb1ba-c212-4908-82e9-f5042a50686f\") " pod="openstack/swift-proxy-5f9445f759-bx7xs" Oct 08 21:01:38 crc kubenswrapper[4669]: I1008 21:01:38.007093 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/895cb1ba-c212-4908-82e9-f5042a50686f-run-httpd\") pod \"swift-proxy-5f9445f759-bx7xs\" (UID: \"895cb1ba-c212-4908-82e9-f5042a50686f\") " pod="openstack/swift-proxy-5f9445f759-bx7xs" Oct 08 21:01:38 crc kubenswrapper[4669]: I1008 21:01:38.007207 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twpz9\" (UniqueName: \"kubernetes.io/projected/895cb1ba-c212-4908-82e9-f5042a50686f-kube-api-access-twpz9\") pod \"swift-proxy-5f9445f759-bx7xs\" (UID: \"895cb1ba-c212-4908-82e9-f5042a50686f\") " pod="openstack/swift-proxy-5f9445f759-bx7xs" Oct 08 21:01:38 crc kubenswrapper[4669]: 
I1008 21:01:38.007249 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/895cb1ba-c212-4908-82e9-f5042a50686f-combined-ca-bundle\") pod \"swift-proxy-5f9445f759-bx7xs\" (UID: \"895cb1ba-c212-4908-82e9-f5042a50686f\") " pod="openstack/swift-proxy-5f9445f759-bx7xs" Oct 08 21:01:38 crc kubenswrapper[4669]: I1008 21:01:38.007310 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/895cb1ba-c212-4908-82e9-f5042a50686f-internal-tls-certs\") pod \"swift-proxy-5f9445f759-bx7xs\" (UID: \"895cb1ba-c212-4908-82e9-f5042a50686f\") " pod="openstack/swift-proxy-5f9445f759-bx7xs" Oct 08 21:01:38 crc kubenswrapper[4669]: I1008 21:01:38.007329 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/895cb1ba-c212-4908-82e9-f5042a50686f-config-data\") pod \"swift-proxy-5f9445f759-bx7xs\" (UID: \"895cb1ba-c212-4908-82e9-f5042a50686f\") " pod="openstack/swift-proxy-5f9445f759-bx7xs" Oct 08 21:01:38 crc kubenswrapper[4669]: I1008 21:01:38.007400 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/895cb1ba-c212-4908-82e9-f5042a50686f-public-tls-certs\") pod \"swift-proxy-5f9445f759-bx7xs\" (UID: \"895cb1ba-c212-4908-82e9-f5042a50686f\") " pod="openstack/swift-proxy-5f9445f759-bx7xs" Oct 08 21:01:38 crc kubenswrapper[4669]: I1008 21:01:38.007433 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/895cb1ba-c212-4908-82e9-f5042a50686f-log-httpd\") pod \"swift-proxy-5f9445f759-bx7xs\" (UID: \"895cb1ba-c212-4908-82e9-f5042a50686f\") " pod="openstack/swift-proxy-5f9445f759-bx7xs" Oct 08 
21:01:38 crc kubenswrapper[4669]: I1008 21:01:38.109336 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/895cb1ba-c212-4908-82e9-f5042a50686f-etc-swift\") pod \"swift-proxy-5f9445f759-bx7xs\" (UID: \"895cb1ba-c212-4908-82e9-f5042a50686f\") " pod="openstack/swift-proxy-5f9445f759-bx7xs" Oct 08 21:01:38 crc kubenswrapper[4669]: I1008 21:01:38.109382 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/895cb1ba-c212-4908-82e9-f5042a50686f-run-httpd\") pod \"swift-proxy-5f9445f759-bx7xs\" (UID: \"895cb1ba-c212-4908-82e9-f5042a50686f\") " pod="openstack/swift-proxy-5f9445f759-bx7xs" Oct 08 21:01:38 crc kubenswrapper[4669]: I1008 21:01:38.109419 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twpz9\" (UniqueName: \"kubernetes.io/projected/895cb1ba-c212-4908-82e9-f5042a50686f-kube-api-access-twpz9\") pod \"swift-proxy-5f9445f759-bx7xs\" (UID: \"895cb1ba-c212-4908-82e9-f5042a50686f\") " pod="openstack/swift-proxy-5f9445f759-bx7xs" Oct 08 21:01:38 crc kubenswrapper[4669]: I1008 21:01:38.109442 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/895cb1ba-c212-4908-82e9-f5042a50686f-combined-ca-bundle\") pod \"swift-proxy-5f9445f759-bx7xs\" (UID: \"895cb1ba-c212-4908-82e9-f5042a50686f\") " pod="openstack/swift-proxy-5f9445f759-bx7xs" Oct 08 21:01:38 crc kubenswrapper[4669]: I1008 21:01:38.109479 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/895cb1ba-c212-4908-82e9-f5042a50686f-internal-tls-certs\") pod \"swift-proxy-5f9445f759-bx7xs\" (UID: \"895cb1ba-c212-4908-82e9-f5042a50686f\") " pod="openstack/swift-proxy-5f9445f759-bx7xs" Oct 08 21:01:38 crc kubenswrapper[4669]: I1008 
21:01:38.109500 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/895cb1ba-c212-4908-82e9-f5042a50686f-config-data\") pod \"swift-proxy-5f9445f759-bx7xs\" (UID: \"895cb1ba-c212-4908-82e9-f5042a50686f\") " pod="openstack/swift-proxy-5f9445f759-bx7xs" Oct 08 21:01:38 crc kubenswrapper[4669]: I1008 21:01:38.109557 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/895cb1ba-c212-4908-82e9-f5042a50686f-public-tls-certs\") pod \"swift-proxy-5f9445f759-bx7xs\" (UID: \"895cb1ba-c212-4908-82e9-f5042a50686f\") " pod="openstack/swift-proxy-5f9445f759-bx7xs" Oct 08 21:01:38 crc kubenswrapper[4669]: I1008 21:01:38.109586 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/895cb1ba-c212-4908-82e9-f5042a50686f-log-httpd\") pod \"swift-proxy-5f9445f759-bx7xs\" (UID: \"895cb1ba-c212-4908-82e9-f5042a50686f\") " pod="openstack/swift-proxy-5f9445f759-bx7xs" Oct 08 21:01:38 crc kubenswrapper[4669]: I1008 21:01:38.110263 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/895cb1ba-c212-4908-82e9-f5042a50686f-log-httpd\") pod \"swift-proxy-5f9445f759-bx7xs\" (UID: \"895cb1ba-c212-4908-82e9-f5042a50686f\") " pod="openstack/swift-proxy-5f9445f759-bx7xs" Oct 08 21:01:38 crc kubenswrapper[4669]: I1008 21:01:38.110832 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/895cb1ba-c212-4908-82e9-f5042a50686f-run-httpd\") pod \"swift-proxy-5f9445f759-bx7xs\" (UID: \"895cb1ba-c212-4908-82e9-f5042a50686f\") " pod="openstack/swift-proxy-5f9445f759-bx7xs" Oct 08 21:01:38 crc kubenswrapper[4669]: I1008 21:01:38.118129 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/895cb1ba-c212-4908-82e9-f5042a50686f-internal-tls-certs\") pod \"swift-proxy-5f9445f759-bx7xs\" (UID: \"895cb1ba-c212-4908-82e9-f5042a50686f\") " pod="openstack/swift-proxy-5f9445f759-bx7xs" Oct 08 21:01:38 crc kubenswrapper[4669]: I1008 21:01:38.118225 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/895cb1ba-c212-4908-82e9-f5042a50686f-public-tls-certs\") pod \"swift-proxy-5f9445f759-bx7xs\" (UID: \"895cb1ba-c212-4908-82e9-f5042a50686f\") " pod="openstack/swift-proxy-5f9445f759-bx7xs" Oct 08 21:01:38 crc kubenswrapper[4669]: I1008 21:01:38.118435 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/895cb1ba-c212-4908-82e9-f5042a50686f-config-data\") pod \"swift-proxy-5f9445f759-bx7xs\" (UID: \"895cb1ba-c212-4908-82e9-f5042a50686f\") " pod="openstack/swift-proxy-5f9445f759-bx7xs" Oct 08 21:01:38 crc kubenswrapper[4669]: I1008 21:01:38.119228 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/895cb1ba-c212-4908-82e9-f5042a50686f-combined-ca-bundle\") pod \"swift-proxy-5f9445f759-bx7xs\" (UID: \"895cb1ba-c212-4908-82e9-f5042a50686f\") " pod="openstack/swift-proxy-5f9445f759-bx7xs" Oct 08 21:01:38 crc kubenswrapper[4669]: I1008 21:01:38.119345 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/895cb1ba-c212-4908-82e9-f5042a50686f-etc-swift\") pod \"swift-proxy-5f9445f759-bx7xs\" (UID: \"895cb1ba-c212-4908-82e9-f5042a50686f\") " pod="openstack/swift-proxy-5f9445f759-bx7xs" Oct 08 21:01:38 crc kubenswrapper[4669]: I1008 21:01:38.139514 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twpz9\" (UniqueName: 
\"kubernetes.io/projected/895cb1ba-c212-4908-82e9-f5042a50686f-kube-api-access-twpz9\") pod \"swift-proxy-5f9445f759-bx7xs\" (UID: \"895cb1ba-c212-4908-82e9-f5042a50686f\") " pod="openstack/swift-proxy-5f9445f759-bx7xs" Oct 08 21:01:38 crc kubenswrapper[4669]: I1008 21:01:38.260368 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 08 21:01:38 crc kubenswrapper[4669]: I1008 21:01:38.261756 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 08 21:01:38 crc kubenswrapper[4669]: I1008 21:01:38.264138 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-wsvpf" Oct 08 21:01:38 crc kubenswrapper[4669]: I1008 21:01:38.264151 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 08 21:01:38 crc kubenswrapper[4669]: I1008 21:01:38.264154 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 08 21:01:38 crc kubenswrapper[4669]: I1008 21:01:38.270313 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 08 21:01:38 crc kubenswrapper[4669]: I1008 21:01:38.416378 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2aa3bf86-2604-4d46-bc73-13b5d049b01c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"2aa3bf86-2604-4d46-bc73-13b5d049b01c\") " pod="openstack/openstackclient" Oct 08 21:01:38 crc kubenswrapper[4669]: I1008 21:01:38.416487 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2aa3bf86-2604-4d46-bc73-13b5d049b01c-openstack-config\") pod \"openstackclient\" (UID: \"2aa3bf86-2604-4d46-bc73-13b5d049b01c\") " pod="openstack/openstackclient" Oct 08 
21:01:38 crc kubenswrapper[4669]: I1008 21:01:38.416695 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2aa3bf86-2604-4d46-bc73-13b5d049b01c-openstack-config-secret\") pod \"openstackclient\" (UID: \"2aa3bf86-2604-4d46-bc73-13b5d049b01c\") " pod="openstack/openstackclient" Oct 08 21:01:38 crc kubenswrapper[4669]: I1008 21:01:38.416832 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8wln\" (UniqueName: \"kubernetes.io/projected/2aa3bf86-2604-4d46-bc73-13b5d049b01c-kube-api-access-d8wln\") pod \"openstackclient\" (UID: \"2aa3bf86-2604-4d46-bc73-13b5d049b01c\") " pod="openstack/openstackclient" Oct 08 21:01:38 crc kubenswrapper[4669]: I1008 21:01:38.440280 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5f9445f759-bx7xs" Oct 08 21:01:38 crc kubenswrapper[4669]: I1008 21:01:38.518605 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2aa3bf86-2604-4d46-bc73-13b5d049b01c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"2aa3bf86-2604-4d46-bc73-13b5d049b01c\") " pod="openstack/openstackclient" Oct 08 21:01:38 crc kubenswrapper[4669]: I1008 21:01:38.518699 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2aa3bf86-2604-4d46-bc73-13b5d049b01c-openstack-config\") pod \"openstackclient\" (UID: \"2aa3bf86-2604-4d46-bc73-13b5d049b01c\") " pod="openstack/openstackclient" Oct 08 21:01:38 crc kubenswrapper[4669]: I1008 21:01:38.518789 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2aa3bf86-2604-4d46-bc73-13b5d049b01c-openstack-config-secret\") pod 
\"openstackclient\" (UID: \"2aa3bf86-2604-4d46-bc73-13b5d049b01c\") " pod="openstack/openstackclient" Oct 08 21:01:38 crc kubenswrapper[4669]: I1008 21:01:38.518837 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8wln\" (UniqueName: \"kubernetes.io/projected/2aa3bf86-2604-4d46-bc73-13b5d049b01c-kube-api-access-d8wln\") pod \"openstackclient\" (UID: \"2aa3bf86-2604-4d46-bc73-13b5d049b01c\") " pod="openstack/openstackclient" Oct 08 21:01:38 crc kubenswrapper[4669]: I1008 21:01:38.520931 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2aa3bf86-2604-4d46-bc73-13b5d049b01c-openstack-config\") pod \"openstackclient\" (UID: \"2aa3bf86-2604-4d46-bc73-13b5d049b01c\") " pod="openstack/openstackclient" Oct 08 21:01:38 crc kubenswrapper[4669]: I1008 21:01:38.527338 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2aa3bf86-2604-4d46-bc73-13b5d049b01c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"2aa3bf86-2604-4d46-bc73-13b5d049b01c\") " pod="openstack/openstackclient" Oct 08 21:01:38 crc kubenswrapper[4669]: I1008 21:01:38.543107 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8wln\" (UniqueName: \"kubernetes.io/projected/2aa3bf86-2604-4d46-bc73-13b5d049b01c-kube-api-access-d8wln\") pod \"openstackclient\" (UID: \"2aa3bf86-2604-4d46-bc73-13b5d049b01c\") " pod="openstack/openstackclient" Oct 08 21:01:38 crc kubenswrapper[4669]: I1008 21:01:38.561982 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2aa3bf86-2604-4d46-bc73-13b5d049b01c-openstack-config-secret\") pod \"openstackclient\" (UID: \"2aa3bf86-2604-4d46-bc73-13b5d049b01c\") " pod="openstack/openstackclient" Oct 08 21:01:38 crc kubenswrapper[4669]: I1008 21:01:38.582766 
4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 08 21:01:39 crc kubenswrapper[4669]: I1008 21:01:39.162283 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5f9445f759-bx7xs"] Oct 08 21:01:39 crc kubenswrapper[4669]: I1008 21:01:39.177277 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 08 21:01:39 crc kubenswrapper[4669]: I1008 21:01:39.689088 4669 generic.go:334] "Generic (PLEG): container finished" podID="14b56cd8-5692-4a65-b8ad-1e39bf253846" containerID="eb1864d51f9a83b3a0594f07487e8a54b8d75be746e794b1b6931c856df5fbb7" exitCode=0 Oct 08 21:01:39 crc kubenswrapper[4669]: I1008 21:01:39.689179 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-j5jf6" event={"ID":"14b56cd8-5692-4a65-b8ad-1e39bf253846","Type":"ContainerDied","Data":"eb1864d51f9a83b3a0594f07487e8a54b8d75be746e794b1b6931c856df5fbb7"} Oct 08 21:01:39 crc kubenswrapper[4669]: I1008 21:01:39.690723 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"2aa3bf86-2604-4d46-bc73-13b5d049b01c","Type":"ContainerStarted","Data":"fe31b63d11edec456ccacba858076ee0fd7fcc625881f7ad28e39c93baef9927"} Oct 08 21:01:39 crc kubenswrapper[4669]: I1008 21:01:39.692905 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5f9445f759-bx7xs" event={"ID":"895cb1ba-c212-4908-82e9-f5042a50686f","Type":"ContainerStarted","Data":"4c7f74025713aeb93ca3feb5921b40d83134cda5f90c376a3730d785cc43018f"} Oct 08 21:01:39 crc kubenswrapper[4669]: I1008 21:01:39.692933 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5f9445f759-bx7xs" event={"ID":"895cb1ba-c212-4908-82e9-f5042a50686f","Type":"ContainerStarted","Data":"f7eb339d38d9771f9cae91c29af40b1d4af32b2986b3c4b78a89e0a10c5373b0"} Oct 08 21:01:39 crc kubenswrapper[4669]: I1008 21:01:39.692942 4669 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5f9445f759-bx7xs" event={"ID":"895cb1ba-c212-4908-82e9-f5042a50686f","Type":"ContainerStarted","Data":"308e8a8b7906a475f1eb5ba6902a94ff76cfc4e459eebb7e7438c31478e8e112"} Oct 08 21:01:39 crc kubenswrapper[4669]: I1008 21:01:39.693031 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5f9445f759-bx7xs" Oct 08 21:01:39 crc kubenswrapper[4669]: I1008 21:01:39.693073 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5f9445f759-bx7xs" Oct 08 21:01:39 crc kubenswrapper[4669]: I1008 21:01:39.731946 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-5f9445f759-bx7xs" podStartSLOduration=2.7319277360000003 podStartE2EDuration="2.731927736s" podCreationTimestamp="2025-10-08 21:01:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:01:39.729274094 +0000 UTC m=+1019.422084777" watchObservedRunningTime="2025-10-08 21:01:39.731927736 +0000 UTC m=+1019.424738409" Oct 08 21:01:40 crc kubenswrapper[4669]: I1008 21:01:40.247746 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5fbf49878d-2r5dt" Oct 08 21:01:40 crc kubenswrapper[4669]: I1008 21:01:40.764709 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5fbf49878d-2r5dt" Oct 08 21:01:40 crc kubenswrapper[4669]: I1008 21:01:40.833926 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-cbf777bfb-dfzsv"] Oct 08 21:01:40 crc kubenswrapper[4669]: I1008 21:01:40.834189 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-cbf777bfb-dfzsv" podUID="26b17f76-0da4-4221-958d-24182a0a2c90" containerName="barbican-api-log" 
containerID="cri-o://5ed90ff07ca0cbb8cb7e0a52f587329ae662c7ff6c64eb0858afd03a09872efe" gracePeriod=30 Oct 08 21:01:40 crc kubenswrapper[4669]: I1008 21:01:40.835034 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-cbf777bfb-dfzsv" podUID="26b17f76-0da4-4221-958d-24182a0a2c90" containerName="barbican-api" containerID="cri-o://56ecef6ad5139c116cad9c6bdf5db2370548e0ef8f931839daccfc0062b3b6a0" gracePeriod=30 Oct 08 21:01:40 crc kubenswrapper[4669]: I1008 21:01:40.843155 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-cbf777bfb-dfzsv" podUID="26b17f76-0da4-4221-958d-24182a0a2c90" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": EOF" Oct 08 21:01:40 crc kubenswrapper[4669]: I1008 21:01:40.844392 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-cbf777bfb-dfzsv" podUID="26b17f76-0da4-4221-958d-24182a0a2c90" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": EOF" Oct 08 21:01:40 crc kubenswrapper[4669]: I1008 21:01:40.929701 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-688c87cc99-tw2fj" Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.010101 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-jnx9v"] Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.010325 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57c957c4ff-jnx9v" podUID="c019d0a1-4d0f-40e8-8720-f20b74d33b4b" containerName="dnsmasq-dns" containerID="cri-o://3d3dd5b2fe6065487b260b76f26d85eaa3bc45cd8ba4e63ec27cc827742c3cee" gracePeriod=10 Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.270586 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-j5jf6" Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.394925 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/14b56cd8-5692-4a65-b8ad-1e39bf253846-db-sync-config-data\") pod \"14b56cd8-5692-4a65-b8ad-1e39bf253846\" (UID: \"14b56cd8-5692-4a65-b8ad-1e39bf253846\") " Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.395037 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/14b56cd8-5692-4a65-b8ad-1e39bf253846-etc-machine-id\") pod \"14b56cd8-5692-4a65-b8ad-1e39bf253846\" (UID: \"14b56cd8-5692-4a65-b8ad-1e39bf253846\") " Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.395103 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lskl8\" (UniqueName: \"kubernetes.io/projected/14b56cd8-5692-4a65-b8ad-1e39bf253846-kube-api-access-lskl8\") pod \"14b56cd8-5692-4a65-b8ad-1e39bf253846\" (UID: \"14b56cd8-5692-4a65-b8ad-1e39bf253846\") " Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.395163 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14b56cd8-5692-4a65-b8ad-1e39bf253846-scripts\") pod \"14b56cd8-5692-4a65-b8ad-1e39bf253846\" (UID: \"14b56cd8-5692-4a65-b8ad-1e39bf253846\") " Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.395185 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14b56cd8-5692-4a65-b8ad-1e39bf253846-combined-ca-bundle\") pod \"14b56cd8-5692-4a65-b8ad-1e39bf253846\" (UID: \"14b56cd8-5692-4a65-b8ad-1e39bf253846\") " Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.395308 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/14b56cd8-5692-4a65-b8ad-1e39bf253846-config-data\") pod \"14b56cd8-5692-4a65-b8ad-1e39bf253846\" (UID: \"14b56cd8-5692-4a65-b8ad-1e39bf253846\") " Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.396673 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/14b56cd8-5692-4a65-b8ad-1e39bf253846-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "14b56cd8-5692-4a65-b8ad-1e39bf253846" (UID: "14b56cd8-5692-4a65-b8ad-1e39bf253846"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.411555 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14b56cd8-5692-4a65-b8ad-1e39bf253846-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "14b56cd8-5692-4a65-b8ad-1e39bf253846" (UID: "14b56cd8-5692-4a65-b8ad-1e39bf253846"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.424109 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14b56cd8-5692-4a65-b8ad-1e39bf253846-scripts" (OuterVolumeSpecName: "scripts") pod "14b56cd8-5692-4a65-b8ad-1e39bf253846" (UID: "14b56cd8-5692-4a65-b8ad-1e39bf253846"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.431719 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14b56cd8-5692-4a65-b8ad-1e39bf253846-kube-api-access-lskl8" (OuterVolumeSpecName: "kube-api-access-lskl8") pod "14b56cd8-5692-4a65-b8ad-1e39bf253846" (UID: "14b56cd8-5692-4a65-b8ad-1e39bf253846"). InnerVolumeSpecName "kube-api-access-lskl8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.501359 4669 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/14b56cd8-5692-4a65-b8ad-1e39bf253846-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.501405 4669 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/14b56cd8-5692-4a65-b8ad-1e39bf253846-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.501418 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lskl8\" (UniqueName: \"kubernetes.io/projected/14b56cd8-5692-4a65-b8ad-1e39bf253846-kube-api-access-lskl8\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.501428 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14b56cd8-5692-4a65-b8ad-1e39bf253846-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.504725 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14b56cd8-5692-4a65-b8ad-1e39bf253846-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "14b56cd8-5692-4a65-b8ad-1e39bf253846" (UID: "14b56cd8-5692-4a65-b8ad-1e39bf253846"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.567705 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-jnx9v" Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.585708 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14b56cd8-5692-4a65-b8ad-1e39bf253846-config-data" (OuterVolumeSpecName: "config-data") pod "14b56cd8-5692-4a65-b8ad-1e39bf253846" (UID: "14b56cd8-5692-4a65-b8ad-1e39bf253846"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.603415 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14b56cd8-5692-4a65-b8ad-1e39bf253846-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.603447 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14b56cd8-5692-4a65-b8ad-1e39bf253846-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.704117 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c019d0a1-4d0f-40e8-8720-f20b74d33b4b-dns-svc\") pod \"c019d0a1-4d0f-40e8-8720-f20b74d33b4b\" (UID: \"c019d0a1-4d0f-40e8-8720-f20b74d33b4b\") " Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.704208 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2pxt\" (UniqueName: \"kubernetes.io/projected/c019d0a1-4d0f-40e8-8720-f20b74d33b4b-kube-api-access-t2pxt\") pod \"c019d0a1-4d0f-40e8-8720-f20b74d33b4b\" (UID: \"c019d0a1-4d0f-40e8-8720-f20b74d33b4b\") " Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.704377 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/c019d0a1-4d0f-40e8-8720-f20b74d33b4b-dns-swift-storage-0\") pod \"c019d0a1-4d0f-40e8-8720-f20b74d33b4b\" (UID: \"c019d0a1-4d0f-40e8-8720-f20b74d33b4b\") " Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.704407 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c019d0a1-4d0f-40e8-8720-f20b74d33b4b-config\") pod \"c019d0a1-4d0f-40e8-8720-f20b74d33b4b\" (UID: \"c019d0a1-4d0f-40e8-8720-f20b74d33b4b\") " Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.704443 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c019d0a1-4d0f-40e8-8720-f20b74d33b4b-ovsdbserver-nb\") pod \"c019d0a1-4d0f-40e8-8720-f20b74d33b4b\" (UID: \"c019d0a1-4d0f-40e8-8720-f20b74d33b4b\") " Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.704505 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c019d0a1-4d0f-40e8-8720-f20b74d33b4b-ovsdbserver-sb\") pod \"c019d0a1-4d0f-40e8-8720-f20b74d33b4b\" (UID: \"c019d0a1-4d0f-40e8-8720-f20b74d33b4b\") " Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.711147 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c019d0a1-4d0f-40e8-8720-f20b74d33b4b-kube-api-access-t2pxt" (OuterVolumeSpecName: "kube-api-access-t2pxt") pod "c019d0a1-4d0f-40e8-8720-f20b74d33b4b" (UID: "c019d0a1-4d0f-40e8-8720-f20b74d33b4b"). InnerVolumeSpecName "kube-api-access-t2pxt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.736733 4669 generic.go:334] "Generic (PLEG): container finished" podID="26b17f76-0da4-4221-958d-24182a0a2c90" containerID="5ed90ff07ca0cbb8cb7e0a52f587329ae662c7ff6c64eb0858afd03a09872efe" exitCode=143 Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.736882 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-cbf777bfb-dfzsv" event={"ID":"26b17f76-0da4-4221-958d-24182a0a2c90","Type":"ContainerDied","Data":"5ed90ff07ca0cbb8cb7e0a52f587329ae662c7ff6c64eb0858afd03a09872efe"} Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.758350 4669 generic.go:334] "Generic (PLEG): container finished" podID="c019d0a1-4d0f-40e8-8720-f20b74d33b4b" containerID="3d3dd5b2fe6065487b260b76f26d85eaa3bc45cd8ba4e63ec27cc827742c3cee" exitCode=0 Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.758448 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-jnx9v" event={"ID":"c019d0a1-4d0f-40e8-8720-f20b74d33b4b","Type":"ContainerDied","Data":"3d3dd5b2fe6065487b260b76f26d85eaa3bc45cd8ba4e63ec27cc827742c3cee"} Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.758476 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-jnx9v" event={"ID":"c019d0a1-4d0f-40e8-8720-f20b74d33b4b","Type":"ContainerDied","Data":"fa74df3317031d2aefab15b1a27751fb618e603643c06f9c4f9237ca96c6db27"} Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.758493 4669 scope.go:117] "RemoveContainer" containerID="3d3dd5b2fe6065487b260b76f26d85eaa3bc45cd8ba4e63ec27cc827742c3cee" Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.758635 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-jnx9v" Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.765144 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-j5jf6" Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.765870 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-j5jf6" event={"ID":"14b56cd8-5692-4a65-b8ad-1e39bf253846","Type":"ContainerDied","Data":"c7d1cc9386ab3b92a4931c13ef8cae530e99ac0462fd182d311d2bf28a18131a"} Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.765921 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7d1cc9386ab3b92a4931c13ef8cae530e99ac0462fd182d311d2bf28a18131a" Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.781921 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c019d0a1-4d0f-40e8-8720-f20b74d33b4b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c019d0a1-4d0f-40e8-8720-f20b74d33b4b" (UID: "c019d0a1-4d0f-40e8-8720-f20b74d33b4b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.790177 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c019d0a1-4d0f-40e8-8720-f20b74d33b4b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c019d0a1-4d0f-40e8-8720-f20b74d33b4b" (UID: "c019d0a1-4d0f-40e8-8720-f20b74d33b4b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.796945 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c019d0a1-4d0f-40e8-8720-f20b74d33b4b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c019d0a1-4d0f-40e8-8720-f20b74d33b4b" (UID: "c019d0a1-4d0f-40e8-8720-f20b74d33b4b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.805300 4669 scope.go:117] "RemoveContainer" containerID="b47cc587e42db6dbeba24fc85dc034263b50f1225a77adfe823d6070f586d760" Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.806207 4669 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c019d0a1-4d0f-40e8-8720-f20b74d33b4b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.806225 4669 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c019d0a1-4d0f-40e8-8720-f20b74d33b4b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.806235 4669 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c019d0a1-4d0f-40e8-8720-f20b74d33b4b-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.806246 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2pxt\" (UniqueName: \"kubernetes.io/projected/c019d0a1-4d0f-40e8-8720-f20b74d33b4b-kube-api-access-t2pxt\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.812090 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c019d0a1-4d0f-40e8-8720-f20b74d33b4b-config" (OuterVolumeSpecName: "config") pod "c019d0a1-4d0f-40e8-8720-f20b74d33b4b" (UID: "c019d0a1-4d0f-40e8-8720-f20b74d33b4b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.829000 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c019d0a1-4d0f-40e8-8720-f20b74d33b4b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c019d0a1-4d0f-40e8-8720-f20b74d33b4b" (UID: "c019d0a1-4d0f-40e8-8720-f20b74d33b4b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.832140 4669 scope.go:117] "RemoveContainer" containerID="3d3dd5b2fe6065487b260b76f26d85eaa3bc45cd8ba4e63ec27cc827742c3cee" Oct 08 21:01:41 crc kubenswrapper[4669]: E1008 21:01:41.832391 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d3dd5b2fe6065487b260b76f26d85eaa3bc45cd8ba4e63ec27cc827742c3cee\": container with ID starting with 3d3dd5b2fe6065487b260b76f26d85eaa3bc45cd8ba4e63ec27cc827742c3cee not found: ID does not exist" containerID="3d3dd5b2fe6065487b260b76f26d85eaa3bc45cd8ba4e63ec27cc827742c3cee" Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.832413 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d3dd5b2fe6065487b260b76f26d85eaa3bc45cd8ba4e63ec27cc827742c3cee"} err="failed to get container status \"3d3dd5b2fe6065487b260b76f26d85eaa3bc45cd8ba4e63ec27cc827742c3cee\": rpc error: code = NotFound desc = could not find container \"3d3dd5b2fe6065487b260b76f26d85eaa3bc45cd8ba4e63ec27cc827742c3cee\": container with ID starting with 3d3dd5b2fe6065487b260b76f26d85eaa3bc45cd8ba4e63ec27cc827742c3cee not found: ID does not exist" Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.832451 4669 scope.go:117] "RemoveContainer" containerID="b47cc587e42db6dbeba24fc85dc034263b50f1225a77adfe823d6070f586d760" Oct 08 21:01:41 crc kubenswrapper[4669]: E1008 21:01:41.832831 4669 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b47cc587e42db6dbeba24fc85dc034263b50f1225a77adfe823d6070f586d760\": container with ID starting with b47cc587e42db6dbeba24fc85dc034263b50f1225a77adfe823d6070f586d760 not found: ID does not exist" containerID="b47cc587e42db6dbeba24fc85dc034263b50f1225a77adfe823d6070f586d760" Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.832868 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b47cc587e42db6dbeba24fc85dc034263b50f1225a77adfe823d6070f586d760"} err="failed to get container status \"b47cc587e42db6dbeba24fc85dc034263b50f1225a77adfe823d6070f586d760\": rpc error: code = NotFound desc = could not find container \"b47cc587e42db6dbeba24fc85dc034263b50f1225a77adfe823d6070f586d760\": container with ID starting with b47cc587e42db6dbeba24fc85dc034263b50f1225a77adfe823d6070f586d760 not found: ID does not exist" Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.907641 4669 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c019d0a1-4d0f-40e8-8720-f20b74d33b4b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:41 crc kubenswrapper[4669]: I1008 21:01:41.907676 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c019d0a1-4d0f-40e8-8720-f20b74d33b4b-config\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.004543 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 21:01:42 crc kubenswrapper[4669]: E1008 21:01:42.005217 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14b56cd8-5692-4a65-b8ad-1e39bf253846" containerName="cinder-db-sync" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.005335 4669 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="14b56cd8-5692-4a65-b8ad-1e39bf253846" containerName="cinder-db-sync" Oct 08 21:01:42 crc kubenswrapper[4669]: E1008 21:01:42.005402 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c019d0a1-4d0f-40e8-8720-f20b74d33b4b" containerName="init" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.005456 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="c019d0a1-4d0f-40e8-8720-f20b74d33b4b" containerName="init" Oct 08 21:01:42 crc kubenswrapper[4669]: E1008 21:01:42.005542 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c019d0a1-4d0f-40e8-8720-f20b74d33b4b" containerName="dnsmasq-dns" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.005605 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="c019d0a1-4d0f-40e8-8720-f20b74d33b4b" containerName="dnsmasq-dns" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.005820 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="14b56cd8-5692-4a65-b8ad-1e39bf253846" containerName="cinder-db-sync" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.005908 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="c019d0a1-4d0f-40e8-8720-f20b74d33b4b" containerName="dnsmasq-dns" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.006939 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.018020 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.018234 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.018488 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-tkkrj" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.018487 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.038592 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.073121 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-7hsvp"] Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.077422 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-7hsvp" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.078619 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-7hsvp"] Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.110500 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a987589-96bc-46ca-b779-71aae81674e7-scripts\") pod \"cinder-scheduler-0\" (UID: \"1a987589-96bc-46ca-b779-71aae81674e7\") " pod="openstack/cinder-scheduler-0" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.110566 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1a987589-96bc-46ca-b779-71aae81674e7-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1a987589-96bc-46ca-b779-71aae81674e7\") " pod="openstack/cinder-scheduler-0" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.110587 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a987589-96bc-46ca-b779-71aae81674e7-config-data\") pod \"cinder-scheduler-0\" (UID: \"1a987589-96bc-46ca-b779-71aae81674e7\") " pod="openstack/cinder-scheduler-0" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.110636 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1a987589-96bc-46ca-b779-71aae81674e7-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1a987589-96bc-46ca-b779-71aae81674e7\") " pod="openstack/cinder-scheduler-0" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.110683 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1a987589-96bc-46ca-b779-71aae81674e7-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1a987589-96bc-46ca-b779-71aae81674e7\") " pod="openstack/cinder-scheduler-0" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.110701 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjmnq\" (UniqueName: \"kubernetes.io/projected/1a987589-96bc-46ca-b779-71aae81674e7-kube-api-access-sjmnq\") pod \"cinder-scheduler-0\" (UID: \"1a987589-96bc-46ca-b779-71aae81674e7\") " pod="openstack/cinder-scheduler-0" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.183318 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-jnx9v"] Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.203216 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-jnx9v"] Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.216447 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1293b5c-61ce-47f1-a06c-d794994c81f7-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-7hsvp\" (UID: \"f1293b5c-61ce-47f1-a06c-d794994c81f7\") " pod="openstack/dnsmasq-dns-6bb4fc677f-7hsvp" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.216494 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prxhj\" (UniqueName: \"kubernetes.io/projected/f1293b5c-61ce-47f1-a06c-d794994c81f7-kube-api-access-prxhj\") pod \"dnsmasq-dns-6bb4fc677f-7hsvp\" (UID: \"f1293b5c-61ce-47f1-a06c-d794994c81f7\") " pod="openstack/dnsmasq-dns-6bb4fc677f-7hsvp" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.216520 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/1a987589-96bc-46ca-b779-71aae81674e7-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1a987589-96bc-46ca-b779-71aae81674e7\") " pod="openstack/cinder-scheduler-0" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.216593 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a987589-96bc-46ca-b779-71aae81674e7-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1a987589-96bc-46ca-b779-71aae81674e7\") " pod="openstack/cinder-scheduler-0" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.216617 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjmnq\" (UniqueName: \"kubernetes.io/projected/1a987589-96bc-46ca-b779-71aae81674e7-kube-api-access-sjmnq\") pod \"cinder-scheduler-0\" (UID: \"1a987589-96bc-46ca-b779-71aae81674e7\") " pod="openstack/cinder-scheduler-0" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.216651 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1293b5c-61ce-47f1-a06c-d794994c81f7-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-7hsvp\" (UID: \"f1293b5c-61ce-47f1-a06c-d794994c81f7\") " pod="openstack/dnsmasq-dns-6bb4fc677f-7hsvp" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.216671 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1293b5c-61ce-47f1-a06c-d794994c81f7-config\") pod \"dnsmasq-dns-6bb4fc677f-7hsvp\" (UID: \"f1293b5c-61ce-47f1-a06c-d794994c81f7\") " pod="openstack/dnsmasq-dns-6bb4fc677f-7hsvp" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.216704 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a987589-96bc-46ca-b779-71aae81674e7-scripts\") pod 
\"cinder-scheduler-0\" (UID: \"1a987589-96bc-46ca-b779-71aae81674e7\") " pod="openstack/cinder-scheduler-0" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.216731 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1a987589-96bc-46ca-b779-71aae81674e7-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1a987589-96bc-46ca-b779-71aae81674e7\") " pod="openstack/cinder-scheduler-0" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.216749 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a987589-96bc-46ca-b779-71aae81674e7-config-data\") pod \"cinder-scheduler-0\" (UID: \"1a987589-96bc-46ca-b779-71aae81674e7\") " pod="openstack/cinder-scheduler-0" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.216769 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1293b5c-61ce-47f1-a06c-d794994c81f7-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-7hsvp\" (UID: \"f1293b5c-61ce-47f1-a06c-d794994c81f7\") " pod="openstack/dnsmasq-dns-6bb4fc677f-7hsvp" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.216790 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1293b5c-61ce-47f1-a06c-d794994c81f7-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-7hsvp\" (UID: \"f1293b5c-61ce-47f1-a06c-d794994c81f7\") " pod="openstack/dnsmasq-dns-6bb4fc677f-7hsvp" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.217495 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1a987589-96bc-46ca-b779-71aae81674e7-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1a987589-96bc-46ca-b779-71aae81674e7\") " 
pod="openstack/cinder-scheduler-0" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.230078 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1a987589-96bc-46ca-b779-71aae81674e7-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1a987589-96bc-46ca-b779-71aae81674e7\") " pod="openstack/cinder-scheduler-0" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.231393 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a987589-96bc-46ca-b779-71aae81674e7-config-data\") pod \"cinder-scheduler-0\" (UID: \"1a987589-96bc-46ca-b779-71aae81674e7\") " pod="openstack/cinder-scheduler-0" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.237992 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a987589-96bc-46ca-b779-71aae81674e7-scripts\") pod \"cinder-scheduler-0\" (UID: \"1a987589-96bc-46ca-b779-71aae81674e7\") " pod="openstack/cinder-scheduler-0" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.238476 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a987589-96bc-46ca-b779-71aae81674e7-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1a987589-96bc-46ca-b779-71aae81674e7\") " pod="openstack/cinder-scheduler-0" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.244456 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjmnq\" (UniqueName: \"kubernetes.io/projected/1a987589-96bc-46ca-b779-71aae81674e7-kube-api-access-sjmnq\") pod \"cinder-scheduler-0\" (UID: \"1a987589-96bc-46ca-b779-71aae81674e7\") " pod="openstack/cinder-scheduler-0" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.260488 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 08 21:01:42 
crc kubenswrapper[4669]: I1008 21:01:42.267468 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.270790 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.281697 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.319581 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1293b5c-61ce-47f1-a06c-d794994c81f7-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-7hsvp\" (UID: \"f1293b5c-61ce-47f1-a06c-d794994c81f7\") " pod="openstack/dnsmasq-dns-6bb4fc677f-7hsvp" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.319625 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1293b5c-61ce-47f1-a06c-d794994c81f7-config\") pod \"dnsmasq-dns-6bb4fc677f-7hsvp\" (UID: \"f1293b5c-61ce-47f1-a06c-d794994c81f7\") " pod="openstack/dnsmasq-dns-6bb4fc677f-7hsvp" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.319699 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1293b5c-61ce-47f1-a06c-d794994c81f7-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-7hsvp\" (UID: \"f1293b5c-61ce-47f1-a06c-d794994c81f7\") " pod="openstack/dnsmasq-dns-6bb4fc677f-7hsvp" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.319727 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1293b5c-61ce-47f1-a06c-d794994c81f7-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-7hsvp\" (UID: \"f1293b5c-61ce-47f1-a06c-d794994c81f7\") " pod="openstack/dnsmasq-dns-6bb4fc677f-7hsvp" Oct 08 
21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.319764 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1293b5c-61ce-47f1-a06c-d794994c81f7-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-7hsvp\" (UID: \"f1293b5c-61ce-47f1-a06c-d794994c81f7\") " pod="openstack/dnsmasq-dns-6bb4fc677f-7hsvp" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.319781 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prxhj\" (UniqueName: \"kubernetes.io/projected/f1293b5c-61ce-47f1-a06c-d794994c81f7-kube-api-access-prxhj\") pod \"dnsmasq-dns-6bb4fc677f-7hsvp\" (UID: \"f1293b5c-61ce-47f1-a06c-d794994c81f7\") " pod="openstack/dnsmasq-dns-6bb4fc677f-7hsvp" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.320336 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1293b5c-61ce-47f1-a06c-d794994c81f7-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-7hsvp\" (UID: \"f1293b5c-61ce-47f1-a06c-d794994c81f7\") " pod="openstack/dnsmasq-dns-6bb4fc677f-7hsvp" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.320705 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1293b5c-61ce-47f1-a06c-d794994c81f7-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-7hsvp\" (UID: \"f1293b5c-61ce-47f1-a06c-d794994c81f7\") " pod="openstack/dnsmasq-dns-6bb4fc677f-7hsvp" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.320955 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1293b5c-61ce-47f1-a06c-d794994c81f7-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-7hsvp\" (UID: \"f1293b5c-61ce-47f1-a06c-d794994c81f7\") " pod="openstack/dnsmasq-dns-6bb4fc677f-7hsvp" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.321226 4669 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1293b5c-61ce-47f1-a06c-d794994c81f7-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-7hsvp\" (UID: \"f1293b5c-61ce-47f1-a06c-d794994c81f7\") " pod="openstack/dnsmasq-dns-6bb4fc677f-7hsvp" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.338082 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1293b5c-61ce-47f1-a06c-d794994c81f7-config\") pod \"dnsmasq-dns-6bb4fc677f-7hsvp\" (UID: \"f1293b5c-61ce-47f1-a06c-d794994c81f7\") " pod="openstack/dnsmasq-dns-6bb4fc677f-7hsvp" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.361984 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.382375 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prxhj\" (UniqueName: \"kubernetes.io/projected/f1293b5c-61ce-47f1-a06c-d794994c81f7-kube-api-access-prxhj\") pod \"dnsmasq-dns-6bb4fc677f-7hsvp\" (UID: \"f1293b5c-61ce-47f1-a06c-d794994c81f7\") " pod="openstack/dnsmasq-dns-6bb4fc677f-7hsvp" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.415002 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-7hsvp" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.446820 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ae450f5-b69a-46d3-8d0c-593da6ed3c77-config-data-custom\") pod \"cinder-api-0\" (UID: \"8ae450f5-b69a-46d3-8d0c-593da6ed3c77\") " pod="openstack/cinder-api-0" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.446936 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ae450f5-b69a-46d3-8d0c-593da6ed3c77-logs\") pod \"cinder-api-0\" (UID: \"8ae450f5-b69a-46d3-8d0c-593da6ed3c77\") " pod="openstack/cinder-api-0" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.447054 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ae450f5-b69a-46d3-8d0c-593da6ed3c77-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8ae450f5-b69a-46d3-8d0c-593da6ed3c77\") " pod="openstack/cinder-api-0" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.447079 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8ae450f5-b69a-46d3-8d0c-593da6ed3c77-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8ae450f5-b69a-46d3-8d0c-593da6ed3c77\") " pod="openstack/cinder-api-0" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.447182 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pf9v\" (UniqueName: \"kubernetes.io/projected/8ae450f5-b69a-46d3-8d0c-593da6ed3c77-kube-api-access-8pf9v\") pod \"cinder-api-0\" (UID: \"8ae450f5-b69a-46d3-8d0c-593da6ed3c77\") " pod="openstack/cinder-api-0" Oct 08 21:01:42 crc kubenswrapper[4669]: 
I1008 21:01:42.447239 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ae450f5-b69a-46d3-8d0c-593da6ed3c77-config-data\") pod \"cinder-api-0\" (UID: \"8ae450f5-b69a-46d3-8d0c-593da6ed3c77\") " pod="openstack/cinder-api-0" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.447317 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ae450f5-b69a-46d3-8d0c-593da6ed3c77-scripts\") pod \"cinder-api-0\" (UID: \"8ae450f5-b69a-46d3-8d0c-593da6ed3c77\") " pod="openstack/cinder-api-0" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.549949 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ae450f5-b69a-46d3-8d0c-593da6ed3c77-scripts\") pod \"cinder-api-0\" (UID: \"8ae450f5-b69a-46d3-8d0c-593da6ed3c77\") " pod="openstack/cinder-api-0" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.550225 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ae450f5-b69a-46d3-8d0c-593da6ed3c77-config-data-custom\") pod \"cinder-api-0\" (UID: \"8ae450f5-b69a-46d3-8d0c-593da6ed3c77\") " pod="openstack/cinder-api-0" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.550256 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ae450f5-b69a-46d3-8d0c-593da6ed3c77-logs\") pod \"cinder-api-0\" (UID: \"8ae450f5-b69a-46d3-8d0c-593da6ed3c77\") " pod="openstack/cinder-api-0" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.550302 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ae450f5-b69a-46d3-8d0c-593da6ed3c77-combined-ca-bundle\") pod 
\"cinder-api-0\" (UID: \"8ae450f5-b69a-46d3-8d0c-593da6ed3c77\") " pod="openstack/cinder-api-0" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.550319 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8ae450f5-b69a-46d3-8d0c-593da6ed3c77-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8ae450f5-b69a-46d3-8d0c-593da6ed3c77\") " pod="openstack/cinder-api-0" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.550364 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pf9v\" (UniqueName: \"kubernetes.io/projected/8ae450f5-b69a-46d3-8d0c-593da6ed3c77-kube-api-access-8pf9v\") pod \"cinder-api-0\" (UID: \"8ae450f5-b69a-46d3-8d0c-593da6ed3c77\") " pod="openstack/cinder-api-0" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.550392 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ae450f5-b69a-46d3-8d0c-593da6ed3c77-config-data\") pod \"cinder-api-0\" (UID: \"8ae450f5-b69a-46d3-8d0c-593da6ed3c77\") " pod="openstack/cinder-api-0" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.550886 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ae450f5-b69a-46d3-8d0c-593da6ed3c77-logs\") pod \"cinder-api-0\" (UID: \"8ae450f5-b69a-46d3-8d0c-593da6ed3c77\") " pod="openstack/cinder-api-0" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.551164 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8ae450f5-b69a-46d3-8d0c-593da6ed3c77-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8ae450f5-b69a-46d3-8d0c-593da6ed3c77\") " pod="openstack/cinder-api-0" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.556735 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/8ae450f5-b69a-46d3-8d0c-593da6ed3c77-config-data\") pod \"cinder-api-0\" (UID: \"8ae450f5-b69a-46d3-8d0c-593da6ed3c77\") " pod="openstack/cinder-api-0" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.566089 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ae450f5-b69a-46d3-8d0c-593da6ed3c77-scripts\") pod \"cinder-api-0\" (UID: \"8ae450f5-b69a-46d3-8d0c-593da6ed3c77\") " pod="openstack/cinder-api-0" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.571146 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ae450f5-b69a-46d3-8d0c-593da6ed3c77-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8ae450f5-b69a-46d3-8d0c-593da6ed3c77\") " pod="openstack/cinder-api-0" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.580179 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ae450f5-b69a-46d3-8d0c-593da6ed3c77-config-data-custom\") pod \"cinder-api-0\" (UID: \"8ae450f5-b69a-46d3-8d0c-593da6ed3c77\") " pod="openstack/cinder-api-0" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.587244 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pf9v\" (UniqueName: \"kubernetes.io/projected/8ae450f5-b69a-46d3-8d0c-593da6ed3c77-kube-api-access-8pf9v\") pod \"cinder-api-0\" (UID: \"8ae450f5-b69a-46d3-8d0c-593da6ed3c77\") " pod="openstack/cinder-api-0" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.702950 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 08 21:01:42 crc kubenswrapper[4669]: I1008 21:01:42.811149 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-7hsvp"] Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.094542 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.195934 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.353123 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c019d0a1-4d0f-40e8-8720-f20b74d33b4b" path="/var/lib/kubelet/pods/c019d0a1-4d0f-40e8-8720-f20b74d33b4b/volumes" Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.384888 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.475144 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e462fc4e-635f-4e2e-88c0-43f1af0dc648-sg-core-conf-yaml\") pod \"e462fc4e-635f-4e2e-88c0-43f1af0dc648\" (UID: \"e462fc4e-635f-4e2e-88c0-43f1af0dc648\") " Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.475227 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e462fc4e-635f-4e2e-88c0-43f1af0dc648-log-httpd\") pod \"e462fc4e-635f-4e2e-88c0-43f1af0dc648\" (UID: \"e462fc4e-635f-4e2e-88c0-43f1af0dc648\") " Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.475268 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e462fc4e-635f-4e2e-88c0-43f1af0dc648-combined-ca-bundle\") pod \"e462fc4e-635f-4e2e-88c0-43f1af0dc648\" (UID: 
\"e462fc4e-635f-4e2e-88c0-43f1af0dc648\") " Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.475313 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e462fc4e-635f-4e2e-88c0-43f1af0dc648-scripts\") pod \"e462fc4e-635f-4e2e-88c0-43f1af0dc648\" (UID: \"e462fc4e-635f-4e2e-88c0-43f1af0dc648\") " Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.475345 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e462fc4e-635f-4e2e-88c0-43f1af0dc648-run-httpd\") pod \"e462fc4e-635f-4e2e-88c0-43f1af0dc648\" (UID: \"e462fc4e-635f-4e2e-88c0-43f1af0dc648\") " Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.475369 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4gdb\" (UniqueName: \"kubernetes.io/projected/e462fc4e-635f-4e2e-88c0-43f1af0dc648-kube-api-access-m4gdb\") pod \"e462fc4e-635f-4e2e-88c0-43f1af0dc648\" (UID: \"e462fc4e-635f-4e2e-88c0-43f1af0dc648\") " Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.475395 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e462fc4e-635f-4e2e-88c0-43f1af0dc648-config-data\") pod \"e462fc4e-635f-4e2e-88c0-43f1af0dc648\" (UID: \"e462fc4e-635f-4e2e-88c0-43f1af0dc648\") " Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.477207 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e462fc4e-635f-4e2e-88c0-43f1af0dc648-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e462fc4e-635f-4e2e-88c0-43f1af0dc648" (UID: "e462fc4e-635f-4e2e-88c0-43f1af0dc648"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.484945 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e462fc4e-635f-4e2e-88c0-43f1af0dc648-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e462fc4e-635f-4e2e-88c0-43f1af0dc648" (UID: "e462fc4e-635f-4e2e-88c0-43f1af0dc648"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.485767 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e462fc4e-635f-4e2e-88c0-43f1af0dc648-kube-api-access-m4gdb" (OuterVolumeSpecName: "kube-api-access-m4gdb") pod "e462fc4e-635f-4e2e-88c0-43f1af0dc648" (UID: "e462fc4e-635f-4e2e-88c0-43f1af0dc648"). InnerVolumeSpecName "kube-api-access-m4gdb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.489683 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e462fc4e-635f-4e2e-88c0-43f1af0dc648-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e462fc4e-635f-4e2e-88c0-43f1af0dc648" (UID: "e462fc4e-635f-4e2e-88c0-43f1af0dc648"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.500369 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e462fc4e-635f-4e2e-88c0-43f1af0dc648-scripts" (OuterVolumeSpecName: "scripts") pod "e462fc4e-635f-4e2e-88c0-43f1af0dc648" (UID: "e462fc4e-635f-4e2e-88c0-43f1af0dc648"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.572571 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e462fc4e-635f-4e2e-88c0-43f1af0dc648-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e462fc4e-635f-4e2e-88c0-43f1af0dc648" (UID: "e462fc4e-635f-4e2e-88c0-43f1af0dc648"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.578956 4669 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e462fc4e-635f-4e2e-88c0-43f1af0dc648-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.578982 4669 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e462fc4e-635f-4e2e-88c0-43f1af0dc648-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.578993 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e462fc4e-635f-4e2e-88c0-43f1af0dc648-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.579003 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e462fc4e-635f-4e2e-88c0-43f1af0dc648-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.579012 4669 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e462fc4e-635f-4e2e-88c0-43f1af0dc648-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.579020 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4gdb\" (UniqueName: 
\"kubernetes.io/projected/e462fc4e-635f-4e2e-88c0-43f1af0dc648-kube-api-access-m4gdb\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.662843 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e462fc4e-635f-4e2e-88c0-43f1af0dc648-config-data" (OuterVolumeSpecName: "config-data") pod "e462fc4e-635f-4e2e-88c0-43f1af0dc648" (UID: "e462fc4e-635f-4e2e-88c0-43f1af0dc648"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.681702 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e462fc4e-635f-4e2e-88c0-43f1af0dc648-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.783604 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1a987589-96bc-46ca-b779-71aae81674e7","Type":"ContainerStarted","Data":"3c1e3b0d3ba0ae4acf437b074a0e1d08a87d7cac94e94953e805ae8328aae5f0"} Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.787114 4669 generic.go:334] "Generic (PLEG): container finished" podID="e462fc4e-635f-4e2e-88c0-43f1af0dc648" containerID="bd567b9cca2ae6188364d4b08396258f2960d08287b1bbe923d3d8e23056f3e8" exitCode=0 Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.787169 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e462fc4e-635f-4e2e-88c0-43f1af0dc648","Type":"ContainerDied","Data":"bd567b9cca2ae6188364d4b08396258f2960d08287b1bbe923d3d8e23056f3e8"} Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.787214 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.787229 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e462fc4e-635f-4e2e-88c0-43f1af0dc648","Type":"ContainerDied","Data":"b44142857e977940082c8433b0ad224c252fbccc15b141b45cf6602e63297811"} Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.787246 4669 scope.go:117] "RemoveContainer" containerID="0db3b5067db586946886641110909d2028625030aa760e3a85f383ea6914ca4f" Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.790251 4669 generic.go:334] "Generic (PLEG): container finished" podID="f1293b5c-61ce-47f1-a06c-d794994c81f7" containerID="0b46ad63619da8550faed69c810af7109139e2aa004b65527888e2d323cdfa71" exitCode=0 Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.790274 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-7hsvp" event={"ID":"f1293b5c-61ce-47f1-a06c-d794994c81f7","Type":"ContainerDied","Data":"0b46ad63619da8550faed69c810af7109139e2aa004b65527888e2d323cdfa71"} Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.790304 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-7hsvp" event={"ID":"f1293b5c-61ce-47f1-a06c-d794994c81f7","Type":"ContainerStarted","Data":"de8a8907bb615f51390d53286a104a9a25d66bce0f1928c94ba3efffd7eab244"} Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.792449 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8ae450f5-b69a-46d3-8d0c-593da6ed3c77","Type":"ContainerStarted","Data":"46dcaa708a767b4171afe7fbab273a94e703835eae8892c352533739d8cfcbbf"} Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.875330 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.882796 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/ceilometer-0"] Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.890468 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 08 21:01:43 crc kubenswrapper[4669]: E1008 21:01:43.890905 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e462fc4e-635f-4e2e-88c0-43f1af0dc648" containerName="ceilometer-notification-agent" Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.890922 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="e462fc4e-635f-4e2e-88c0-43f1af0dc648" containerName="ceilometer-notification-agent" Oct 08 21:01:43 crc kubenswrapper[4669]: E1008 21:01:43.890932 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e462fc4e-635f-4e2e-88c0-43f1af0dc648" containerName="ceilometer-central-agent" Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.890938 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="e462fc4e-635f-4e2e-88c0-43f1af0dc648" containerName="ceilometer-central-agent" Oct 08 21:01:43 crc kubenswrapper[4669]: E1008 21:01:43.890975 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e462fc4e-635f-4e2e-88c0-43f1af0dc648" containerName="proxy-httpd" Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.890981 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="e462fc4e-635f-4e2e-88c0-43f1af0dc648" containerName="proxy-httpd" Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.891209 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="e462fc4e-635f-4e2e-88c0-43f1af0dc648" containerName="proxy-httpd" Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.891223 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="e462fc4e-635f-4e2e-88c0-43f1af0dc648" containerName="ceilometer-central-agent" Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.891240 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="e462fc4e-635f-4e2e-88c0-43f1af0dc648" 
containerName="ceilometer-notification-agent" Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.892891 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.895501 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.896255 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.898957 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.981413 4669 scope.go:117] "RemoveContainer" containerID="bd567b9cca2ae6188364d4b08396258f2960d08287b1bbe923d3d8e23056f3e8" Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.985969 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/147336fb-662e-4c52-b4ea-d15280858c0b-config-data\") pod \"ceilometer-0\" (UID: \"147336fb-662e-4c52-b4ea-d15280858c0b\") " pod="openstack/ceilometer-0" Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.986112 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/147336fb-662e-4c52-b4ea-d15280858c0b-scripts\") pod \"ceilometer-0\" (UID: \"147336fb-662e-4c52-b4ea-d15280858c0b\") " pod="openstack/ceilometer-0" Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.986156 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/147336fb-662e-4c52-b4ea-d15280858c0b-run-httpd\") pod \"ceilometer-0\" (UID: \"147336fb-662e-4c52-b4ea-d15280858c0b\") " pod="openstack/ceilometer-0" Oct 08 21:01:43 crc 
kubenswrapper[4669]: I1008 21:01:43.986180 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/147336fb-662e-4c52-b4ea-d15280858c0b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"147336fb-662e-4c52-b4ea-d15280858c0b\") " pod="openstack/ceilometer-0" Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.986213 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/147336fb-662e-4c52-b4ea-d15280858c0b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"147336fb-662e-4c52-b4ea-d15280858c0b\") " pod="openstack/ceilometer-0" Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.986238 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfc64\" (UniqueName: \"kubernetes.io/projected/147336fb-662e-4c52-b4ea-d15280858c0b-kube-api-access-jfc64\") pod \"ceilometer-0\" (UID: \"147336fb-662e-4c52-b4ea-d15280858c0b\") " pod="openstack/ceilometer-0" Oct 08 21:01:43 crc kubenswrapper[4669]: I1008 21:01:43.986267 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/147336fb-662e-4c52-b4ea-d15280858c0b-log-httpd\") pod \"ceilometer-0\" (UID: \"147336fb-662e-4c52-b4ea-d15280858c0b\") " pod="openstack/ceilometer-0" Oct 08 21:01:44 crc kubenswrapper[4669]: I1008 21:01:44.056045 4669 scope.go:117] "RemoveContainer" containerID="9e1af400bb2d905081af982aff3b1dab1a685cee69b050f31bd2640ce602cba3" Oct 08 21:01:44 crc kubenswrapper[4669]: I1008 21:01:44.087698 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/147336fb-662e-4c52-b4ea-d15280858c0b-run-httpd\") pod \"ceilometer-0\" (UID: \"147336fb-662e-4c52-b4ea-d15280858c0b\") " 
pod="openstack/ceilometer-0" Oct 08 21:01:44 crc kubenswrapper[4669]: I1008 21:01:44.087736 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/147336fb-662e-4c52-b4ea-d15280858c0b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"147336fb-662e-4c52-b4ea-d15280858c0b\") " pod="openstack/ceilometer-0" Oct 08 21:01:44 crc kubenswrapper[4669]: I1008 21:01:44.087765 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/147336fb-662e-4c52-b4ea-d15280858c0b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"147336fb-662e-4c52-b4ea-d15280858c0b\") " pod="openstack/ceilometer-0" Oct 08 21:01:44 crc kubenswrapper[4669]: I1008 21:01:44.087786 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfc64\" (UniqueName: \"kubernetes.io/projected/147336fb-662e-4c52-b4ea-d15280858c0b-kube-api-access-jfc64\") pod \"ceilometer-0\" (UID: \"147336fb-662e-4c52-b4ea-d15280858c0b\") " pod="openstack/ceilometer-0" Oct 08 21:01:44 crc kubenswrapper[4669]: I1008 21:01:44.087810 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/147336fb-662e-4c52-b4ea-d15280858c0b-log-httpd\") pod \"ceilometer-0\" (UID: \"147336fb-662e-4c52-b4ea-d15280858c0b\") " pod="openstack/ceilometer-0" Oct 08 21:01:44 crc kubenswrapper[4669]: I1008 21:01:44.087854 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/147336fb-662e-4c52-b4ea-d15280858c0b-config-data\") pod \"ceilometer-0\" (UID: \"147336fb-662e-4c52-b4ea-d15280858c0b\") " pod="openstack/ceilometer-0" Oct 08 21:01:44 crc kubenswrapper[4669]: I1008 21:01:44.087936 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/147336fb-662e-4c52-b4ea-d15280858c0b-scripts\") pod \"ceilometer-0\" (UID: \"147336fb-662e-4c52-b4ea-d15280858c0b\") " pod="openstack/ceilometer-0" Oct 08 21:01:44 crc kubenswrapper[4669]: I1008 21:01:44.088432 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/147336fb-662e-4c52-b4ea-d15280858c0b-log-httpd\") pod \"ceilometer-0\" (UID: \"147336fb-662e-4c52-b4ea-d15280858c0b\") " pod="openstack/ceilometer-0" Oct 08 21:01:44 crc kubenswrapper[4669]: I1008 21:01:44.088664 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/147336fb-662e-4c52-b4ea-d15280858c0b-run-httpd\") pod \"ceilometer-0\" (UID: \"147336fb-662e-4c52-b4ea-d15280858c0b\") " pod="openstack/ceilometer-0" Oct 08 21:01:44 crc kubenswrapper[4669]: I1008 21:01:44.092738 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/147336fb-662e-4c52-b4ea-d15280858c0b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"147336fb-662e-4c52-b4ea-d15280858c0b\") " pod="openstack/ceilometer-0" Oct 08 21:01:44 crc kubenswrapper[4669]: I1008 21:01:44.093550 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/147336fb-662e-4c52-b4ea-d15280858c0b-config-data\") pod \"ceilometer-0\" (UID: \"147336fb-662e-4c52-b4ea-d15280858c0b\") " pod="openstack/ceilometer-0" Oct 08 21:01:44 crc kubenswrapper[4669]: I1008 21:01:44.099062 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/147336fb-662e-4c52-b4ea-d15280858c0b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"147336fb-662e-4c52-b4ea-d15280858c0b\") " pod="openstack/ceilometer-0" Oct 08 21:01:44 crc kubenswrapper[4669]: I1008 21:01:44.099663 4669 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/147336fb-662e-4c52-b4ea-d15280858c0b-scripts\") pod \"ceilometer-0\" (UID: \"147336fb-662e-4c52-b4ea-d15280858c0b\") " pod="openstack/ceilometer-0" Oct 08 21:01:44 crc kubenswrapper[4669]: I1008 21:01:44.101682 4669 scope.go:117] "RemoveContainer" containerID="0db3b5067db586946886641110909d2028625030aa760e3a85f383ea6914ca4f" Oct 08 21:01:44 crc kubenswrapper[4669]: E1008 21:01:44.106755 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0db3b5067db586946886641110909d2028625030aa760e3a85f383ea6914ca4f\": container with ID starting with 0db3b5067db586946886641110909d2028625030aa760e3a85f383ea6914ca4f not found: ID does not exist" containerID="0db3b5067db586946886641110909d2028625030aa760e3a85f383ea6914ca4f" Oct 08 21:01:44 crc kubenswrapper[4669]: I1008 21:01:44.106787 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0db3b5067db586946886641110909d2028625030aa760e3a85f383ea6914ca4f"} err="failed to get container status \"0db3b5067db586946886641110909d2028625030aa760e3a85f383ea6914ca4f\": rpc error: code = NotFound desc = could not find container \"0db3b5067db586946886641110909d2028625030aa760e3a85f383ea6914ca4f\": container with ID starting with 0db3b5067db586946886641110909d2028625030aa760e3a85f383ea6914ca4f not found: ID does not exist" Oct 08 21:01:44 crc kubenswrapper[4669]: I1008 21:01:44.106831 4669 scope.go:117] "RemoveContainer" containerID="bd567b9cca2ae6188364d4b08396258f2960d08287b1bbe923d3d8e23056f3e8" Oct 08 21:01:44 crc kubenswrapper[4669]: I1008 21:01:44.107160 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfc64\" (UniqueName: \"kubernetes.io/projected/147336fb-662e-4c52-b4ea-d15280858c0b-kube-api-access-jfc64\") pod \"ceilometer-0\" (UID: \"147336fb-662e-4c52-b4ea-d15280858c0b\") " 
pod="openstack/ceilometer-0" Oct 08 21:01:44 crc kubenswrapper[4669]: E1008 21:01:44.107760 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd567b9cca2ae6188364d4b08396258f2960d08287b1bbe923d3d8e23056f3e8\": container with ID starting with bd567b9cca2ae6188364d4b08396258f2960d08287b1bbe923d3d8e23056f3e8 not found: ID does not exist" containerID="bd567b9cca2ae6188364d4b08396258f2960d08287b1bbe923d3d8e23056f3e8" Oct 08 21:01:44 crc kubenswrapper[4669]: I1008 21:01:44.107820 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd567b9cca2ae6188364d4b08396258f2960d08287b1bbe923d3d8e23056f3e8"} err="failed to get container status \"bd567b9cca2ae6188364d4b08396258f2960d08287b1bbe923d3d8e23056f3e8\": rpc error: code = NotFound desc = could not find container \"bd567b9cca2ae6188364d4b08396258f2960d08287b1bbe923d3d8e23056f3e8\": container with ID starting with bd567b9cca2ae6188364d4b08396258f2960d08287b1bbe923d3d8e23056f3e8 not found: ID does not exist" Oct 08 21:01:44 crc kubenswrapper[4669]: I1008 21:01:44.107841 4669 scope.go:117] "RemoveContainer" containerID="9e1af400bb2d905081af982aff3b1dab1a685cee69b050f31bd2640ce602cba3" Oct 08 21:01:44 crc kubenswrapper[4669]: E1008 21:01:44.110821 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e1af400bb2d905081af982aff3b1dab1a685cee69b050f31bd2640ce602cba3\": container with ID starting with 9e1af400bb2d905081af982aff3b1dab1a685cee69b050f31bd2640ce602cba3 not found: ID does not exist" containerID="9e1af400bb2d905081af982aff3b1dab1a685cee69b050f31bd2640ce602cba3" Oct 08 21:01:44 crc kubenswrapper[4669]: I1008 21:01:44.110857 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e1af400bb2d905081af982aff3b1dab1a685cee69b050f31bd2640ce602cba3"} err="failed to get container status 
\"9e1af400bb2d905081af982aff3b1dab1a685cee69b050f31bd2640ce602cba3\": rpc error: code = NotFound desc = could not find container \"9e1af400bb2d905081af982aff3b1dab1a685cee69b050f31bd2640ce602cba3\": container with ID starting with 9e1af400bb2d905081af982aff3b1dab1a685cee69b050f31bd2640ce602cba3 not found: ID does not exist" Oct 08 21:01:44 crc kubenswrapper[4669]: I1008 21:01:44.222144 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 21:01:44 crc kubenswrapper[4669]: I1008 21:01:44.409149 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6cd548c4f4-74w5s" podUID="f67695c6-cc78-4e93-86e4-34b030405e0e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Oct 08 21:01:44 crc kubenswrapper[4669]: I1008 21:01:44.428013 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 08 21:01:44 crc kubenswrapper[4669]: I1008 21:01:44.818361 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-7hsvp" event={"ID":"f1293b5c-61ce-47f1-a06c-d794994c81f7","Type":"ContainerStarted","Data":"5e460d76360b08e655955b0b3b171efbb586d7b0976d20a1d19f7d1da176bd68"} Oct 08 21:01:44 crc kubenswrapper[4669]: I1008 21:01:44.818774 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bb4fc677f-7hsvp" Oct 08 21:01:44 crc kubenswrapper[4669]: I1008 21:01:44.819477 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 21:01:44 crc kubenswrapper[4669]: I1008 21:01:44.823169 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8ae450f5-b69a-46d3-8d0c-593da6ed3c77","Type":"ContainerStarted","Data":"19b2bda7ff5c422f81c26f26153864b4f299291a3252798be2294bffca67805e"} Oct 08 21:01:44 crc 
kubenswrapper[4669]: I1008 21:01:44.850175 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bb4fc677f-7hsvp" podStartSLOduration=2.850148899 podStartE2EDuration="2.850148899s" podCreationTimestamp="2025-10-08 21:01:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:01:44.835621412 +0000 UTC m=+1024.528432085" watchObservedRunningTime="2025-10-08 21:01:44.850148899 +0000 UTC m=+1024.542959582" Oct 08 21:01:45 crc kubenswrapper[4669]: I1008 21:01:45.343163 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e462fc4e-635f-4e2e-88c0-43f1af0dc648" path="/var/lib/kubelet/pods/e462fc4e-635f-4e2e-88c0-43f1af0dc648/volumes" Oct 08 21:01:45 crc kubenswrapper[4669]: I1008 21:01:45.839520 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1a987589-96bc-46ca-b779-71aae81674e7","Type":"ContainerStarted","Data":"b0e5a2e34c0952e001f079762655ef6295836daff51c6aa3c02b0f9ed639495e"} Oct 08 21:01:45 crc kubenswrapper[4669]: I1008 21:01:45.839835 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1a987589-96bc-46ca-b779-71aae81674e7","Type":"ContainerStarted","Data":"1972c9efd499629622160bfbf61499e8508c5346f15097ba6858e957fbe2b8ba"} Oct 08 21:01:45 crc kubenswrapper[4669]: I1008 21:01:45.846247 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"147336fb-662e-4c52-b4ea-d15280858c0b","Type":"ContainerStarted","Data":"f949451c28ef1f3d0387b0eaa555611b9cbacb81b7af64780eaeb87d7e7b0024"} Oct 08 21:01:45 crc kubenswrapper[4669]: I1008 21:01:45.846287 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"147336fb-662e-4c52-b4ea-d15280858c0b","Type":"ContainerStarted","Data":"25ebd7f10f1da5c27cb71706005530fa087bb42ca80e754ae4fc0eec9e7c64ea"} Oct 08 21:01:45 crc kubenswrapper[4669]: I1008 21:01:45.849733 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8ae450f5-b69a-46d3-8d0c-593da6ed3c77","Type":"ContainerStarted","Data":"9fbc3ec45bd7cc2a63f775ae4c586d0cc5d32706bd2c82a8ab2744c3937d7a6a"} Oct 08 21:01:45 crc kubenswrapper[4669]: I1008 21:01:45.849820 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="8ae450f5-b69a-46d3-8d0c-593da6ed3c77" containerName="cinder-api-log" containerID="cri-o://19b2bda7ff5c422f81c26f26153864b4f299291a3252798be2294bffca67805e" gracePeriod=30 Oct 08 21:01:45 crc kubenswrapper[4669]: I1008 21:01:45.849836 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 08 21:01:45 crc kubenswrapper[4669]: I1008 21:01:45.849862 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="8ae450f5-b69a-46d3-8d0c-593da6ed3c77" containerName="cinder-api" containerID="cri-o://9fbc3ec45bd7cc2a63f775ae4c586d0cc5d32706bd2c82a8ab2744c3937d7a6a" gracePeriod=30 Oct 08 21:01:45 crc kubenswrapper[4669]: I1008 21:01:45.862009 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.924633065 podStartE2EDuration="4.861989383s" podCreationTimestamp="2025-10-08 21:01:41 +0000 UTC" firstStartedPulling="2025-10-08 21:01:43.119804709 +0000 UTC m=+1022.812615382" lastFinishedPulling="2025-10-08 21:01:44.057161027 +0000 UTC m=+1023.749971700" observedRunningTime="2025-10-08 21:01:45.861904231 +0000 UTC m=+1025.554714904" watchObservedRunningTime="2025-10-08 21:01:45.861989383 +0000 UTC m=+1025.554800056" Oct 08 21:01:45 crc kubenswrapper[4669]: I1008 21:01:45.885062 4669 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.885021673 podStartE2EDuration="3.885021673s" podCreationTimestamp="2025-10-08 21:01:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:01:45.880979262 +0000 UTC m=+1025.573789935" watchObservedRunningTime="2025-10-08 21:01:45.885021673 +0000 UTC m=+1025.577832336" Oct 08 21:01:46 crc kubenswrapper[4669]: I1008 21:01:46.328796 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-cbf777bfb-dfzsv" podUID="26b17f76-0da4-4221-958d-24182a0a2c90" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 08 21:01:46 crc kubenswrapper[4669]: I1008 21:01:46.328738 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-cbf777bfb-dfzsv" podUID="26b17f76-0da4-4221-958d-24182a0a2c90" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 08 21:01:46 crc kubenswrapper[4669]: I1008 21:01:46.496636 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 08 21:01:46 crc kubenswrapper[4669]: I1008 21:01:46.649797 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ae450f5-b69a-46d3-8d0c-593da6ed3c77-logs\") pod \"8ae450f5-b69a-46d3-8d0c-593da6ed3c77\" (UID: \"8ae450f5-b69a-46d3-8d0c-593da6ed3c77\") " Oct 08 21:01:46 crc kubenswrapper[4669]: I1008 21:01:46.649881 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ae450f5-b69a-46d3-8d0c-593da6ed3c77-config-data-custom\") pod \"8ae450f5-b69a-46d3-8d0c-593da6ed3c77\" (UID: \"8ae450f5-b69a-46d3-8d0c-593da6ed3c77\") " Oct 08 21:01:46 crc kubenswrapper[4669]: I1008 21:01:46.649911 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ae450f5-b69a-46d3-8d0c-593da6ed3c77-config-data\") pod \"8ae450f5-b69a-46d3-8d0c-593da6ed3c77\" (UID: \"8ae450f5-b69a-46d3-8d0c-593da6ed3c77\") " Oct 08 21:01:46 crc kubenswrapper[4669]: I1008 21:01:46.650011 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pf9v\" (UniqueName: \"kubernetes.io/projected/8ae450f5-b69a-46d3-8d0c-593da6ed3c77-kube-api-access-8pf9v\") pod \"8ae450f5-b69a-46d3-8d0c-593da6ed3c77\" (UID: \"8ae450f5-b69a-46d3-8d0c-593da6ed3c77\") " Oct 08 21:01:46 crc kubenswrapper[4669]: I1008 21:01:46.650084 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ae450f5-b69a-46d3-8d0c-593da6ed3c77-scripts\") pod \"8ae450f5-b69a-46d3-8d0c-593da6ed3c77\" (UID: \"8ae450f5-b69a-46d3-8d0c-593da6ed3c77\") " Oct 08 21:01:46 crc kubenswrapper[4669]: I1008 21:01:46.650139 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8ae450f5-b69a-46d3-8d0c-593da6ed3c77-combined-ca-bundle\") pod \"8ae450f5-b69a-46d3-8d0c-593da6ed3c77\" (UID: \"8ae450f5-b69a-46d3-8d0c-593da6ed3c77\") " Oct 08 21:01:46 crc kubenswrapper[4669]: I1008 21:01:46.650160 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8ae450f5-b69a-46d3-8d0c-593da6ed3c77-etc-machine-id\") pod \"8ae450f5-b69a-46d3-8d0c-593da6ed3c77\" (UID: \"8ae450f5-b69a-46d3-8d0c-593da6ed3c77\") " Oct 08 21:01:46 crc kubenswrapper[4669]: I1008 21:01:46.650676 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8ae450f5-b69a-46d3-8d0c-593da6ed3c77-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "8ae450f5-b69a-46d3-8d0c-593da6ed3c77" (UID: "8ae450f5-b69a-46d3-8d0c-593da6ed3c77"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 21:01:46 crc kubenswrapper[4669]: I1008 21:01:46.651719 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ae450f5-b69a-46d3-8d0c-593da6ed3c77-logs" (OuterVolumeSpecName: "logs") pod "8ae450f5-b69a-46d3-8d0c-593da6ed3c77" (UID: "8ae450f5-b69a-46d3-8d0c-593da6ed3c77"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:01:46 crc kubenswrapper[4669]: I1008 21:01:46.655296 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ae450f5-b69a-46d3-8d0c-593da6ed3c77-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8ae450f5-b69a-46d3-8d0c-593da6ed3c77" (UID: "8ae450f5-b69a-46d3-8d0c-593da6ed3c77"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:01:46 crc kubenswrapper[4669]: I1008 21:01:46.656726 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ae450f5-b69a-46d3-8d0c-593da6ed3c77-scripts" (OuterVolumeSpecName: "scripts") pod "8ae450f5-b69a-46d3-8d0c-593da6ed3c77" (UID: "8ae450f5-b69a-46d3-8d0c-593da6ed3c77"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:01:46 crc kubenswrapper[4669]: I1008 21:01:46.665715 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ae450f5-b69a-46d3-8d0c-593da6ed3c77-kube-api-access-8pf9v" (OuterVolumeSpecName: "kube-api-access-8pf9v") pod "8ae450f5-b69a-46d3-8d0c-593da6ed3c77" (UID: "8ae450f5-b69a-46d3-8d0c-593da6ed3c77"). InnerVolumeSpecName "kube-api-access-8pf9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:01:46 crc kubenswrapper[4669]: I1008 21:01:46.710820 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ae450f5-b69a-46d3-8d0c-593da6ed3c77-config-data" (OuterVolumeSpecName: "config-data") pod "8ae450f5-b69a-46d3-8d0c-593da6ed3c77" (UID: "8ae450f5-b69a-46d3-8d0c-593da6ed3c77"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:01:46 crc kubenswrapper[4669]: I1008 21:01:46.727794 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ae450f5-b69a-46d3-8d0c-593da6ed3c77-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ae450f5-b69a-46d3-8d0c-593da6ed3c77" (UID: "8ae450f5-b69a-46d3-8d0c-593da6ed3c77"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:01:46 crc kubenswrapper[4669]: I1008 21:01:46.752035 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pf9v\" (UniqueName: \"kubernetes.io/projected/8ae450f5-b69a-46d3-8d0c-593da6ed3c77-kube-api-access-8pf9v\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:46 crc kubenswrapper[4669]: I1008 21:01:46.752069 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ae450f5-b69a-46d3-8d0c-593da6ed3c77-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:46 crc kubenswrapper[4669]: I1008 21:01:46.752080 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ae450f5-b69a-46d3-8d0c-593da6ed3c77-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:46 crc kubenswrapper[4669]: I1008 21:01:46.752087 4669 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8ae450f5-b69a-46d3-8d0c-593da6ed3c77-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:46 crc kubenswrapper[4669]: I1008 21:01:46.752096 4669 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ae450f5-b69a-46d3-8d0c-593da6ed3c77-logs\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:46 crc kubenswrapper[4669]: I1008 21:01:46.752103 4669 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ae450f5-b69a-46d3-8d0c-593da6ed3c77-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:46 crc kubenswrapper[4669]: I1008 21:01:46.752112 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ae450f5-b69a-46d3-8d0c-593da6ed3c77-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:46 crc kubenswrapper[4669]: I1008 21:01:46.871418 4669 generic.go:334] "Generic 
(PLEG): container finished" podID="8ae450f5-b69a-46d3-8d0c-593da6ed3c77" containerID="9fbc3ec45bd7cc2a63f775ae4c586d0cc5d32706bd2c82a8ab2744c3937d7a6a" exitCode=0 Oct 08 21:01:46 crc kubenswrapper[4669]: I1008 21:01:46.871445 4669 generic.go:334] "Generic (PLEG): container finished" podID="8ae450f5-b69a-46d3-8d0c-593da6ed3c77" containerID="19b2bda7ff5c422f81c26f26153864b4f299291a3252798be2294bffca67805e" exitCode=143 Oct 08 21:01:46 crc kubenswrapper[4669]: I1008 21:01:46.871484 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8ae450f5-b69a-46d3-8d0c-593da6ed3c77","Type":"ContainerDied","Data":"9fbc3ec45bd7cc2a63f775ae4c586d0cc5d32706bd2c82a8ab2744c3937d7a6a"} Oct 08 21:01:46 crc kubenswrapper[4669]: I1008 21:01:46.871515 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8ae450f5-b69a-46d3-8d0c-593da6ed3c77","Type":"ContainerDied","Data":"19b2bda7ff5c422f81c26f26153864b4f299291a3252798be2294bffca67805e"} Oct 08 21:01:46 crc kubenswrapper[4669]: I1008 21:01:46.871545 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8ae450f5-b69a-46d3-8d0c-593da6ed3c77","Type":"ContainerDied","Data":"46dcaa708a767b4171afe7fbab273a94e703835eae8892c352533739d8cfcbbf"} Oct 08 21:01:46 crc kubenswrapper[4669]: I1008 21:01:46.871560 4669 scope.go:117] "RemoveContainer" containerID="9fbc3ec45bd7cc2a63f775ae4c586d0cc5d32706bd2c82a8ab2744c3937d7a6a" Oct 08 21:01:46 crc kubenswrapper[4669]: I1008 21:01:46.871660 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 08 21:01:46 crc kubenswrapper[4669]: I1008 21:01:46.896187 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"147336fb-662e-4c52-b4ea-d15280858c0b","Type":"ContainerStarted","Data":"e582a6d13146410466a491b737b47fde9511066d9e9d5b365f72375d7e700246"} Oct 08 21:01:46 crc kubenswrapper[4669]: I1008 21:01:46.904628 4669 scope.go:117] "RemoveContainer" containerID="19b2bda7ff5c422f81c26f26153864b4f299291a3252798be2294bffca67805e" Oct 08 21:01:46 crc kubenswrapper[4669]: I1008 21:01:46.909244 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 08 21:01:46 crc kubenswrapper[4669]: I1008 21:01:46.942669 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 08 21:01:46 crc kubenswrapper[4669]: I1008 21:01:46.944849 4669 scope.go:117] "RemoveContainer" containerID="9fbc3ec45bd7cc2a63f775ae4c586d0cc5d32706bd2c82a8ab2744c3937d7a6a" Oct 08 21:01:46 crc kubenswrapper[4669]: E1008 21:01:46.947476 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fbc3ec45bd7cc2a63f775ae4c586d0cc5d32706bd2c82a8ab2744c3937d7a6a\": container with ID starting with 9fbc3ec45bd7cc2a63f775ae4c586d0cc5d32706bd2c82a8ab2744c3937d7a6a not found: ID does not exist" containerID="9fbc3ec45bd7cc2a63f775ae4c586d0cc5d32706bd2c82a8ab2744c3937d7a6a" Oct 08 21:01:46 crc kubenswrapper[4669]: I1008 21:01:46.947507 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fbc3ec45bd7cc2a63f775ae4c586d0cc5d32706bd2c82a8ab2744c3937d7a6a"} err="failed to get container status \"9fbc3ec45bd7cc2a63f775ae4c586d0cc5d32706bd2c82a8ab2744c3937d7a6a\": rpc error: code = NotFound desc = could not find container \"9fbc3ec45bd7cc2a63f775ae4c586d0cc5d32706bd2c82a8ab2744c3937d7a6a\": container with ID starting with 
9fbc3ec45bd7cc2a63f775ae4c586d0cc5d32706bd2c82a8ab2744c3937d7a6a not found: ID does not exist" Oct 08 21:01:46 crc kubenswrapper[4669]: I1008 21:01:46.947541 4669 scope.go:117] "RemoveContainer" containerID="19b2bda7ff5c422f81c26f26153864b4f299291a3252798be2294bffca67805e" Oct 08 21:01:46 crc kubenswrapper[4669]: E1008 21:01:46.950805 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19b2bda7ff5c422f81c26f26153864b4f299291a3252798be2294bffca67805e\": container with ID starting with 19b2bda7ff5c422f81c26f26153864b4f299291a3252798be2294bffca67805e not found: ID does not exist" containerID="19b2bda7ff5c422f81c26f26153864b4f299291a3252798be2294bffca67805e" Oct 08 21:01:46 crc kubenswrapper[4669]: I1008 21:01:46.950852 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19b2bda7ff5c422f81c26f26153864b4f299291a3252798be2294bffca67805e"} err="failed to get container status \"19b2bda7ff5c422f81c26f26153864b4f299291a3252798be2294bffca67805e\": rpc error: code = NotFound desc = could not find container \"19b2bda7ff5c422f81c26f26153864b4f299291a3252798be2294bffca67805e\": container with ID starting with 19b2bda7ff5c422f81c26f26153864b4f299291a3252798be2294bffca67805e not found: ID does not exist" Oct 08 21:01:46 crc kubenswrapper[4669]: I1008 21:01:46.950891 4669 scope.go:117] "RemoveContainer" containerID="9fbc3ec45bd7cc2a63f775ae4c586d0cc5d32706bd2c82a8ab2744c3937d7a6a" Oct 08 21:01:46 crc kubenswrapper[4669]: I1008 21:01:46.954133 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fbc3ec45bd7cc2a63f775ae4c586d0cc5d32706bd2c82a8ab2744c3937d7a6a"} err="failed to get container status \"9fbc3ec45bd7cc2a63f775ae4c586d0cc5d32706bd2c82a8ab2744c3937d7a6a\": rpc error: code = NotFound desc = could not find container \"9fbc3ec45bd7cc2a63f775ae4c586d0cc5d32706bd2c82a8ab2744c3937d7a6a\": container with ID 
starting with 9fbc3ec45bd7cc2a63f775ae4c586d0cc5d32706bd2c82a8ab2744c3937d7a6a not found: ID does not exist" Oct 08 21:01:46 crc kubenswrapper[4669]: I1008 21:01:46.954167 4669 scope.go:117] "RemoveContainer" containerID="19b2bda7ff5c422f81c26f26153864b4f299291a3252798be2294bffca67805e" Oct 08 21:01:46 crc kubenswrapper[4669]: I1008 21:01:46.955956 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19b2bda7ff5c422f81c26f26153864b4f299291a3252798be2294bffca67805e"} err="failed to get container status \"19b2bda7ff5c422f81c26f26153864b4f299291a3252798be2294bffca67805e\": rpc error: code = NotFound desc = could not find container \"19b2bda7ff5c422f81c26f26153864b4f299291a3252798be2294bffca67805e\": container with ID starting with 19b2bda7ff5c422f81c26f26153864b4f299291a3252798be2294bffca67805e not found: ID does not exist" Oct 08 21:01:46 crc kubenswrapper[4669]: I1008 21:01:46.956003 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 08 21:01:46 crc kubenswrapper[4669]: E1008 21:01:46.956446 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ae450f5-b69a-46d3-8d0c-593da6ed3c77" containerName="cinder-api-log" Oct 08 21:01:46 crc kubenswrapper[4669]: I1008 21:01:46.956470 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ae450f5-b69a-46d3-8d0c-593da6ed3c77" containerName="cinder-api-log" Oct 08 21:01:46 crc kubenswrapper[4669]: E1008 21:01:46.956495 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ae450f5-b69a-46d3-8d0c-593da6ed3c77" containerName="cinder-api" Oct 08 21:01:46 crc kubenswrapper[4669]: I1008 21:01:46.956505 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ae450f5-b69a-46d3-8d0c-593da6ed3c77" containerName="cinder-api" Oct 08 21:01:46 crc kubenswrapper[4669]: I1008 21:01:46.956751 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ae450f5-b69a-46d3-8d0c-593da6ed3c77" 
containerName="cinder-api-log" Oct 08 21:01:46 crc kubenswrapper[4669]: I1008 21:01:46.956779 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ae450f5-b69a-46d3-8d0c-593da6ed3c77" containerName="cinder-api" Oct 08 21:01:46 crc kubenswrapper[4669]: I1008 21:01:46.958045 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 08 21:01:46 crc kubenswrapper[4669]: I1008 21:01:46.966132 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 08 21:01:46 crc kubenswrapper[4669]: I1008 21:01:46.966134 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 08 21:01:46 crc kubenswrapper[4669]: I1008 21:01:46.967177 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 08 21:01:46 crc kubenswrapper[4669]: I1008 21:01:46.969232 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 08 21:01:47 crc kubenswrapper[4669]: I1008 21:01:47.056448 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42082ba5-0485-486f-8b1c-17cf7c0fc405-config-data\") pod \"cinder-api-0\" (UID: \"42082ba5-0485-486f-8b1c-17cf7c0fc405\") " pod="openstack/cinder-api-0" Oct 08 21:01:47 crc kubenswrapper[4669]: I1008 21:01:47.056491 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxm9m\" (UniqueName: \"kubernetes.io/projected/42082ba5-0485-486f-8b1c-17cf7c0fc405-kube-api-access-sxm9m\") pod \"cinder-api-0\" (UID: \"42082ba5-0485-486f-8b1c-17cf7c0fc405\") " pod="openstack/cinder-api-0" Oct 08 21:01:47 crc kubenswrapper[4669]: I1008 21:01:47.056516 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/42082ba5-0485-486f-8b1c-17cf7c0fc405-logs\") pod \"cinder-api-0\" (UID: \"42082ba5-0485-486f-8b1c-17cf7c0fc405\") " pod="openstack/cinder-api-0" Oct 08 21:01:47 crc kubenswrapper[4669]: I1008 21:01:47.056544 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/42082ba5-0485-486f-8b1c-17cf7c0fc405-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"42082ba5-0485-486f-8b1c-17cf7c0fc405\") " pod="openstack/cinder-api-0" Oct 08 21:01:47 crc kubenswrapper[4669]: I1008 21:01:47.056576 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42082ba5-0485-486f-8b1c-17cf7c0fc405-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"42082ba5-0485-486f-8b1c-17cf7c0fc405\") " pod="openstack/cinder-api-0" Oct 08 21:01:47 crc kubenswrapper[4669]: I1008 21:01:47.056703 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42082ba5-0485-486f-8b1c-17cf7c0fc405-public-tls-certs\") pod \"cinder-api-0\" (UID: \"42082ba5-0485-486f-8b1c-17cf7c0fc405\") " pod="openstack/cinder-api-0" Oct 08 21:01:47 crc kubenswrapper[4669]: I1008 21:01:47.056773 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42082ba5-0485-486f-8b1c-17cf7c0fc405-config-data-custom\") pod \"cinder-api-0\" (UID: \"42082ba5-0485-486f-8b1c-17cf7c0fc405\") " pod="openstack/cinder-api-0" Oct 08 21:01:47 crc kubenswrapper[4669]: I1008 21:01:47.056889 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42082ba5-0485-486f-8b1c-17cf7c0fc405-scripts\") pod \"cinder-api-0\" (UID: 
\"42082ba5-0485-486f-8b1c-17cf7c0fc405\") " pod="openstack/cinder-api-0" Oct 08 21:01:47 crc kubenswrapper[4669]: I1008 21:01:47.057119 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/42082ba5-0485-486f-8b1c-17cf7c0fc405-etc-machine-id\") pod \"cinder-api-0\" (UID: \"42082ba5-0485-486f-8b1c-17cf7c0fc405\") " pod="openstack/cinder-api-0" Oct 08 21:01:47 crc kubenswrapper[4669]: I1008 21:01:47.160352 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/42082ba5-0485-486f-8b1c-17cf7c0fc405-etc-machine-id\") pod \"cinder-api-0\" (UID: \"42082ba5-0485-486f-8b1c-17cf7c0fc405\") " pod="openstack/cinder-api-0" Oct 08 21:01:47 crc kubenswrapper[4669]: I1008 21:01:47.160423 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42082ba5-0485-486f-8b1c-17cf7c0fc405-config-data\") pod \"cinder-api-0\" (UID: \"42082ba5-0485-486f-8b1c-17cf7c0fc405\") " pod="openstack/cinder-api-0" Oct 08 21:01:47 crc kubenswrapper[4669]: I1008 21:01:47.160462 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxm9m\" (UniqueName: \"kubernetes.io/projected/42082ba5-0485-486f-8b1c-17cf7c0fc405-kube-api-access-sxm9m\") pod \"cinder-api-0\" (UID: \"42082ba5-0485-486f-8b1c-17cf7c0fc405\") " pod="openstack/cinder-api-0" Oct 08 21:01:47 crc kubenswrapper[4669]: I1008 21:01:47.160494 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42082ba5-0485-486f-8b1c-17cf7c0fc405-logs\") pod \"cinder-api-0\" (UID: \"42082ba5-0485-486f-8b1c-17cf7c0fc405\") " pod="openstack/cinder-api-0" Oct 08 21:01:47 crc kubenswrapper[4669]: I1008 21:01:47.160494 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/42082ba5-0485-486f-8b1c-17cf7c0fc405-etc-machine-id\") pod \"cinder-api-0\" (UID: \"42082ba5-0485-486f-8b1c-17cf7c0fc405\") " pod="openstack/cinder-api-0" Oct 08 21:01:47 crc kubenswrapper[4669]: I1008 21:01:47.160518 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/42082ba5-0485-486f-8b1c-17cf7c0fc405-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"42082ba5-0485-486f-8b1c-17cf7c0fc405\") " pod="openstack/cinder-api-0" Oct 08 21:01:47 crc kubenswrapper[4669]: I1008 21:01:47.162608 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42082ba5-0485-486f-8b1c-17cf7c0fc405-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"42082ba5-0485-486f-8b1c-17cf7c0fc405\") " pod="openstack/cinder-api-0" Oct 08 21:01:47 crc kubenswrapper[4669]: I1008 21:01:47.162684 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42082ba5-0485-486f-8b1c-17cf7c0fc405-public-tls-certs\") pod \"cinder-api-0\" (UID: \"42082ba5-0485-486f-8b1c-17cf7c0fc405\") " pod="openstack/cinder-api-0" Oct 08 21:01:47 crc kubenswrapper[4669]: I1008 21:01:47.162729 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42082ba5-0485-486f-8b1c-17cf7c0fc405-config-data-custom\") pod \"cinder-api-0\" (UID: \"42082ba5-0485-486f-8b1c-17cf7c0fc405\") " pod="openstack/cinder-api-0" Oct 08 21:01:47 crc kubenswrapper[4669]: I1008 21:01:47.162806 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42082ba5-0485-486f-8b1c-17cf7c0fc405-scripts\") pod \"cinder-api-0\" (UID: \"42082ba5-0485-486f-8b1c-17cf7c0fc405\") " pod="openstack/cinder-api-0" 
Oct 08 21:01:47 crc kubenswrapper[4669]: I1008 21:01:47.160919 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42082ba5-0485-486f-8b1c-17cf7c0fc405-logs\") pod \"cinder-api-0\" (UID: \"42082ba5-0485-486f-8b1c-17cf7c0fc405\") " pod="openstack/cinder-api-0" Oct 08 21:01:47 crc kubenswrapper[4669]: I1008 21:01:47.165820 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42082ba5-0485-486f-8b1c-17cf7c0fc405-config-data\") pod \"cinder-api-0\" (UID: \"42082ba5-0485-486f-8b1c-17cf7c0fc405\") " pod="openstack/cinder-api-0" Oct 08 21:01:47 crc kubenswrapper[4669]: I1008 21:01:47.170168 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42082ba5-0485-486f-8b1c-17cf7c0fc405-config-data-custom\") pod \"cinder-api-0\" (UID: \"42082ba5-0485-486f-8b1c-17cf7c0fc405\") " pod="openstack/cinder-api-0" Oct 08 21:01:47 crc kubenswrapper[4669]: I1008 21:01:47.170743 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/42082ba5-0485-486f-8b1c-17cf7c0fc405-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"42082ba5-0485-486f-8b1c-17cf7c0fc405\") " pod="openstack/cinder-api-0" Oct 08 21:01:47 crc kubenswrapper[4669]: I1008 21:01:47.170745 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42082ba5-0485-486f-8b1c-17cf7c0fc405-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"42082ba5-0485-486f-8b1c-17cf7c0fc405\") " pod="openstack/cinder-api-0" Oct 08 21:01:47 crc kubenswrapper[4669]: I1008 21:01:47.170836 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42082ba5-0485-486f-8b1c-17cf7c0fc405-scripts\") pod \"cinder-api-0\" (UID: 
\"42082ba5-0485-486f-8b1c-17cf7c0fc405\") " pod="openstack/cinder-api-0" Oct 08 21:01:47 crc kubenswrapper[4669]: I1008 21:01:47.171315 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42082ba5-0485-486f-8b1c-17cf7c0fc405-public-tls-certs\") pod \"cinder-api-0\" (UID: \"42082ba5-0485-486f-8b1c-17cf7c0fc405\") " pod="openstack/cinder-api-0" Oct 08 21:01:47 crc kubenswrapper[4669]: I1008 21:01:47.181090 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxm9m\" (UniqueName: \"kubernetes.io/projected/42082ba5-0485-486f-8b1c-17cf7c0fc405-kube-api-access-sxm9m\") pod \"cinder-api-0\" (UID: \"42082ba5-0485-486f-8b1c-17cf7c0fc405\") " pod="openstack/cinder-api-0" Oct 08 21:01:47 crc kubenswrapper[4669]: I1008 21:01:47.264176 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-cbf777bfb-dfzsv" podUID="26b17f76-0da4-4221-958d-24182a0a2c90" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": read tcp 10.217.0.2:57954->10.217.0.163:9311: read: connection reset by peer" Oct 08 21:01:47 crc kubenswrapper[4669]: I1008 21:01:47.264765 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-cbf777bfb-dfzsv" podUID="26b17f76-0da4-4221-958d-24182a0a2c90" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": read tcp 10.217.0.2:57964->10.217.0.163:9311: read: connection reset by peer" Oct 08 21:01:47 crc kubenswrapper[4669]: I1008 21:01:47.282379 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 08 21:01:47 crc kubenswrapper[4669]: I1008 21:01:47.369132 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ae450f5-b69a-46d3-8d0c-593da6ed3c77" path="/var/lib/kubelet/pods/8ae450f5-b69a-46d3-8d0c-593da6ed3c77/volumes" Oct 08 21:01:47 crc kubenswrapper[4669]: I1008 21:01:47.369817 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 08 21:01:47 crc kubenswrapper[4669]: I1008 21:01:47.788364 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 08 21:01:47 crc kubenswrapper[4669]: I1008 21:01:47.865807 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-cbf777bfb-dfzsv" Oct 08 21:01:47 crc kubenswrapper[4669]: I1008 21:01:47.905315 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"42082ba5-0485-486f-8b1c-17cf7c0fc405","Type":"ContainerStarted","Data":"2d2431f4a4e731ceba93e2d9868638f08c7a336a70701d90221a88703dc4449a"} Oct 08 21:01:47 crc kubenswrapper[4669]: I1008 21:01:47.910241 4669 generic.go:334] "Generic (PLEG): container finished" podID="26b17f76-0da4-4221-958d-24182a0a2c90" containerID="56ecef6ad5139c116cad9c6bdf5db2370548e0ef8f931839daccfc0062b3b6a0" exitCode=0 Oct 08 21:01:47 crc kubenswrapper[4669]: I1008 21:01:47.910293 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-cbf777bfb-dfzsv" Oct 08 21:01:47 crc kubenswrapper[4669]: I1008 21:01:47.910357 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-cbf777bfb-dfzsv" event={"ID":"26b17f76-0da4-4221-958d-24182a0a2c90","Type":"ContainerDied","Data":"56ecef6ad5139c116cad9c6bdf5db2370548e0ef8f931839daccfc0062b3b6a0"} Oct 08 21:01:47 crc kubenswrapper[4669]: I1008 21:01:47.910386 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-cbf777bfb-dfzsv" event={"ID":"26b17f76-0da4-4221-958d-24182a0a2c90","Type":"ContainerDied","Data":"daaa681aed44456281ed048b666ced493d804c57ed743b32c31ed79aef000aef"} Oct 08 21:01:47 crc kubenswrapper[4669]: I1008 21:01:47.910404 4669 scope.go:117] "RemoveContainer" containerID="56ecef6ad5139c116cad9c6bdf5db2370548e0ef8f931839daccfc0062b3b6a0" Oct 08 21:01:47 crc kubenswrapper[4669]: I1008 21:01:47.922934 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"147336fb-662e-4c52-b4ea-d15280858c0b","Type":"ContainerStarted","Data":"e1eeeb51efc7f70c4e8c83b4115b6ea79127acaced4cf630d61071ae814c9ba4"} Oct 08 21:01:47 crc kubenswrapper[4669]: I1008 21:01:47.978685 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26b17f76-0da4-4221-958d-24182a0a2c90-combined-ca-bundle\") pod \"26b17f76-0da4-4221-958d-24182a0a2c90\" (UID: \"26b17f76-0da4-4221-958d-24182a0a2c90\") " Oct 08 21:01:47 crc kubenswrapper[4669]: I1008 21:01:47.978774 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/26b17f76-0da4-4221-958d-24182a0a2c90-config-data-custom\") pod \"26b17f76-0da4-4221-958d-24182a0a2c90\" (UID: \"26b17f76-0da4-4221-958d-24182a0a2c90\") " Oct 08 21:01:47 crc kubenswrapper[4669]: I1008 21:01:47.978834 4669 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-9gkzn\" (UniqueName: \"kubernetes.io/projected/26b17f76-0da4-4221-958d-24182a0a2c90-kube-api-access-9gkzn\") pod \"26b17f76-0da4-4221-958d-24182a0a2c90\" (UID: \"26b17f76-0da4-4221-958d-24182a0a2c90\") " Oct 08 21:01:47 crc kubenswrapper[4669]: I1008 21:01:47.978866 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26b17f76-0da4-4221-958d-24182a0a2c90-config-data\") pod \"26b17f76-0da4-4221-958d-24182a0a2c90\" (UID: \"26b17f76-0da4-4221-958d-24182a0a2c90\") " Oct 08 21:01:47 crc kubenswrapper[4669]: I1008 21:01:47.978942 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26b17f76-0da4-4221-958d-24182a0a2c90-logs\") pod \"26b17f76-0da4-4221-958d-24182a0a2c90\" (UID: \"26b17f76-0da4-4221-958d-24182a0a2c90\") " Oct 08 21:01:47 crc kubenswrapper[4669]: I1008 21:01:47.980007 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26b17f76-0da4-4221-958d-24182a0a2c90-logs" (OuterVolumeSpecName: "logs") pod "26b17f76-0da4-4221-958d-24182a0a2c90" (UID: "26b17f76-0da4-4221-958d-24182a0a2c90"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:01:47 crc kubenswrapper[4669]: I1008 21:01:47.984228 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26b17f76-0da4-4221-958d-24182a0a2c90-kube-api-access-9gkzn" (OuterVolumeSpecName: "kube-api-access-9gkzn") pod "26b17f76-0da4-4221-958d-24182a0a2c90" (UID: "26b17f76-0da4-4221-958d-24182a0a2c90"). InnerVolumeSpecName "kube-api-access-9gkzn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:01:47 crc kubenswrapper[4669]: I1008 21:01:47.984348 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26b17f76-0da4-4221-958d-24182a0a2c90-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "26b17f76-0da4-4221-958d-24182a0a2c90" (UID: "26b17f76-0da4-4221-958d-24182a0a2c90"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:01:48 crc kubenswrapper[4669]: I1008 21:01:48.009911 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26b17f76-0da4-4221-958d-24182a0a2c90-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "26b17f76-0da4-4221-958d-24182a0a2c90" (UID: "26b17f76-0da4-4221-958d-24182a0a2c90"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:01:48 crc kubenswrapper[4669]: W1008 21:01:48.039461 4669 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ae450f5_b69a_46d3_8d0c_593da6ed3c77.slice/crio-conmon-19b2bda7ff5c422f81c26f26153864b4f299291a3252798be2294bffca67805e.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ae450f5_b69a_46d3_8d0c_593da6ed3c77.slice/crio-conmon-19b2bda7ff5c422f81c26f26153864b4f299291a3252798be2294bffca67805e.scope: no such file or directory Oct 08 21:01:48 crc kubenswrapper[4669]: W1008 21:01:48.039947 4669 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ae450f5_b69a_46d3_8d0c_593da6ed3c77.slice/crio-19b2bda7ff5c422f81c26f26153864b4f299291a3252798be2294bffca67805e.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch 
/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ae450f5_b69a_46d3_8d0c_593da6ed3c77.slice/crio-19b2bda7ff5c422f81c26f26153864b4f299291a3252798be2294bffca67805e.scope: no such file or directory Oct 08 21:01:48 crc kubenswrapper[4669]: I1008 21:01:48.042250 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26b17f76-0da4-4221-958d-24182a0a2c90-config-data" (OuterVolumeSpecName: "config-data") pod "26b17f76-0da4-4221-958d-24182a0a2c90" (UID: "26b17f76-0da4-4221-958d-24182a0a2c90"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:01:48 crc kubenswrapper[4669]: W1008 21:01:48.053609 4669 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ae450f5_b69a_46d3_8d0c_593da6ed3c77.slice/crio-conmon-9fbc3ec45bd7cc2a63f775ae4c586d0cc5d32706bd2c82a8ab2744c3937d7a6a.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ae450f5_b69a_46d3_8d0c_593da6ed3c77.slice/crio-conmon-9fbc3ec45bd7cc2a63f775ae4c586d0cc5d32706bd2c82a8ab2744c3937d7a6a.scope: no such file or directory Oct 08 21:01:48 crc kubenswrapper[4669]: W1008 21:01:48.053648 4669 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ae450f5_b69a_46d3_8d0c_593da6ed3c77.slice/crio-9fbc3ec45bd7cc2a63f775ae4c586d0cc5d32706bd2c82a8ab2744c3937d7a6a.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ae450f5_b69a_46d3_8d0c_593da6ed3c77.slice/crio-9fbc3ec45bd7cc2a63f775ae4c586d0cc5d32706bd2c82a8ab2744c3937d7a6a.scope: no such file or directory Oct 08 21:01:48 crc kubenswrapper[4669]: I1008 21:01:48.081367 4669 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/26b17f76-0da4-4221-958d-24182a0a2c90-logs\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:48 crc kubenswrapper[4669]: I1008 21:01:48.082121 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26b17f76-0da4-4221-958d-24182a0a2c90-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:48 crc kubenswrapper[4669]: I1008 21:01:48.082195 4669 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/26b17f76-0da4-4221-958d-24182a0a2c90-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:48 crc kubenswrapper[4669]: I1008 21:01:48.082257 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gkzn\" (UniqueName: \"kubernetes.io/projected/26b17f76-0da4-4221-958d-24182a0a2c90-kube-api-access-9gkzn\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:48 crc kubenswrapper[4669]: I1008 21:01:48.082312 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26b17f76-0da4-4221-958d-24182a0a2c90-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:48 crc kubenswrapper[4669]: I1008 21:01:48.271154 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-cbf777bfb-dfzsv"] Oct 08 21:01:48 crc kubenswrapper[4669]: I1008 21:01:48.279467 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-cbf777bfb-dfzsv"] Oct 08 21:01:48 crc kubenswrapper[4669]: E1008 21:01:48.293458 4669 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14b56cd8_5692_4a65_b8ad_1e39bf253846.slice/crio-c7d1cc9386ab3b92a4931c13ef8cae530e99ac0462fd182d311d2bf28a18131a\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf67695c6_cc78_4e93_86e4_34b030405e0e.slice/crio-conmon-c0ba3b7ee1bb2b135b87938968a5b52aa40dc067d0aa4c25b461634b322cef98.scope\": RecentStats: unable to find data in memory cache]" Oct 08 21:01:48 crc kubenswrapper[4669]: I1008 21:01:48.450131 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5f9445f759-bx7xs" Oct 08 21:01:48 crc kubenswrapper[4669]: I1008 21:01:48.450563 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5f9445f759-bx7xs" Oct 08 21:01:48 crc kubenswrapper[4669]: I1008 21:01:48.816756 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 21:01:48 crc kubenswrapper[4669]: I1008 21:01:48.938926 4669 generic.go:334] "Generic (PLEG): container finished" podID="f67695c6-cc78-4e93-86e4-34b030405e0e" containerID="c0ba3b7ee1bb2b135b87938968a5b52aa40dc067d0aa4c25b461634b322cef98" exitCode=137 Oct 08 21:01:48 crc kubenswrapper[4669]: I1008 21:01:48.940105 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6cd548c4f4-74w5s" event={"ID":"f67695c6-cc78-4e93-86e4-34b030405e0e","Type":"ContainerDied","Data":"c0ba3b7ee1bb2b135b87938968a5b52aa40dc067d0aa4c25b461634b322cef98"} Oct 08 21:01:49 crc kubenswrapper[4669]: I1008 21:01:49.344142 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26b17f76-0da4-4221-958d-24182a0a2c90" path="/var/lib/kubelet/pods/26b17f76-0da4-4221-958d-24182a0a2c90/volumes" Oct 08 21:01:51 crc kubenswrapper[4669]: I1008 21:01:51.951493 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-f8986bb9b-rq2pj" Oct 08 21:01:52 crc kubenswrapper[4669]: I1008 21:01:52.417680 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bb4fc677f-7hsvp" Oct 08 21:01:52 crc kubenswrapper[4669]: I1008 21:01:52.467564 
4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-tw2fj"] Oct 08 21:01:52 crc kubenswrapper[4669]: I1008 21:01:52.468397 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-688c87cc99-tw2fj" podUID="a6cc8710-2436-467e-9f6f-17e820c82294" containerName="dnsmasq-dns" containerID="cri-o://ca71ec32857884eda0a0506814c7f0aff98dfabf9cde0620138134fb472c6436" gracePeriod=10 Oct 08 21:01:52 crc kubenswrapper[4669]: I1008 21:01:52.653453 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 08 21:01:52 crc kubenswrapper[4669]: I1008 21:01:52.715850 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 21:01:52 crc kubenswrapper[4669]: I1008 21:01:52.990894 4669 generic.go:334] "Generic (PLEG): container finished" podID="a6cc8710-2436-467e-9f6f-17e820c82294" containerID="ca71ec32857884eda0a0506814c7f0aff98dfabf9cde0620138134fb472c6436" exitCode=0 Oct 08 21:01:52 crc kubenswrapper[4669]: I1008 21:01:52.991089 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="1a987589-96bc-46ca-b779-71aae81674e7" containerName="cinder-scheduler" containerID="cri-o://1972c9efd499629622160bfbf61499e8508c5346f15097ba6858e957fbe2b8ba" gracePeriod=30 Oct 08 21:01:52 crc kubenswrapper[4669]: I1008 21:01:52.991415 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-tw2fj" event={"ID":"a6cc8710-2436-467e-9f6f-17e820c82294","Type":"ContainerDied","Data":"ca71ec32857884eda0a0506814c7f0aff98dfabf9cde0620138134fb472c6436"} Oct 08 21:01:52 crc kubenswrapper[4669]: I1008 21:01:52.991686 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="1a987589-96bc-46ca-b779-71aae81674e7" containerName="probe" 
containerID="cri-o://b0e5a2e34c0952e001f079762655ef6295836daff51c6aa3c02b0f9ed639495e" gracePeriod=30 Oct 08 21:01:54 crc kubenswrapper[4669]: I1008 21:01:54.009097 4669 generic.go:334] "Generic (PLEG): container finished" podID="1a987589-96bc-46ca-b779-71aae81674e7" containerID="b0e5a2e34c0952e001f079762655ef6295836daff51c6aa3c02b0f9ed639495e" exitCode=0 Oct 08 21:01:54 crc kubenswrapper[4669]: I1008 21:01:54.009298 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1a987589-96bc-46ca-b779-71aae81674e7","Type":"ContainerDied","Data":"b0e5a2e34c0952e001f079762655ef6295836daff51c6aa3c02b0f9ed639495e"} Oct 08 21:01:54 crc kubenswrapper[4669]: I1008 21:01:54.162425 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-blx6m"] Oct 08 21:01:54 crc kubenswrapper[4669]: E1008 21:01:54.162762 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26b17f76-0da4-4221-958d-24182a0a2c90" containerName="barbican-api" Oct 08 21:01:54 crc kubenswrapper[4669]: I1008 21:01:54.162779 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="26b17f76-0da4-4221-958d-24182a0a2c90" containerName="barbican-api" Oct 08 21:01:54 crc kubenswrapper[4669]: E1008 21:01:54.162809 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26b17f76-0da4-4221-958d-24182a0a2c90" containerName="barbican-api-log" Oct 08 21:01:54 crc kubenswrapper[4669]: I1008 21:01:54.162815 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="26b17f76-0da4-4221-958d-24182a0a2c90" containerName="barbican-api-log" Oct 08 21:01:54 crc kubenswrapper[4669]: I1008 21:01:54.162981 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="26b17f76-0da4-4221-958d-24182a0a2c90" containerName="barbican-api" Oct 08 21:01:54 crc kubenswrapper[4669]: I1008 21:01:54.162994 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="26b17f76-0da4-4221-958d-24182a0a2c90" 
containerName="barbican-api-log" Oct 08 21:01:54 crc kubenswrapper[4669]: I1008 21:01:54.163915 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-blx6m" Oct 08 21:01:54 crc kubenswrapper[4669]: I1008 21:01:54.174695 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-blx6m"] Oct 08 21:01:54 crc kubenswrapper[4669]: I1008 21:01:54.260236 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-tjtxx"] Oct 08 21:01:54 crc kubenswrapper[4669]: I1008 21:01:54.261618 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-tjtxx" Oct 08 21:01:54 crc kubenswrapper[4669]: I1008 21:01:54.266725 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-tjtxx"] Oct 08 21:01:54 crc kubenswrapper[4669]: I1008 21:01:54.342249 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfpvr\" (UniqueName: \"kubernetes.io/projected/fbed2eec-1d0d-49aa-89b2-69b961ee76b8-kube-api-access-hfpvr\") pod \"nova-api-db-create-blx6m\" (UID: \"fbed2eec-1d0d-49aa-89b2-69b961ee76b8\") " pod="openstack/nova-api-db-create-blx6m" Oct 08 21:01:54 crc kubenswrapper[4669]: I1008 21:01:54.359175 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-lg6tp"] Oct 08 21:01:54 crc kubenswrapper[4669]: I1008 21:01:54.360231 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-lg6tp" Oct 08 21:01:54 crc kubenswrapper[4669]: I1008 21:01:54.370846 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-lg6tp"] Oct 08 21:01:54 crc kubenswrapper[4669]: I1008 21:01:54.419265 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6cd548c4f4-74w5s" podUID="f67695c6-cc78-4e93-86e4-34b030405e0e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Oct 08 21:01:54 crc kubenswrapper[4669]: I1008 21:01:54.443745 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pr757\" (UniqueName: \"kubernetes.io/projected/7ecc6d74-0b2e-4265-80ab-1229ce7427d2-kube-api-access-pr757\") pod \"nova-cell0-db-create-tjtxx\" (UID: \"7ecc6d74-0b2e-4265-80ab-1229ce7427d2\") " pod="openstack/nova-cell0-db-create-tjtxx" Oct 08 21:01:54 crc kubenswrapper[4669]: I1008 21:01:54.443870 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfpvr\" (UniqueName: \"kubernetes.io/projected/fbed2eec-1d0d-49aa-89b2-69b961ee76b8-kube-api-access-hfpvr\") pod \"nova-api-db-create-blx6m\" (UID: \"fbed2eec-1d0d-49aa-89b2-69b961ee76b8\") " pod="openstack/nova-api-db-create-blx6m" Oct 08 21:01:54 crc kubenswrapper[4669]: I1008 21:01:54.443949 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m7vl\" (UniqueName: \"kubernetes.io/projected/9f5f87b9-2c24-4a27-9a32-1258486274ac-kube-api-access-7m7vl\") pod \"nova-cell1-db-create-lg6tp\" (UID: \"9f5f87b9-2c24-4a27-9a32-1258486274ac\") " pod="openstack/nova-cell1-db-create-lg6tp" Oct 08 21:01:54 crc kubenswrapper[4669]: I1008 21:01:54.452985 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/neutron-6f968c55b5-frgnz" Oct 08 21:01:54 crc kubenswrapper[4669]: I1008 21:01:54.466410 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfpvr\" (UniqueName: \"kubernetes.io/projected/fbed2eec-1d0d-49aa-89b2-69b961ee76b8-kube-api-access-hfpvr\") pod \"nova-api-db-create-blx6m\" (UID: \"fbed2eec-1d0d-49aa-89b2-69b961ee76b8\") " pod="openstack/nova-api-db-create-blx6m" Oct 08 21:01:54 crc kubenswrapper[4669]: I1008 21:01:54.481643 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-blx6m" Oct 08 21:01:54 crc kubenswrapper[4669]: I1008 21:01:54.515465 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f8986bb9b-rq2pj"] Oct 08 21:01:54 crc kubenswrapper[4669]: I1008 21:01:54.515958 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-f8986bb9b-rq2pj" podUID="67dd40b0-efa4-47ba-814f-57dcb053a2d9" containerName="neutron-api" containerID="cri-o://b3cd31585701575c0b07e7772001ea41a665b9d37746d7836cf752dae1d7890a" gracePeriod=30 Oct 08 21:01:54 crc kubenswrapper[4669]: I1008 21:01:54.516052 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-f8986bb9b-rq2pj" podUID="67dd40b0-efa4-47ba-814f-57dcb053a2d9" containerName="neutron-httpd" containerID="cri-o://d512918ad4f324d7044a601ca483a03d5044d9fc3339738de53935b6e63d4f7f" gracePeriod=30 Oct 08 21:01:54 crc kubenswrapper[4669]: I1008 21:01:54.545917 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pr757\" (UniqueName: \"kubernetes.io/projected/7ecc6d74-0b2e-4265-80ab-1229ce7427d2-kube-api-access-pr757\") pod \"nova-cell0-db-create-tjtxx\" (UID: \"7ecc6d74-0b2e-4265-80ab-1229ce7427d2\") " pod="openstack/nova-cell0-db-create-tjtxx" Oct 08 21:01:54 crc kubenswrapper[4669]: I1008 21:01:54.546039 4669 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7m7vl\" (UniqueName: \"kubernetes.io/projected/9f5f87b9-2c24-4a27-9a32-1258486274ac-kube-api-access-7m7vl\") pod \"nova-cell1-db-create-lg6tp\" (UID: \"9f5f87b9-2c24-4a27-9a32-1258486274ac\") " pod="openstack/nova-cell1-db-create-lg6tp" Oct 08 21:01:54 crc kubenswrapper[4669]: I1008 21:01:54.581216 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pr757\" (UniqueName: \"kubernetes.io/projected/7ecc6d74-0b2e-4265-80ab-1229ce7427d2-kube-api-access-pr757\") pod \"nova-cell0-db-create-tjtxx\" (UID: \"7ecc6d74-0b2e-4265-80ab-1229ce7427d2\") " pod="openstack/nova-cell0-db-create-tjtxx" Oct 08 21:01:54 crc kubenswrapper[4669]: I1008 21:01:54.583594 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-tjtxx" Oct 08 21:01:54 crc kubenswrapper[4669]: I1008 21:01:54.589131 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m7vl\" (UniqueName: \"kubernetes.io/projected/9f5f87b9-2c24-4a27-9a32-1258486274ac-kube-api-access-7m7vl\") pod \"nova-cell1-db-create-lg6tp\" (UID: \"9f5f87b9-2c24-4a27-9a32-1258486274ac\") " pod="openstack/nova-cell1-db-create-lg6tp" Oct 08 21:01:54 crc kubenswrapper[4669]: I1008 21:01:54.729901 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-lg6tp" Oct 08 21:01:54 crc kubenswrapper[4669]: I1008 21:01:54.782324 4669 scope.go:117] "RemoveContainer" containerID="5ed90ff07ca0cbb8cb7e0a52f587329ae662c7ff6c64eb0858afd03a09872efe" Oct 08 21:01:54 crc kubenswrapper[4669]: I1008 21:01:54.908096 4669 scope.go:117] "RemoveContainer" containerID="56ecef6ad5139c116cad9c6bdf5db2370548e0ef8f931839daccfc0062b3b6a0" Oct 08 21:01:54 crc kubenswrapper[4669]: E1008 21:01:54.908591 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56ecef6ad5139c116cad9c6bdf5db2370548e0ef8f931839daccfc0062b3b6a0\": container with ID starting with 56ecef6ad5139c116cad9c6bdf5db2370548e0ef8f931839daccfc0062b3b6a0 not found: ID does not exist" containerID="56ecef6ad5139c116cad9c6bdf5db2370548e0ef8f931839daccfc0062b3b6a0" Oct 08 21:01:54 crc kubenswrapper[4669]: I1008 21:01:54.908637 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56ecef6ad5139c116cad9c6bdf5db2370548e0ef8f931839daccfc0062b3b6a0"} err="failed to get container status \"56ecef6ad5139c116cad9c6bdf5db2370548e0ef8f931839daccfc0062b3b6a0\": rpc error: code = NotFound desc = could not find container \"56ecef6ad5139c116cad9c6bdf5db2370548e0ef8f931839daccfc0062b3b6a0\": container with ID starting with 56ecef6ad5139c116cad9c6bdf5db2370548e0ef8f931839daccfc0062b3b6a0 not found: ID does not exist" Oct 08 21:01:54 crc kubenswrapper[4669]: I1008 21:01:54.908671 4669 scope.go:117] "RemoveContainer" containerID="5ed90ff07ca0cbb8cb7e0a52f587329ae662c7ff6c64eb0858afd03a09872efe" Oct 08 21:01:54 crc kubenswrapper[4669]: E1008 21:01:54.909588 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ed90ff07ca0cbb8cb7e0a52f587329ae662c7ff6c64eb0858afd03a09872efe\": container with ID starting with 
5ed90ff07ca0cbb8cb7e0a52f587329ae662c7ff6c64eb0858afd03a09872efe not found: ID does not exist" containerID="5ed90ff07ca0cbb8cb7e0a52f587329ae662c7ff6c64eb0858afd03a09872efe" Oct 08 21:01:54 crc kubenswrapper[4669]: I1008 21:01:54.909614 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ed90ff07ca0cbb8cb7e0a52f587329ae662c7ff6c64eb0858afd03a09872efe"} err="failed to get container status \"5ed90ff07ca0cbb8cb7e0a52f587329ae662c7ff6c64eb0858afd03a09872efe\": rpc error: code = NotFound desc = could not find container \"5ed90ff07ca0cbb8cb7e0a52f587329ae662c7ff6c64eb0858afd03a09872efe\": container with ID starting with 5ed90ff07ca0cbb8cb7e0a52f587329ae662c7ff6c64eb0858afd03a09872efe not found: ID does not exist" Oct 08 21:01:55 crc kubenswrapper[4669]: I1008 21:01:55.081723 4669 generic.go:334] "Generic (PLEG): container finished" podID="67dd40b0-efa4-47ba-814f-57dcb053a2d9" containerID="d512918ad4f324d7044a601ca483a03d5044d9fc3339738de53935b6e63d4f7f" exitCode=0 Oct 08 21:01:55 crc kubenswrapper[4669]: I1008 21:01:55.081896 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f8986bb9b-rq2pj" event={"ID":"67dd40b0-efa4-47ba-814f-57dcb053a2d9","Type":"ContainerDied","Data":"d512918ad4f324d7044a601ca483a03d5044d9fc3339738de53935b6e63d4f7f"} Oct 08 21:01:55 crc kubenswrapper[4669]: I1008 21:01:55.220635 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6cd548c4f4-74w5s" Oct 08 21:01:55 crc kubenswrapper[4669]: I1008 21:01:55.254836 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-tw2fj" Oct 08 21:01:55 crc kubenswrapper[4669]: I1008 21:01:55.368549 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68xz8\" (UniqueName: \"kubernetes.io/projected/a6cc8710-2436-467e-9f6f-17e820c82294-kube-api-access-68xz8\") pod \"a6cc8710-2436-467e-9f6f-17e820c82294\" (UID: \"a6cc8710-2436-467e-9f6f-17e820c82294\") " Oct 08 21:01:55 crc kubenswrapper[4669]: I1008 21:01:55.368607 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6cc8710-2436-467e-9f6f-17e820c82294-ovsdbserver-nb\") pod \"a6cc8710-2436-467e-9f6f-17e820c82294\" (UID: \"a6cc8710-2436-467e-9f6f-17e820c82294\") " Oct 08 21:01:55 crc kubenswrapper[4669]: I1008 21:01:55.368634 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6cc8710-2436-467e-9f6f-17e820c82294-ovsdbserver-sb\") pod \"a6cc8710-2436-467e-9f6f-17e820c82294\" (UID: \"a6cc8710-2436-467e-9f6f-17e820c82294\") " Oct 08 21:01:55 crc kubenswrapper[4669]: I1008 21:01:55.368666 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlgkr\" (UniqueName: \"kubernetes.io/projected/f67695c6-cc78-4e93-86e4-34b030405e0e-kube-api-access-jlgkr\") pod \"f67695c6-cc78-4e93-86e4-34b030405e0e\" (UID: \"f67695c6-cc78-4e93-86e4-34b030405e0e\") " Oct 08 21:01:55 crc kubenswrapper[4669]: I1008 21:01:55.368698 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f67695c6-cc78-4e93-86e4-34b030405e0e-combined-ca-bundle\") pod \"f67695c6-cc78-4e93-86e4-34b030405e0e\" (UID: \"f67695c6-cc78-4e93-86e4-34b030405e0e\") " Oct 08 21:01:55 crc kubenswrapper[4669]: I1008 21:01:55.368737 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f67695c6-cc78-4e93-86e4-34b030405e0e-horizon-tls-certs\") pod \"f67695c6-cc78-4e93-86e4-34b030405e0e\" (UID: \"f67695c6-cc78-4e93-86e4-34b030405e0e\") " Oct 08 21:01:55 crc kubenswrapper[4669]: I1008 21:01:55.368772 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6cc8710-2436-467e-9f6f-17e820c82294-dns-svc\") pod \"a6cc8710-2436-467e-9f6f-17e820c82294\" (UID: \"a6cc8710-2436-467e-9f6f-17e820c82294\") " Oct 08 21:01:55 crc kubenswrapper[4669]: I1008 21:01:55.368854 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f67695c6-cc78-4e93-86e4-34b030405e0e-config-data\") pod \"f67695c6-cc78-4e93-86e4-34b030405e0e\" (UID: \"f67695c6-cc78-4e93-86e4-34b030405e0e\") " Oct 08 21:01:55 crc kubenswrapper[4669]: I1008 21:01:55.368893 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f67695c6-cc78-4e93-86e4-34b030405e0e-horizon-secret-key\") pod \"f67695c6-cc78-4e93-86e4-34b030405e0e\" (UID: \"f67695c6-cc78-4e93-86e4-34b030405e0e\") " Oct 08 21:01:55 crc kubenswrapper[4669]: I1008 21:01:55.368922 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a6cc8710-2436-467e-9f6f-17e820c82294-dns-swift-storage-0\") pod \"a6cc8710-2436-467e-9f6f-17e820c82294\" (UID: \"a6cc8710-2436-467e-9f6f-17e820c82294\") " Oct 08 21:01:55 crc kubenswrapper[4669]: I1008 21:01:55.368967 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f67695c6-cc78-4e93-86e4-34b030405e0e-scripts\") pod \"f67695c6-cc78-4e93-86e4-34b030405e0e\" (UID: \"f67695c6-cc78-4e93-86e4-34b030405e0e\") " Oct 08 21:01:55 
crc kubenswrapper[4669]: I1008 21:01:55.368987 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f67695c6-cc78-4e93-86e4-34b030405e0e-logs\") pod \"f67695c6-cc78-4e93-86e4-34b030405e0e\" (UID: \"f67695c6-cc78-4e93-86e4-34b030405e0e\") " Oct 08 21:01:55 crc kubenswrapper[4669]: I1008 21:01:55.369003 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6cc8710-2436-467e-9f6f-17e820c82294-config\") pod \"a6cc8710-2436-467e-9f6f-17e820c82294\" (UID: \"a6cc8710-2436-467e-9f6f-17e820c82294\") " Oct 08 21:01:55 crc kubenswrapper[4669]: I1008 21:01:55.370579 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f67695c6-cc78-4e93-86e4-34b030405e0e-logs" (OuterVolumeSpecName: "logs") pod "f67695c6-cc78-4e93-86e4-34b030405e0e" (UID: "f67695c6-cc78-4e93-86e4-34b030405e0e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:01:55 crc kubenswrapper[4669]: I1008 21:01:55.375361 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f67695c6-cc78-4e93-86e4-34b030405e0e-kube-api-access-jlgkr" (OuterVolumeSpecName: "kube-api-access-jlgkr") pod "f67695c6-cc78-4e93-86e4-34b030405e0e" (UID: "f67695c6-cc78-4e93-86e4-34b030405e0e"). InnerVolumeSpecName "kube-api-access-jlgkr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:01:55 crc kubenswrapper[4669]: I1008 21:01:55.375778 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6cc8710-2436-467e-9f6f-17e820c82294-kube-api-access-68xz8" (OuterVolumeSpecName: "kube-api-access-68xz8") pod "a6cc8710-2436-467e-9f6f-17e820c82294" (UID: "a6cc8710-2436-467e-9f6f-17e820c82294"). InnerVolumeSpecName "kube-api-access-68xz8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:01:55 crc kubenswrapper[4669]: I1008 21:01:55.442240 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f67695c6-cc78-4e93-86e4-34b030405e0e-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "f67695c6-cc78-4e93-86e4-34b030405e0e" (UID: "f67695c6-cc78-4e93-86e4-34b030405e0e"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:01:55 crc kubenswrapper[4669]: I1008 21:01:55.471190 4669 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f67695c6-cc78-4e93-86e4-34b030405e0e-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:55 crc kubenswrapper[4669]: I1008 21:01:55.471416 4669 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f67695c6-cc78-4e93-86e4-34b030405e0e-logs\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:55 crc kubenswrapper[4669]: I1008 21:01:55.471425 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68xz8\" (UniqueName: \"kubernetes.io/projected/a6cc8710-2436-467e-9f6f-17e820c82294-kube-api-access-68xz8\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:55 crc kubenswrapper[4669]: I1008 21:01:55.471434 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlgkr\" (UniqueName: \"kubernetes.io/projected/f67695c6-cc78-4e93-86e4-34b030405e0e-kube-api-access-jlgkr\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:55 crc kubenswrapper[4669]: I1008 21:01:55.492140 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-blx6m"] Oct 08 21:01:55 crc kubenswrapper[4669]: W1008 21:01:55.531114 4669 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfbed2eec_1d0d_49aa_89b2_69b961ee76b8.slice/crio-a38365c4697e83b538c7fe835cc4741f3860ba9209e8714a5a37f211f10bf2d9 WatchSource:0}: Error finding container a38365c4697e83b538c7fe835cc4741f3860ba9209e8714a5a37f211f10bf2d9: Status 404 returned error can't find the container with id a38365c4697e83b538c7fe835cc4741f3860ba9209e8714a5a37f211f10bf2d9 Oct 08 21:01:55 crc kubenswrapper[4669]: I1008 21:01:55.585367 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-lg6tp"] Oct 08 21:01:55 crc kubenswrapper[4669]: W1008 21:01:55.633751 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f5f87b9_2c24_4a27_9a32_1258486274ac.slice/crio-0ff11e0fba3a01b3929b5fd24d6fb3aa45b767dcb3594e617b28fec6aa6d9258 WatchSource:0}: Error finding container 0ff11e0fba3a01b3929b5fd24d6fb3aa45b767dcb3594e617b28fec6aa6d9258: Status 404 returned error can't find the container with id 0ff11e0fba3a01b3929b5fd24d6fb3aa45b767dcb3594e617b28fec6aa6d9258 Oct 08 21:01:55 crc kubenswrapper[4669]: I1008 21:01:55.679623 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f67695c6-cc78-4e93-86e4-34b030405e0e-config-data" (OuterVolumeSpecName: "config-data") pod "f67695c6-cc78-4e93-86e4-34b030405e0e" (UID: "f67695c6-cc78-4e93-86e4-34b030405e0e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:01:55 crc kubenswrapper[4669]: I1008 21:01:55.680080 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f67695c6-cc78-4e93-86e4-34b030405e0e-config-data\") pod \"f67695c6-cc78-4e93-86e4-34b030405e0e\" (UID: \"f67695c6-cc78-4e93-86e4-34b030405e0e\") " Oct 08 21:01:55 crc kubenswrapper[4669]: W1008 21:01:55.680753 4669 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/f67695c6-cc78-4e93-86e4-34b030405e0e/volumes/kubernetes.io~configmap/config-data Oct 08 21:01:55 crc kubenswrapper[4669]: I1008 21:01:55.680834 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f67695c6-cc78-4e93-86e4-34b030405e0e-config-data" (OuterVolumeSpecName: "config-data") pod "f67695c6-cc78-4e93-86e4-34b030405e0e" (UID: "f67695c6-cc78-4e93-86e4-34b030405e0e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:01:55 crc kubenswrapper[4669]: I1008 21:01:55.700182 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-tjtxx"] Oct 08 21:01:55 crc kubenswrapper[4669]: I1008 21:01:55.704685 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f67695c6-cc78-4e93-86e4-34b030405e0e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f67695c6-cc78-4e93-86e4-34b030405e0e" (UID: "f67695c6-cc78-4e93-86e4-34b030405e0e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:01:55 crc kubenswrapper[4669]: W1008 21:01:55.709394 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ecc6d74_0b2e_4265_80ab_1229ce7427d2.slice/crio-71fd6f54efa66db63e2590d5488bb595a7a1391cce5f5cb01ce70d3277ca290d WatchSource:0}: Error finding container 71fd6f54efa66db63e2590d5488bb595a7a1391cce5f5cb01ce70d3277ca290d: Status 404 returned error can't find the container with id 71fd6f54efa66db63e2590d5488bb595a7a1391cce5f5cb01ce70d3277ca290d Oct 08 21:01:55 crc kubenswrapper[4669]: I1008 21:01:55.737566 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f67695c6-cc78-4e93-86e4-34b030405e0e-scripts" (OuterVolumeSpecName: "scripts") pod "f67695c6-cc78-4e93-86e4-34b030405e0e" (UID: "f67695c6-cc78-4e93-86e4-34b030405e0e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:01:55 crc kubenswrapper[4669]: I1008 21:01:55.782756 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f67695c6-cc78-4e93-86e4-34b030405e0e-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:55 crc kubenswrapper[4669]: I1008 21:01:55.782995 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f67695c6-cc78-4e93-86e4-34b030405e0e-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:55 crc kubenswrapper[4669]: I1008 21:01:55.783110 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f67695c6-cc78-4e93-86e4-34b030405e0e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:55 crc kubenswrapper[4669]: I1008 21:01:55.793321 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/a6cc8710-2436-467e-9f6f-17e820c82294-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a6cc8710-2436-467e-9f6f-17e820c82294" (UID: "a6cc8710-2436-467e-9f6f-17e820c82294"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:01:55 crc kubenswrapper[4669]: I1008 21:01:55.793486 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6cc8710-2436-467e-9f6f-17e820c82294-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a6cc8710-2436-467e-9f6f-17e820c82294" (UID: "a6cc8710-2436-467e-9f6f-17e820c82294"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:01:55 crc kubenswrapper[4669]: I1008 21:01:55.810937 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6cc8710-2436-467e-9f6f-17e820c82294-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a6cc8710-2436-467e-9f6f-17e820c82294" (UID: "a6cc8710-2436-467e-9f6f-17e820c82294"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:01:55 crc kubenswrapper[4669]: I1008 21:01:55.826851 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f67695c6-cc78-4e93-86e4-34b030405e0e-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "f67695c6-cc78-4e93-86e4-34b030405e0e" (UID: "f67695c6-cc78-4e93-86e4-34b030405e0e"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:01:55 crc kubenswrapper[4669]: I1008 21:01:55.853052 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6cc8710-2436-467e-9f6f-17e820c82294-config" (OuterVolumeSpecName: "config") pod "a6cc8710-2436-467e-9f6f-17e820c82294" (UID: "a6cc8710-2436-467e-9f6f-17e820c82294"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:01:55 crc kubenswrapper[4669]: I1008 21:01:55.858102 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6cc8710-2436-467e-9f6f-17e820c82294-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a6cc8710-2436-467e-9f6f-17e820c82294" (UID: "a6cc8710-2436-467e-9f6f-17e820c82294"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:01:55 crc kubenswrapper[4669]: I1008 21:01:55.884664 4669 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a6cc8710-2436-467e-9f6f-17e820c82294-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:55 crc kubenswrapper[4669]: I1008 21:01:55.884703 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6cc8710-2436-467e-9f6f-17e820c82294-config\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:55 crc kubenswrapper[4669]: I1008 21:01:55.884715 4669 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6cc8710-2436-467e-9f6f-17e820c82294-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:55 crc kubenswrapper[4669]: I1008 21:01:55.884727 4669 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6cc8710-2436-467e-9f6f-17e820c82294-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:55 crc kubenswrapper[4669]: I1008 21:01:55.884737 4669 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f67695c6-cc78-4e93-86e4-34b030405e0e-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:55 crc kubenswrapper[4669]: I1008 21:01:55.884750 4669 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/a6cc8710-2436-467e-9f6f-17e820c82294-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:56 crc kubenswrapper[4669]: I1008 21:01:56.102811 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-blx6m" event={"ID":"fbed2eec-1d0d-49aa-89b2-69b961ee76b8","Type":"ContainerStarted","Data":"47c2d4a9b75cdf000f25a82826567661d74f9443d269eb51bfecf4852e9aeed0"} Oct 08 21:01:56 crc kubenswrapper[4669]: I1008 21:01:56.102851 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-blx6m" event={"ID":"fbed2eec-1d0d-49aa-89b2-69b961ee76b8","Type":"ContainerStarted","Data":"a38365c4697e83b538c7fe835cc4741f3860ba9209e8714a5a37f211f10bf2d9"} Oct 08 21:01:56 crc kubenswrapper[4669]: I1008 21:01:56.105675 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-tw2fj" event={"ID":"a6cc8710-2436-467e-9f6f-17e820c82294","Type":"ContainerDied","Data":"8dc5f4ba48c4da59d0243f9d77642f40b77682af891f53903e93841e13c9159f"} Oct 08 21:01:56 crc kubenswrapper[4669]: I1008 21:01:56.105712 4669 scope.go:117] "RemoveContainer" containerID="ca71ec32857884eda0a0506814c7f0aff98dfabf9cde0620138134fb472c6436" Oct 08 21:01:56 crc kubenswrapper[4669]: I1008 21:01:56.105799 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-tw2fj" Oct 08 21:01:56 crc kubenswrapper[4669]: I1008 21:01:56.110416 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-tjtxx" event={"ID":"7ecc6d74-0b2e-4265-80ab-1229ce7427d2","Type":"ContainerStarted","Data":"71fd6f54efa66db63e2590d5488bb595a7a1391cce5f5cb01ce70d3277ca290d"} Oct 08 21:01:56 crc kubenswrapper[4669]: I1008 21:01:56.112751 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"42082ba5-0485-486f-8b1c-17cf7c0fc405","Type":"ContainerStarted","Data":"cf98a5c45be67a3e17621273cb2f408ee38013a910da468b9c62f88e2479753e"} Oct 08 21:01:56 crc kubenswrapper[4669]: I1008 21:01:56.120243 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-blx6m" podStartSLOduration=2.120225535 podStartE2EDuration="2.120225535s" podCreationTimestamp="2025-10-08 21:01:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:01:56.118074266 +0000 UTC m=+1035.810884939" watchObservedRunningTime="2025-10-08 21:01:56.120225535 +0000 UTC m=+1035.813036208" Oct 08 21:01:56 crc kubenswrapper[4669]: I1008 21:01:56.129491 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"147336fb-662e-4c52-b4ea-d15280858c0b","Type":"ContainerStarted","Data":"98887884702e90cc6ce112b73b8b77d61fa06055512d7a7fa9402d6d337f4412"} Oct 08 21:01:56 crc kubenswrapper[4669]: I1008 21:01:56.129753 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="147336fb-662e-4c52-b4ea-d15280858c0b" containerName="ceilometer-central-agent" containerID="cri-o://f949451c28ef1f3d0387b0eaa555611b9cbacb81b7af64780eaeb87d7e7b0024" gracePeriod=30 Oct 08 21:01:56 crc kubenswrapper[4669]: I1008 21:01:56.129789 4669 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 08 21:01:56 crc kubenswrapper[4669]: I1008 21:01:56.130044 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="147336fb-662e-4c52-b4ea-d15280858c0b" containerName="proxy-httpd" containerID="cri-o://98887884702e90cc6ce112b73b8b77d61fa06055512d7a7fa9402d6d337f4412" gracePeriod=30 Oct 08 21:01:56 crc kubenswrapper[4669]: I1008 21:01:56.130216 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="147336fb-662e-4c52-b4ea-d15280858c0b" containerName="sg-core" containerID="cri-o://e1eeeb51efc7f70c4e8c83b4115b6ea79127acaced4cf630d61071ae814c9ba4" gracePeriod=30 Oct 08 21:01:56 crc kubenswrapper[4669]: I1008 21:01:56.130395 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="147336fb-662e-4c52-b4ea-d15280858c0b" containerName="ceilometer-notification-agent" containerID="cri-o://e582a6d13146410466a491b737b47fde9511066d9e9d5b365f72375d7e700246" gracePeriod=30 Oct 08 21:01:56 crc kubenswrapper[4669]: I1008 21:01:56.143197 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"2aa3bf86-2604-4d46-bc73-13b5d049b01c","Type":"ContainerStarted","Data":"475ed174987374f41bceb7c039a0d816bc3d33f310246153791322a899e86277"} Oct 08 21:01:56 crc kubenswrapper[4669]: I1008 21:01:56.154095 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-lg6tp" event={"ID":"9f5f87b9-2c24-4a27-9a32-1258486274ac","Type":"ContainerStarted","Data":"0ff11e0fba3a01b3929b5fd24d6fb3aa45b767dcb3594e617b28fec6aa6d9258"} Oct 08 21:01:56 crc kubenswrapper[4669]: I1008 21:01:56.158334 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.021004889 podStartE2EDuration="13.158319246s" podCreationTimestamp="2025-10-08 21:01:43 
+0000 UTC" firstStartedPulling="2025-10-08 21:01:44.867782141 +0000 UTC m=+1024.560592814" lastFinishedPulling="2025-10-08 21:01:55.005096498 +0000 UTC m=+1034.697907171" observedRunningTime="2025-10-08 21:01:56.154176193 +0000 UTC m=+1035.846986866" watchObservedRunningTime="2025-10-08 21:01:56.158319246 +0000 UTC m=+1035.851129919" Oct 08 21:01:56 crc kubenswrapper[4669]: I1008 21:01:56.166483 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6cd548c4f4-74w5s" event={"ID":"f67695c6-cc78-4e93-86e4-34b030405e0e","Type":"ContainerDied","Data":"ec0702da548413ceeefd10476549f8dbe3ddf463ed7e46916de59b919f286070"} Oct 08 21:01:56 crc kubenswrapper[4669]: I1008 21:01:56.166591 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6cd548c4f4-74w5s" Oct 08 21:01:56 crc kubenswrapper[4669]: I1008 21:01:56.176787 4669 scope.go:117] "RemoveContainer" containerID="b7a1b0545405540bb1fefa0d9059a376649b356272f8525f747c8cf001e8905d" Oct 08 21:01:56 crc kubenswrapper[4669]: I1008 21:01:56.177192 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-lg6tp" podStartSLOduration=2.177181592 podStartE2EDuration="2.177181592s" podCreationTimestamp="2025-10-08 21:01:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:01:56.172397301 +0000 UTC m=+1035.865207974" watchObservedRunningTime="2025-10-08 21:01:56.177181592 +0000 UTC m=+1035.869992265" Oct 08 21:01:56 crc kubenswrapper[4669]: I1008 21:01:56.188481 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.458879275 podStartE2EDuration="18.188465301s" podCreationTimestamp="2025-10-08 21:01:38 +0000 UTC" firstStartedPulling="2025-10-08 21:01:39.187843916 +0000 UTC m=+1018.880654589" lastFinishedPulling="2025-10-08 21:01:54.917429942 
+0000 UTC m=+1034.610240615" observedRunningTime="2025-10-08 21:01:56.186596129 +0000 UTC m=+1035.879406802" watchObservedRunningTime="2025-10-08 21:01:56.188465301 +0000 UTC m=+1035.881275974" Oct 08 21:01:56 crc kubenswrapper[4669]: I1008 21:01:56.218938 4669 scope.go:117] "RemoveContainer" containerID="0b62a5077ce4a93242842e1e897e72bbe90ea93c59e2811a272696aad9a6ca8d" Oct 08 21:01:56 crc kubenswrapper[4669]: I1008 21:01:56.221131 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6cd548c4f4-74w5s"] Oct 08 21:01:56 crc kubenswrapper[4669]: I1008 21:01:56.230625 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6cd548c4f4-74w5s"] Oct 08 21:01:56 crc kubenswrapper[4669]: I1008 21:01:56.243009 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-tw2fj"] Oct 08 21:01:56 crc kubenswrapper[4669]: I1008 21:01:56.250136 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-tw2fj"] Oct 08 21:01:56 crc kubenswrapper[4669]: I1008 21:01:56.449516 4669 scope.go:117] "RemoveContainer" containerID="c0ba3b7ee1bb2b135b87938968a5b52aa40dc067d0aa4c25b461634b322cef98" Oct 08 21:01:56 crc kubenswrapper[4669]: I1008 21:01:56.504899 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6f47799b5d-n27h7" Oct 08 21:01:57 crc kubenswrapper[4669]: I1008 21:01:57.196018 4669 generic.go:334] "Generic (PLEG): container finished" podID="147336fb-662e-4c52-b4ea-d15280858c0b" containerID="98887884702e90cc6ce112b73b8b77d61fa06055512d7a7fa9402d6d337f4412" exitCode=0 Oct 08 21:01:57 crc kubenswrapper[4669]: I1008 21:01:57.196609 4669 generic.go:334] "Generic (PLEG): container finished" podID="147336fb-662e-4c52-b4ea-d15280858c0b" containerID="e1eeeb51efc7f70c4e8c83b4115b6ea79127acaced4cf630d61071ae814c9ba4" exitCode=2 Oct 08 21:01:57 crc kubenswrapper[4669]: I1008 21:01:57.196634 4669 generic.go:334] "Generic (PLEG): 
container finished" podID="147336fb-662e-4c52-b4ea-d15280858c0b" containerID="e582a6d13146410466a491b737b47fde9511066d9e9d5b365f72375d7e700246" exitCode=0 Oct 08 21:01:57 crc kubenswrapper[4669]: I1008 21:01:57.196656 4669 generic.go:334] "Generic (PLEG): container finished" podID="147336fb-662e-4c52-b4ea-d15280858c0b" containerID="f949451c28ef1f3d0387b0eaa555611b9cbacb81b7af64780eaeb87d7e7b0024" exitCode=0 Oct 08 21:01:57 crc kubenswrapper[4669]: I1008 21:01:57.196748 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"147336fb-662e-4c52-b4ea-d15280858c0b","Type":"ContainerDied","Data":"98887884702e90cc6ce112b73b8b77d61fa06055512d7a7fa9402d6d337f4412"} Oct 08 21:01:57 crc kubenswrapper[4669]: I1008 21:01:57.196800 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"147336fb-662e-4c52-b4ea-d15280858c0b","Type":"ContainerDied","Data":"e1eeeb51efc7f70c4e8c83b4115b6ea79127acaced4cf630d61071ae814c9ba4"} Oct 08 21:01:57 crc kubenswrapper[4669]: I1008 21:01:57.196831 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"147336fb-662e-4c52-b4ea-d15280858c0b","Type":"ContainerDied","Data":"e582a6d13146410466a491b737b47fde9511066d9e9d5b365f72375d7e700246"} Oct 08 21:01:57 crc kubenswrapper[4669]: I1008 21:01:57.196854 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"147336fb-662e-4c52-b4ea-d15280858c0b","Type":"ContainerDied","Data":"f949451c28ef1f3d0387b0eaa555611b9cbacb81b7af64780eaeb87d7e7b0024"} Oct 08 21:01:57 crc kubenswrapper[4669]: I1008 21:01:57.203614 4669 generic.go:334] "Generic (PLEG): container finished" podID="7ecc6d74-0b2e-4265-80ab-1229ce7427d2" containerID="e912c43cb389f4c8db6b53379d9a67dec9c39a35ffaa395cd65d08f7a4316b7f" exitCode=0 Oct 08 21:01:57 crc kubenswrapper[4669]: I1008 21:01:57.203672 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-db-create-tjtxx" event={"ID":"7ecc6d74-0b2e-4265-80ab-1229ce7427d2","Type":"ContainerDied","Data":"e912c43cb389f4c8db6b53379d9a67dec9c39a35ffaa395cd65d08f7a4316b7f"} Oct 08 21:01:57 crc kubenswrapper[4669]: I1008 21:01:57.205569 4669 generic.go:334] "Generic (PLEG): container finished" podID="fbed2eec-1d0d-49aa-89b2-69b961ee76b8" containerID="47c2d4a9b75cdf000f25a82826567661d74f9443d269eb51bfecf4852e9aeed0" exitCode=0 Oct 08 21:01:57 crc kubenswrapper[4669]: I1008 21:01:57.205703 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-blx6m" event={"ID":"fbed2eec-1d0d-49aa-89b2-69b961ee76b8","Type":"ContainerDied","Data":"47c2d4a9b75cdf000f25a82826567661d74f9443d269eb51bfecf4852e9aeed0"} Oct 08 21:01:57 crc kubenswrapper[4669]: I1008 21:01:57.208231 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"42082ba5-0485-486f-8b1c-17cf7c0fc405","Type":"ContainerStarted","Data":"9fd048488eadc19c72a0bbfab128d8ea367a35df614a121dd57a091480056819"} Oct 08 21:01:57 crc kubenswrapper[4669]: I1008 21:01:57.208367 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 08 21:01:57 crc kubenswrapper[4669]: I1008 21:01:57.210650 4669 generic.go:334] "Generic (PLEG): container finished" podID="9f5f87b9-2c24-4a27-9a32-1258486274ac" containerID="d1615ca45806c11e63a66a795bded109fa28ef3aae30af9a8081028f5fb5fe78" exitCode=0 Oct 08 21:01:57 crc kubenswrapper[4669]: I1008 21:01:57.210732 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-lg6tp" event={"ID":"9f5f87b9-2c24-4a27-9a32-1258486274ac","Type":"ContainerDied","Data":"d1615ca45806c11e63a66a795bded109fa28ef3aae30af9a8081028f5fb5fe78"} Oct 08 21:01:57 crc kubenswrapper[4669]: I1008 21:01:57.256728 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=11.256700505 
podStartE2EDuration="11.256700505s" podCreationTimestamp="2025-10-08 21:01:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:01:57.250295957 +0000 UTC m=+1036.943106630" watchObservedRunningTime="2025-10-08 21:01:57.256700505 +0000 UTC m=+1036.949511208" Oct 08 21:01:57 crc kubenswrapper[4669]: I1008 21:01:57.345347 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6cc8710-2436-467e-9f6f-17e820c82294" path="/var/lib/kubelet/pods/a6cc8710-2436-467e-9f6f-17e820c82294/volumes" Oct 08 21:01:57 crc kubenswrapper[4669]: I1008 21:01:57.346022 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f67695c6-cc78-4e93-86e4-34b030405e0e" path="/var/lib/kubelet/pods/f67695c6-cc78-4e93-86e4-34b030405e0e/volumes" Oct 08 21:01:57 crc kubenswrapper[4669]: I1008 21:01:57.572172 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6f47799b5d-n27h7" Oct 08 21:01:57 crc kubenswrapper[4669]: I1008 21:01:57.609266 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 21:01:57 crc kubenswrapper[4669]: I1008 21:01:57.719171 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/147336fb-662e-4c52-b4ea-d15280858c0b-sg-core-conf-yaml\") pod \"147336fb-662e-4c52-b4ea-d15280858c0b\" (UID: \"147336fb-662e-4c52-b4ea-d15280858c0b\") " Oct 08 21:01:57 crc kubenswrapper[4669]: I1008 21:01:57.719235 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfc64\" (UniqueName: \"kubernetes.io/projected/147336fb-662e-4c52-b4ea-d15280858c0b-kube-api-access-jfc64\") pod \"147336fb-662e-4c52-b4ea-d15280858c0b\" (UID: \"147336fb-662e-4c52-b4ea-d15280858c0b\") " Oct 08 21:01:57 crc kubenswrapper[4669]: I1008 21:01:57.719257 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/147336fb-662e-4c52-b4ea-d15280858c0b-config-data\") pod \"147336fb-662e-4c52-b4ea-d15280858c0b\" (UID: \"147336fb-662e-4c52-b4ea-d15280858c0b\") " Oct 08 21:01:57 crc kubenswrapper[4669]: I1008 21:01:57.719273 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/147336fb-662e-4c52-b4ea-d15280858c0b-combined-ca-bundle\") pod \"147336fb-662e-4c52-b4ea-d15280858c0b\" (UID: \"147336fb-662e-4c52-b4ea-d15280858c0b\") " Oct 08 21:01:57 crc kubenswrapper[4669]: I1008 21:01:57.719294 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/147336fb-662e-4c52-b4ea-d15280858c0b-run-httpd\") pod \"147336fb-662e-4c52-b4ea-d15280858c0b\" (UID: \"147336fb-662e-4c52-b4ea-d15280858c0b\") " Oct 08 21:01:57 crc kubenswrapper[4669]: I1008 21:01:57.719312 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/147336fb-662e-4c52-b4ea-d15280858c0b-scripts\") pod \"147336fb-662e-4c52-b4ea-d15280858c0b\" (UID: \"147336fb-662e-4c52-b4ea-d15280858c0b\") " Oct 08 21:01:57 crc kubenswrapper[4669]: I1008 21:01:57.719373 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/147336fb-662e-4c52-b4ea-d15280858c0b-log-httpd\") pod \"147336fb-662e-4c52-b4ea-d15280858c0b\" (UID: \"147336fb-662e-4c52-b4ea-d15280858c0b\") " Oct 08 21:01:57 crc kubenswrapper[4669]: I1008 21:01:57.720878 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/147336fb-662e-4c52-b4ea-d15280858c0b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "147336fb-662e-4c52-b4ea-d15280858c0b" (UID: "147336fb-662e-4c52-b4ea-d15280858c0b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:01:57 crc kubenswrapper[4669]: I1008 21:01:57.721496 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/147336fb-662e-4c52-b4ea-d15280858c0b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "147336fb-662e-4c52-b4ea-d15280858c0b" (UID: "147336fb-662e-4c52-b4ea-d15280858c0b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:01:57 crc kubenswrapper[4669]: I1008 21:01:57.731695 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/147336fb-662e-4c52-b4ea-d15280858c0b-scripts" (OuterVolumeSpecName: "scripts") pod "147336fb-662e-4c52-b4ea-d15280858c0b" (UID: "147336fb-662e-4c52-b4ea-d15280858c0b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:01:57 crc kubenswrapper[4669]: I1008 21:01:57.732086 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/147336fb-662e-4c52-b4ea-d15280858c0b-kube-api-access-jfc64" (OuterVolumeSpecName: "kube-api-access-jfc64") pod "147336fb-662e-4c52-b4ea-d15280858c0b" (UID: "147336fb-662e-4c52-b4ea-d15280858c0b"). InnerVolumeSpecName "kube-api-access-jfc64". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:01:57 crc kubenswrapper[4669]: I1008 21:01:57.757591 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/147336fb-662e-4c52-b4ea-d15280858c0b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "147336fb-662e-4c52-b4ea-d15280858c0b" (UID: "147336fb-662e-4c52-b4ea-d15280858c0b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:01:57 crc kubenswrapper[4669]: I1008 21:01:57.809583 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f8986bb9b-rq2pj" Oct 08 21:01:57 crc kubenswrapper[4669]: I1008 21:01:57.822992 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfc64\" (UniqueName: \"kubernetes.io/projected/147336fb-662e-4c52-b4ea-d15280858c0b-kube-api-access-jfc64\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:57 crc kubenswrapper[4669]: I1008 21:01:57.823013 4669 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/147336fb-662e-4c52-b4ea-d15280858c0b-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:57 crc kubenswrapper[4669]: I1008 21:01:57.823021 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/147336fb-662e-4c52-b4ea-d15280858c0b-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:57 crc kubenswrapper[4669]: I1008 21:01:57.823029 4669 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/147336fb-662e-4c52-b4ea-d15280858c0b-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:57 crc kubenswrapper[4669]: I1008 21:01:57.823036 4669 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/147336fb-662e-4c52-b4ea-d15280858c0b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:57 crc kubenswrapper[4669]: I1008 21:01:57.878826 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/147336fb-662e-4c52-b4ea-d15280858c0b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "147336fb-662e-4c52-b4ea-d15280858c0b" (UID: "147336fb-662e-4c52-b4ea-d15280858c0b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:01:57 crc kubenswrapper[4669]: I1008 21:01:57.906417 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/147336fb-662e-4c52-b4ea-d15280858c0b-config-data" (OuterVolumeSpecName: "config-data") pod "147336fb-662e-4c52-b4ea-d15280858c0b" (UID: "147336fb-662e-4c52-b4ea-d15280858c0b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:01:57 crc kubenswrapper[4669]: I1008 21:01:57.923729 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/67dd40b0-efa4-47ba-814f-57dcb053a2d9-ovndb-tls-certs\") pod \"67dd40b0-efa4-47ba-814f-57dcb053a2d9\" (UID: \"67dd40b0-efa4-47ba-814f-57dcb053a2d9\") " Oct 08 21:01:57 crc kubenswrapper[4669]: I1008 21:01:57.923782 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4brl\" (UniqueName: \"kubernetes.io/projected/67dd40b0-efa4-47ba-814f-57dcb053a2d9-kube-api-access-b4brl\") pod \"67dd40b0-efa4-47ba-814f-57dcb053a2d9\" (UID: \"67dd40b0-efa4-47ba-814f-57dcb053a2d9\") " Oct 08 21:01:57 crc kubenswrapper[4669]: I1008 21:01:57.923825 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/67dd40b0-efa4-47ba-814f-57dcb053a2d9-config\") pod \"67dd40b0-efa4-47ba-814f-57dcb053a2d9\" (UID: \"67dd40b0-efa4-47ba-814f-57dcb053a2d9\") " Oct 08 21:01:57 crc kubenswrapper[4669]: I1008 21:01:57.923859 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/67dd40b0-efa4-47ba-814f-57dcb053a2d9-httpd-config\") pod \"67dd40b0-efa4-47ba-814f-57dcb053a2d9\" (UID: \"67dd40b0-efa4-47ba-814f-57dcb053a2d9\") " Oct 08 21:01:57 crc kubenswrapper[4669]: I1008 21:01:57.924009 4669 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67dd40b0-efa4-47ba-814f-57dcb053a2d9-combined-ca-bundle\") pod \"67dd40b0-efa4-47ba-814f-57dcb053a2d9\" (UID: \"67dd40b0-efa4-47ba-814f-57dcb053a2d9\") " Oct 08 21:01:57 crc kubenswrapper[4669]: I1008 21:01:57.924476 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/147336fb-662e-4c52-b4ea-d15280858c0b-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:57 crc kubenswrapper[4669]: I1008 21:01:57.924498 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/147336fb-662e-4c52-b4ea-d15280858c0b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:57 crc kubenswrapper[4669]: I1008 21:01:57.928575 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67dd40b0-efa4-47ba-814f-57dcb053a2d9-kube-api-access-b4brl" (OuterVolumeSpecName: "kube-api-access-b4brl") pod "67dd40b0-efa4-47ba-814f-57dcb053a2d9" (UID: "67dd40b0-efa4-47ba-814f-57dcb053a2d9"). InnerVolumeSpecName "kube-api-access-b4brl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:01:57 crc kubenswrapper[4669]: I1008 21:01:57.929809 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67dd40b0-efa4-47ba-814f-57dcb053a2d9-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "67dd40b0-efa4-47ba-814f-57dcb053a2d9" (UID: "67dd40b0-efa4-47ba-814f-57dcb053a2d9"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:01:57 crc kubenswrapper[4669]: I1008 21:01:57.974327 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67dd40b0-efa4-47ba-814f-57dcb053a2d9-config" (OuterVolumeSpecName: "config") pod "67dd40b0-efa4-47ba-814f-57dcb053a2d9" (UID: "67dd40b0-efa4-47ba-814f-57dcb053a2d9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:01:57 crc kubenswrapper[4669]: I1008 21:01:57.979083 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67dd40b0-efa4-47ba-814f-57dcb053a2d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "67dd40b0-efa4-47ba-814f-57dcb053a2d9" (UID: "67dd40b0-efa4-47ba-814f-57dcb053a2d9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.006970 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67dd40b0-efa4-47ba-814f-57dcb053a2d9-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "67dd40b0-efa4-47ba-814f-57dcb053a2d9" (UID: "67dd40b0-efa4-47ba-814f-57dcb053a2d9"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.025956 4669 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/67dd40b0-efa4-47ba-814f-57dcb053a2d9-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.025989 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4brl\" (UniqueName: \"kubernetes.io/projected/67dd40b0-efa4-47ba-814f-57dcb053a2d9-kube-api-access-b4brl\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.026002 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/67dd40b0-efa4-47ba-814f-57dcb053a2d9-config\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.026011 4669 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/67dd40b0-efa4-47ba-814f-57dcb053a2d9-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.026020 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67dd40b0-efa4-47ba-814f-57dcb053a2d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.029205 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.127062 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a987589-96bc-46ca-b779-71aae81674e7-config-data\") pod \"1a987589-96bc-46ca-b779-71aae81674e7\" (UID: \"1a987589-96bc-46ca-b779-71aae81674e7\") " Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.127492 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a987589-96bc-46ca-b779-71aae81674e7-scripts\") pod \"1a987589-96bc-46ca-b779-71aae81674e7\" (UID: \"1a987589-96bc-46ca-b779-71aae81674e7\") " Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.127552 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjmnq\" (UniqueName: \"kubernetes.io/projected/1a987589-96bc-46ca-b779-71aae81674e7-kube-api-access-sjmnq\") pod \"1a987589-96bc-46ca-b779-71aae81674e7\" (UID: \"1a987589-96bc-46ca-b779-71aae81674e7\") " Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.127801 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1a987589-96bc-46ca-b779-71aae81674e7-etc-machine-id\") pod \"1a987589-96bc-46ca-b779-71aae81674e7\" (UID: \"1a987589-96bc-46ca-b779-71aae81674e7\") " Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.127846 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1a987589-96bc-46ca-b779-71aae81674e7-config-data-custom\") pod \"1a987589-96bc-46ca-b779-71aae81674e7\" (UID: \"1a987589-96bc-46ca-b779-71aae81674e7\") " Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.127902 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/1a987589-96bc-46ca-b779-71aae81674e7-combined-ca-bundle\") pod \"1a987589-96bc-46ca-b779-71aae81674e7\" (UID: \"1a987589-96bc-46ca-b779-71aae81674e7\") " Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.127982 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1a987589-96bc-46ca-b779-71aae81674e7-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "1a987589-96bc-46ca-b779-71aae81674e7" (UID: "1a987589-96bc-46ca-b779-71aae81674e7"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.128283 4669 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1a987589-96bc-46ca-b779-71aae81674e7-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.131217 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a987589-96bc-46ca-b779-71aae81674e7-scripts" (OuterVolumeSpecName: "scripts") pod "1a987589-96bc-46ca-b779-71aae81674e7" (UID: "1a987589-96bc-46ca-b779-71aae81674e7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.131337 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a987589-96bc-46ca-b779-71aae81674e7-kube-api-access-sjmnq" (OuterVolumeSpecName: "kube-api-access-sjmnq") pod "1a987589-96bc-46ca-b779-71aae81674e7" (UID: "1a987589-96bc-46ca-b779-71aae81674e7"). InnerVolumeSpecName "kube-api-access-sjmnq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.131490 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a987589-96bc-46ca-b779-71aae81674e7-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1a987589-96bc-46ca-b779-71aae81674e7" (UID: "1a987589-96bc-46ca-b779-71aae81674e7"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.180364 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a987589-96bc-46ca-b779-71aae81674e7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1a987589-96bc-46ca-b779-71aae81674e7" (UID: "1a987589-96bc-46ca-b779-71aae81674e7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.220224 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a987589-96bc-46ca-b779-71aae81674e7-config-data" (OuterVolumeSpecName: "config-data") pod "1a987589-96bc-46ca-b779-71aae81674e7" (UID: "1a987589-96bc-46ca-b779-71aae81674e7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.224870 4669 generic.go:334] "Generic (PLEG): container finished" podID="67dd40b0-efa4-47ba-814f-57dcb053a2d9" containerID="b3cd31585701575c0b07e7772001ea41a665b9d37746d7836cf752dae1d7890a" exitCode=0 Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.224923 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f8986bb9b-rq2pj" event={"ID":"67dd40b0-efa4-47ba-814f-57dcb053a2d9","Type":"ContainerDied","Data":"b3cd31585701575c0b07e7772001ea41a665b9d37746d7836cf752dae1d7890a"} Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.224949 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f8986bb9b-rq2pj" event={"ID":"67dd40b0-efa4-47ba-814f-57dcb053a2d9","Type":"ContainerDied","Data":"6e0752c1d7e9a538bd416d28882ae5902d4f75fd2fe0fb0d1fa6efddf055c00c"} Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.224964 4669 scope.go:117] "RemoveContainer" containerID="d512918ad4f324d7044a601ca483a03d5044d9fc3339738de53935b6e63d4f7f" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.225133 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f8986bb9b-rq2pj" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.234670 4669 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1a987589-96bc-46ca-b779-71aae81674e7-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.234703 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a987589-96bc-46ca-b779-71aae81674e7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.234717 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a987589-96bc-46ca-b779-71aae81674e7-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.234728 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a987589-96bc-46ca-b779-71aae81674e7-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.234740 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjmnq\" (UniqueName: \"kubernetes.io/projected/1a987589-96bc-46ca-b779-71aae81674e7-kube-api-access-sjmnq\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.235647 4669 generic.go:334] "Generic (PLEG): container finished" podID="1a987589-96bc-46ca-b779-71aae81674e7" containerID="1972c9efd499629622160bfbf61499e8508c5346f15097ba6858e957fbe2b8ba" exitCode=0 Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.235724 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1a987589-96bc-46ca-b779-71aae81674e7","Type":"ContainerDied","Data":"1972c9efd499629622160bfbf61499e8508c5346f15097ba6858e957fbe2b8ba"} Oct 08 21:01:58 crc 
kubenswrapper[4669]: I1008 21:01:58.235742 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.235751 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1a987589-96bc-46ca-b779-71aae81674e7","Type":"ContainerDied","Data":"3c1e3b0d3ba0ae4acf437b074a0e1d08a87d7cac94e94953e805ae8328aae5f0"} Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.240799 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.245351 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"147336fb-662e-4c52-b4ea-d15280858c0b","Type":"ContainerDied","Data":"25ebd7f10f1da5c27cb71706005530fa087bb42ca80e754ae4fc0eec9e7c64ea"} Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.272712 4669 scope.go:117] "RemoveContainer" containerID="b3cd31585701575c0b07e7772001ea41a665b9d37746d7836cf752dae1d7890a" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.287040 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f8986bb9b-rq2pj"] Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.295985 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-f8986bb9b-rq2pj"] Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.308951 4669 scope.go:117] "RemoveContainer" containerID="d512918ad4f324d7044a601ca483a03d5044d9fc3339738de53935b6e63d4f7f" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.309105 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 21:01:58 crc kubenswrapper[4669]: E1008 21:01:58.309554 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d512918ad4f324d7044a601ca483a03d5044d9fc3339738de53935b6e63d4f7f\": container with ID starting with d512918ad4f324d7044a601ca483a03d5044d9fc3339738de53935b6e63d4f7f not found: ID does not exist" containerID="d512918ad4f324d7044a601ca483a03d5044d9fc3339738de53935b6e63d4f7f" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.309590 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d512918ad4f324d7044a601ca483a03d5044d9fc3339738de53935b6e63d4f7f"} err="failed to get container status \"d512918ad4f324d7044a601ca483a03d5044d9fc3339738de53935b6e63d4f7f\": rpc error: code = NotFound desc = could not find container \"d512918ad4f324d7044a601ca483a03d5044d9fc3339738de53935b6e63d4f7f\": container with ID starting with d512918ad4f324d7044a601ca483a03d5044d9fc3339738de53935b6e63d4f7f not found: ID does not exist" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.309617 4669 scope.go:117] "RemoveContainer" containerID="b3cd31585701575c0b07e7772001ea41a665b9d37746d7836cf752dae1d7890a" Oct 08 21:01:58 crc kubenswrapper[4669]: E1008 21:01:58.312910 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3cd31585701575c0b07e7772001ea41a665b9d37746d7836cf752dae1d7890a\": container with ID starting with b3cd31585701575c0b07e7772001ea41a665b9d37746d7836cf752dae1d7890a not found: ID does not exist" containerID="b3cd31585701575c0b07e7772001ea41a665b9d37746d7836cf752dae1d7890a" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.312950 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3cd31585701575c0b07e7772001ea41a665b9d37746d7836cf752dae1d7890a"} err="failed to get container status \"b3cd31585701575c0b07e7772001ea41a665b9d37746d7836cf752dae1d7890a\": rpc error: code = NotFound desc = could not find container \"b3cd31585701575c0b07e7772001ea41a665b9d37746d7836cf752dae1d7890a\": container with ID 
starting with b3cd31585701575c0b07e7772001ea41a665b9d37746d7836cf752dae1d7890a not found: ID does not exist" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.312971 4669 scope.go:117] "RemoveContainer" containerID="b0e5a2e34c0952e001f079762655ef6295836daff51c6aa3c02b0f9ed639495e" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.322177 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.337022 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.348394 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.357881 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 08 21:01:58 crc kubenswrapper[4669]: E1008 21:01:58.358232 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f67695c6-cc78-4e93-86e4-34b030405e0e" containerName="horizon" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.358243 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="f67695c6-cc78-4e93-86e4-34b030405e0e" containerName="horizon" Oct 08 21:01:58 crc kubenswrapper[4669]: E1008 21:01:58.358254 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="147336fb-662e-4c52-b4ea-d15280858c0b" containerName="ceilometer-central-agent" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.358261 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="147336fb-662e-4c52-b4ea-d15280858c0b" containerName="ceilometer-central-agent" Oct 08 21:01:58 crc kubenswrapper[4669]: E1008 21:01:58.358270 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a987589-96bc-46ca-b779-71aae81674e7" containerName="cinder-scheduler" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.358276 4669 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="1a987589-96bc-46ca-b779-71aae81674e7" containerName="cinder-scheduler" Oct 08 21:01:58 crc kubenswrapper[4669]: E1008 21:01:58.358291 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="147336fb-662e-4c52-b4ea-d15280858c0b" containerName="ceilometer-notification-agent" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.358297 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="147336fb-662e-4c52-b4ea-d15280858c0b" containerName="ceilometer-notification-agent" Oct 08 21:01:58 crc kubenswrapper[4669]: E1008 21:01:58.358307 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6cc8710-2436-467e-9f6f-17e820c82294" containerName="dnsmasq-dns" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.358312 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6cc8710-2436-467e-9f6f-17e820c82294" containerName="dnsmasq-dns" Oct 08 21:01:58 crc kubenswrapper[4669]: E1008 21:01:58.358337 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6cc8710-2436-467e-9f6f-17e820c82294" containerName="init" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.358351 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6cc8710-2436-467e-9f6f-17e820c82294" containerName="init" Oct 08 21:01:58 crc kubenswrapper[4669]: E1008 21:01:58.358364 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="147336fb-662e-4c52-b4ea-d15280858c0b" containerName="proxy-httpd" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.358370 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="147336fb-662e-4c52-b4ea-d15280858c0b" containerName="proxy-httpd" Oct 08 21:01:58 crc kubenswrapper[4669]: E1008 21:01:58.358379 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a987589-96bc-46ca-b779-71aae81674e7" containerName="probe" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.358384 4669 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1a987589-96bc-46ca-b779-71aae81674e7" containerName="probe" Oct 08 21:01:58 crc kubenswrapper[4669]: E1008 21:01:58.358396 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f67695c6-cc78-4e93-86e4-34b030405e0e" containerName="horizon-log" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.358401 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="f67695c6-cc78-4e93-86e4-34b030405e0e" containerName="horizon-log" Oct 08 21:01:58 crc kubenswrapper[4669]: E1008 21:01:58.358411 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67dd40b0-efa4-47ba-814f-57dcb053a2d9" containerName="neutron-api" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.358416 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="67dd40b0-efa4-47ba-814f-57dcb053a2d9" containerName="neutron-api" Oct 08 21:01:58 crc kubenswrapper[4669]: E1008 21:01:58.358426 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67dd40b0-efa4-47ba-814f-57dcb053a2d9" containerName="neutron-httpd" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.358432 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="67dd40b0-efa4-47ba-814f-57dcb053a2d9" containerName="neutron-httpd" Oct 08 21:01:58 crc kubenswrapper[4669]: E1008 21:01:58.358442 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="147336fb-662e-4c52-b4ea-d15280858c0b" containerName="sg-core" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.358447 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="147336fb-662e-4c52-b4ea-d15280858c0b" containerName="sg-core" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.358606 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6cc8710-2436-467e-9f6f-17e820c82294" containerName="dnsmasq-dns" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.358618 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="147336fb-662e-4c52-b4ea-d15280858c0b" 
containerName="sg-core" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.358631 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="f67695c6-cc78-4e93-86e4-34b030405e0e" containerName="horizon-log" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.358643 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="67dd40b0-efa4-47ba-814f-57dcb053a2d9" containerName="neutron-httpd" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.358656 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="67dd40b0-efa4-47ba-814f-57dcb053a2d9" containerName="neutron-api" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.358667 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="f67695c6-cc78-4e93-86e4-34b030405e0e" containerName="horizon" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.358675 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a987589-96bc-46ca-b779-71aae81674e7" containerName="cinder-scheduler" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.358685 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="147336fb-662e-4c52-b4ea-d15280858c0b" containerName="proxy-httpd" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.358692 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="147336fb-662e-4c52-b4ea-d15280858c0b" containerName="ceilometer-central-agent" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.358705 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a987589-96bc-46ca-b779-71aae81674e7" containerName="probe" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.358717 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="147336fb-662e-4c52-b4ea-d15280858c0b" containerName="ceilometer-notification-agent" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.360241 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.362950 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.363000 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.373043 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.387708 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.389074 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.391873 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.392452 4669 scope.go:117] "RemoveContainer" containerID="1972c9efd499629622160bfbf61499e8508c5346f15097ba6858e957fbe2b8ba" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.394903 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.441057 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e6cc85e-3a29-4567-96b8-2de40d300ca7-config-data\") pod \"ceilometer-0\" (UID: \"3e6cc85e-3a29-4567-96b8-2de40d300ca7\") " pod="openstack/ceilometer-0" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.441121 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e6cc85e-3a29-4567-96b8-2de40d300ca7-scripts\") pod 
\"ceilometer-0\" (UID: \"3e6cc85e-3a29-4567-96b8-2de40d300ca7\") " pod="openstack/ceilometer-0" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.441166 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdr9w\" (UniqueName: \"kubernetes.io/projected/3e6cc85e-3a29-4567-96b8-2de40d300ca7-kube-api-access-wdr9w\") pod \"ceilometer-0\" (UID: \"3e6cc85e-3a29-4567-96b8-2de40d300ca7\") " pod="openstack/ceilometer-0" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.441193 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e6cc85e-3a29-4567-96b8-2de40d300ca7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3e6cc85e-3a29-4567-96b8-2de40d300ca7\") " pod="openstack/ceilometer-0" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.441246 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e6cc85e-3a29-4567-96b8-2de40d300ca7-run-httpd\") pod \"ceilometer-0\" (UID: \"3e6cc85e-3a29-4567-96b8-2de40d300ca7\") " pod="openstack/ceilometer-0" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.441281 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e6cc85e-3a29-4567-96b8-2de40d300ca7-log-httpd\") pod \"ceilometer-0\" (UID: \"3e6cc85e-3a29-4567-96b8-2de40d300ca7\") " pod="openstack/ceilometer-0" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.441311 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3e6cc85e-3a29-4567-96b8-2de40d300ca7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3e6cc85e-3a29-4567-96b8-2de40d300ca7\") " pod="openstack/ceilometer-0" Oct 08 21:01:58 crc 
kubenswrapper[4669]: I1008 21:01:58.442447 4669 scope.go:117] "RemoveContainer" containerID="b0e5a2e34c0952e001f079762655ef6295836daff51c6aa3c02b0f9ed639495e" Oct 08 21:01:58 crc kubenswrapper[4669]: E1008 21:01:58.443132 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0e5a2e34c0952e001f079762655ef6295836daff51c6aa3c02b0f9ed639495e\": container with ID starting with b0e5a2e34c0952e001f079762655ef6295836daff51c6aa3c02b0f9ed639495e not found: ID does not exist" containerID="b0e5a2e34c0952e001f079762655ef6295836daff51c6aa3c02b0f9ed639495e" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.443179 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0e5a2e34c0952e001f079762655ef6295836daff51c6aa3c02b0f9ed639495e"} err="failed to get container status \"b0e5a2e34c0952e001f079762655ef6295836daff51c6aa3c02b0f9ed639495e\": rpc error: code = NotFound desc = could not find container \"b0e5a2e34c0952e001f079762655ef6295836daff51c6aa3c02b0f9ed639495e\": container with ID starting with b0e5a2e34c0952e001f079762655ef6295836daff51c6aa3c02b0f9ed639495e not found: ID does not exist" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.443223 4669 scope.go:117] "RemoveContainer" containerID="1972c9efd499629622160bfbf61499e8508c5346f15097ba6858e957fbe2b8ba" Oct 08 21:01:58 crc kubenswrapper[4669]: E1008 21:01:58.443965 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1972c9efd499629622160bfbf61499e8508c5346f15097ba6858e957fbe2b8ba\": container with ID starting with 1972c9efd499629622160bfbf61499e8508c5346f15097ba6858e957fbe2b8ba not found: ID does not exist" containerID="1972c9efd499629622160bfbf61499e8508c5346f15097ba6858e957fbe2b8ba" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.444000 4669 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1972c9efd499629622160bfbf61499e8508c5346f15097ba6858e957fbe2b8ba"} err="failed to get container status \"1972c9efd499629622160bfbf61499e8508c5346f15097ba6858e957fbe2b8ba\": rpc error: code = NotFound desc = could not find container \"1972c9efd499629622160bfbf61499e8508c5346f15097ba6858e957fbe2b8ba\": container with ID starting with 1972c9efd499629622160bfbf61499e8508c5346f15097ba6858e957fbe2b8ba not found: ID does not exist" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.444021 4669 scope.go:117] "RemoveContainer" containerID="98887884702e90cc6ce112b73b8b77d61fa06055512d7a7fa9402d6d337f4412" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.515012 4669 scope.go:117] "RemoveContainer" containerID="e1eeeb51efc7f70c4e8c83b4115b6ea79127acaced4cf630d61071ae814c9ba4" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.542660 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3873e946-b682-46b0-9b31-c34217bed686-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3873e946-b682-46b0-9b31-c34217bed686\") " pod="openstack/cinder-scheduler-0" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.542710 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e6cc85e-3a29-4567-96b8-2de40d300ca7-run-httpd\") pod \"ceilometer-0\" (UID: \"3e6cc85e-3a29-4567-96b8-2de40d300ca7\") " pod="openstack/ceilometer-0" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.542759 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e6cc85e-3a29-4567-96b8-2de40d300ca7-log-httpd\") pod \"ceilometer-0\" (UID: \"3e6cc85e-3a29-4567-96b8-2de40d300ca7\") " pod="openstack/ceilometer-0" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.542826 4669 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3e6cc85e-3a29-4567-96b8-2de40d300ca7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3e6cc85e-3a29-4567-96b8-2de40d300ca7\") " pod="openstack/ceilometer-0" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.542884 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3873e946-b682-46b0-9b31-c34217bed686-config-data\") pod \"cinder-scheduler-0\" (UID: \"3873e946-b682-46b0-9b31-c34217bed686\") " pod="openstack/cinder-scheduler-0" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.542924 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e6cc85e-3a29-4567-96b8-2de40d300ca7-config-data\") pod \"ceilometer-0\" (UID: \"3e6cc85e-3a29-4567-96b8-2de40d300ca7\") " pod="openstack/ceilometer-0" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.542977 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e6cc85e-3a29-4567-96b8-2de40d300ca7-scripts\") pod \"ceilometer-0\" (UID: \"3e6cc85e-3a29-4567-96b8-2de40d300ca7\") " pod="openstack/ceilometer-0" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.543003 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3873e946-b682-46b0-9b31-c34217bed686-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3873e946-b682-46b0-9b31-c34217bed686\") " pod="openstack/cinder-scheduler-0" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.543040 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/3873e946-b682-46b0-9b31-c34217bed686-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3873e946-b682-46b0-9b31-c34217bed686\") " pod="openstack/cinder-scheduler-0" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.543066 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdr9w\" (UniqueName: \"kubernetes.io/projected/3e6cc85e-3a29-4567-96b8-2de40d300ca7-kube-api-access-wdr9w\") pod \"ceilometer-0\" (UID: \"3e6cc85e-3a29-4567-96b8-2de40d300ca7\") " pod="openstack/ceilometer-0" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.543089 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e6cc85e-3a29-4567-96b8-2de40d300ca7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3e6cc85e-3a29-4567-96b8-2de40d300ca7\") " pod="openstack/ceilometer-0" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.543137 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3873e946-b682-46b0-9b31-c34217bed686-scripts\") pod \"cinder-scheduler-0\" (UID: \"3873e946-b682-46b0-9b31-c34217bed686\") " pod="openstack/cinder-scheduler-0" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.543152 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ch89\" (UniqueName: \"kubernetes.io/projected/3873e946-b682-46b0-9b31-c34217bed686-kube-api-access-5ch89\") pod \"cinder-scheduler-0\" (UID: \"3873e946-b682-46b0-9b31-c34217bed686\") " pod="openstack/cinder-scheduler-0" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.543692 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e6cc85e-3a29-4567-96b8-2de40d300ca7-run-httpd\") pod \"ceilometer-0\" (UID: 
\"3e6cc85e-3a29-4567-96b8-2de40d300ca7\") " pod="openstack/ceilometer-0" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.543962 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e6cc85e-3a29-4567-96b8-2de40d300ca7-log-httpd\") pod \"ceilometer-0\" (UID: \"3e6cc85e-3a29-4567-96b8-2de40d300ca7\") " pod="openstack/ceilometer-0" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.554324 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e6cc85e-3a29-4567-96b8-2de40d300ca7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3e6cc85e-3a29-4567-96b8-2de40d300ca7\") " pod="openstack/ceilometer-0" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.555054 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e6cc85e-3a29-4567-96b8-2de40d300ca7-config-data\") pod \"ceilometer-0\" (UID: \"3e6cc85e-3a29-4567-96b8-2de40d300ca7\") " pod="openstack/ceilometer-0" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.556879 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e6cc85e-3a29-4567-96b8-2de40d300ca7-scripts\") pod \"ceilometer-0\" (UID: \"3e6cc85e-3a29-4567-96b8-2de40d300ca7\") " pod="openstack/ceilometer-0" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.557331 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3e6cc85e-3a29-4567-96b8-2de40d300ca7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3e6cc85e-3a29-4567-96b8-2de40d300ca7\") " pod="openstack/ceilometer-0" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.570672 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdr9w\" (UniqueName: 
\"kubernetes.io/projected/3e6cc85e-3a29-4567-96b8-2de40d300ca7-kube-api-access-wdr9w\") pod \"ceilometer-0\" (UID: \"3e6cc85e-3a29-4567-96b8-2de40d300ca7\") " pod="openstack/ceilometer-0" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.646013 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3873e946-b682-46b0-9b31-c34217bed686-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3873e946-b682-46b0-9b31-c34217bed686\") " pod="openstack/cinder-scheduler-0" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.646051 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3873e946-b682-46b0-9b31-c34217bed686-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3873e946-b682-46b0-9b31-c34217bed686\") " pod="openstack/cinder-scheduler-0" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.646092 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3873e946-b682-46b0-9b31-c34217bed686-scripts\") pod \"cinder-scheduler-0\" (UID: \"3873e946-b682-46b0-9b31-c34217bed686\") " pod="openstack/cinder-scheduler-0" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.646106 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ch89\" (UniqueName: \"kubernetes.io/projected/3873e946-b682-46b0-9b31-c34217bed686-kube-api-access-5ch89\") pod \"cinder-scheduler-0\" (UID: \"3873e946-b682-46b0-9b31-c34217bed686\") " pod="openstack/cinder-scheduler-0" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.646147 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3873e946-b682-46b0-9b31-c34217bed686-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: 
\"3873e946-b682-46b0-9b31-c34217bed686\") " pod="openstack/cinder-scheduler-0" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.646198 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3873e946-b682-46b0-9b31-c34217bed686-config-data\") pod \"cinder-scheduler-0\" (UID: \"3873e946-b682-46b0-9b31-c34217bed686\") " pod="openstack/cinder-scheduler-0" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.649235 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3873e946-b682-46b0-9b31-c34217bed686-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3873e946-b682-46b0-9b31-c34217bed686\") " pod="openstack/cinder-scheduler-0" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.651866 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3873e946-b682-46b0-9b31-c34217bed686-scripts\") pod \"cinder-scheduler-0\" (UID: \"3873e946-b682-46b0-9b31-c34217bed686\") " pod="openstack/cinder-scheduler-0" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.652624 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3873e946-b682-46b0-9b31-c34217bed686-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3873e946-b682-46b0-9b31-c34217bed686\") " pod="openstack/cinder-scheduler-0" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.654890 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3873e946-b682-46b0-9b31-c34217bed686-config-data\") pod \"cinder-scheduler-0\" (UID: \"3873e946-b682-46b0-9b31-c34217bed686\") " pod="openstack/cinder-scheduler-0" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.658816 4669 scope.go:117] "RemoveContainer" 
containerID="e582a6d13146410466a491b737b47fde9511066d9e9d5b365f72375d7e700246" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.659301 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3873e946-b682-46b0-9b31-c34217bed686-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3873e946-b682-46b0-9b31-c34217bed686\") " pod="openstack/cinder-scheduler-0" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.668046 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ch89\" (UniqueName: \"kubernetes.io/projected/3873e946-b682-46b0-9b31-c34217bed686-kube-api-access-5ch89\") pod \"cinder-scheduler-0\" (UID: \"3873e946-b682-46b0-9b31-c34217bed686\") " pod="openstack/cinder-scheduler-0" Oct 08 21:01:58 crc kubenswrapper[4669]: E1008 21:01:58.679965 4669 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14b56cd8_5692_4a65_b8ad_1e39bf253846.slice/crio-c7d1cc9386ab3b92a4931c13ef8cae530e99ac0462fd182d311d2bf28a18131a\": RecentStats: unable to find data in memory cache]" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.693593 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.707928 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.794868 4669 scope.go:117] "RemoveContainer" containerID="f949451c28ef1f3d0387b0eaa555611b9cbacb81b7af64780eaeb87d7e7b0024" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.812779 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-tjtxx" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.828038 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-lg6tp" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.841941 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-blx6m" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.951107 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7m7vl\" (UniqueName: \"kubernetes.io/projected/9f5f87b9-2c24-4a27-9a32-1258486274ac-kube-api-access-7m7vl\") pod \"9f5f87b9-2c24-4a27-9a32-1258486274ac\" (UID: \"9f5f87b9-2c24-4a27-9a32-1258486274ac\") " Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.951450 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfpvr\" (UniqueName: \"kubernetes.io/projected/fbed2eec-1d0d-49aa-89b2-69b961ee76b8-kube-api-access-hfpvr\") pod \"fbed2eec-1d0d-49aa-89b2-69b961ee76b8\" (UID: \"fbed2eec-1d0d-49aa-89b2-69b961ee76b8\") " Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.951644 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pr757\" (UniqueName: \"kubernetes.io/projected/7ecc6d74-0b2e-4265-80ab-1229ce7427d2-kube-api-access-pr757\") pod \"7ecc6d74-0b2e-4265-80ab-1229ce7427d2\" (UID: \"7ecc6d74-0b2e-4265-80ab-1229ce7427d2\") " Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.956139 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f5f87b9-2c24-4a27-9a32-1258486274ac-kube-api-access-7m7vl" (OuterVolumeSpecName: "kube-api-access-7m7vl") pod "9f5f87b9-2c24-4a27-9a32-1258486274ac" (UID: "9f5f87b9-2c24-4a27-9a32-1258486274ac"). InnerVolumeSpecName "kube-api-access-7m7vl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.956786 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ecc6d74-0b2e-4265-80ab-1229ce7427d2-kube-api-access-pr757" (OuterVolumeSpecName: "kube-api-access-pr757") pod "7ecc6d74-0b2e-4265-80ab-1229ce7427d2" (UID: "7ecc6d74-0b2e-4265-80ab-1229ce7427d2"). InnerVolumeSpecName "kube-api-access-pr757". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.962478 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbed2eec-1d0d-49aa-89b2-69b961ee76b8-kube-api-access-hfpvr" (OuterVolumeSpecName: "kube-api-access-hfpvr") pod "fbed2eec-1d0d-49aa-89b2-69b961ee76b8" (UID: "fbed2eec-1d0d-49aa-89b2-69b961ee76b8"). InnerVolumeSpecName "kube-api-access-hfpvr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.996749 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.996990 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="1c34a743-b5df-4b29-847f-521f7086fa81" containerName="glance-log" containerID="cri-o://41a0dd29617e88f9d18a558336ce519f0f858cb475265b82d7475c198ff3b493" gracePeriod=30 Oct 08 21:01:58 crc kubenswrapper[4669]: I1008 21:01:58.997129 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="1c34a743-b5df-4b29-847f-521f7086fa81" containerName="glance-httpd" containerID="cri-o://e2d48d5cc296d13b7707a3140b030582aa094b3374277399c2e138ec3fefa784" gracePeriod=30 Oct 08 21:01:59 crc kubenswrapper[4669]: I1008 21:01:59.053991 4669 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-pr757\" (UniqueName: \"kubernetes.io/projected/7ecc6d74-0b2e-4265-80ab-1229ce7427d2-kube-api-access-pr757\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:59 crc kubenswrapper[4669]: I1008 21:01:59.054034 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7m7vl\" (UniqueName: \"kubernetes.io/projected/9f5f87b9-2c24-4a27-9a32-1258486274ac-kube-api-access-7m7vl\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:59 crc kubenswrapper[4669]: I1008 21:01:59.054046 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfpvr\" (UniqueName: \"kubernetes.io/projected/fbed2eec-1d0d-49aa-89b2-69b961ee76b8-kube-api-access-hfpvr\") on node \"crc\" DevicePath \"\"" Oct 08 21:01:59 crc kubenswrapper[4669]: I1008 21:01:59.259050 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-lg6tp" event={"ID":"9f5f87b9-2c24-4a27-9a32-1258486274ac","Type":"ContainerDied","Data":"0ff11e0fba3a01b3929b5fd24d6fb3aa45b767dcb3594e617b28fec6aa6d9258"} Oct 08 21:01:59 crc kubenswrapper[4669]: I1008 21:01:59.259089 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ff11e0fba3a01b3929b5fd24d6fb3aa45b767dcb3594e617b28fec6aa6d9258" Oct 08 21:01:59 crc kubenswrapper[4669]: I1008 21:01:59.259145 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-lg6tp" Oct 08 21:01:59 crc kubenswrapper[4669]: I1008 21:01:59.279379 4669 generic.go:334] "Generic (PLEG): container finished" podID="1c34a743-b5df-4b29-847f-521f7086fa81" containerID="41a0dd29617e88f9d18a558336ce519f0f858cb475265b82d7475c198ff3b493" exitCode=143 Oct 08 21:01:59 crc kubenswrapper[4669]: I1008 21:01:59.279584 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1c34a743-b5df-4b29-847f-521f7086fa81","Type":"ContainerDied","Data":"41a0dd29617e88f9d18a558336ce519f0f858cb475265b82d7475c198ff3b493"} Oct 08 21:01:59 crc kubenswrapper[4669]: I1008 21:01:59.288141 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-tjtxx" event={"ID":"7ecc6d74-0b2e-4265-80ab-1229ce7427d2","Type":"ContainerDied","Data":"71fd6f54efa66db63e2590d5488bb595a7a1391cce5f5cb01ce70d3277ca290d"} Oct 08 21:01:59 crc kubenswrapper[4669]: I1008 21:01:59.288166 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-tjtxx" Oct 08 21:01:59 crc kubenswrapper[4669]: I1008 21:01:59.288179 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71fd6f54efa66db63e2590d5488bb595a7a1391cce5f5cb01ce70d3277ca290d" Oct 08 21:01:59 crc kubenswrapper[4669]: I1008 21:01:59.289817 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-blx6m" event={"ID":"fbed2eec-1d0d-49aa-89b2-69b961ee76b8","Type":"ContainerDied","Data":"a38365c4697e83b538c7fe835cc4741f3860ba9209e8714a5a37f211f10bf2d9"} Oct 08 21:01:59 crc kubenswrapper[4669]: I1008 21:01:59.289855 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a38365c4697e83b538c7fe835cc4741f3860ba9209e8714a5a37f211f10bf2d9" Oct 08 21:01:59 crc kubenswrapper[4669]: I1008 21:01:59.289908 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-blx6m" Oct 08 21:01:59 crc kubenswrapper[4669]: W1008 21:01:59.334968 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3873e946_b682_46b0_9b31_c34217bed686.slice/crio-a500eeb3d9b0f478c7b7690c958ee599ff384ebbb6b2689aa732ef1433c24388 WatchSource:0}: Error finding container a500eeb3d9b0f478c7b7690c958ee599ff384ebbb6b2689aa732ef1433c24388: Status 404 returned error can't find the container with id a500eeb3d9b0f478c7b7690c958ee599ff384ebbb6b2689aa732ef1433c24388 Oct 08 21:01:59 crc kubenswrapper[4669]: I1008 21:01:59.341055 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="147336fb-662e-4c52-b4ea-d15280858c0b" path="/var/lib/kubelet/pods/147336fb-662e-4c52-b4ea-d15280858c0b/volumes" Oct 08 21:01:59 crc kubenswrapper[4669]: I1008 21:01:59.341905 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a987589-96bc-46ca-b779-71aae81674e7" 
path="/var/lib/kubelet/pods/1a987589-96bc-46ca-b779-71aae81674e7/volumes" Oct 08 21:01:59 crc kubenswrapper[4669]: I1008 21:01:59.342798 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67dd40b0-efa4-47ba-814f-57dcb053a2d9" path="/var/lib/kubelet/pods/67dd40b0-efa4-47ba-814f-57dcb053a2d9/volumes" Oct 08 21:01:59 crc kubenswrapper[4669]: I1008 21:01:59.344391 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 21:01:59 crc kubenswrapper[4669]: I1008 21:01:59.353351 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 21:01:59 crc kubenswrapper[4669]: W1008 21:01:59.356498 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e6cc85e_3a29_4567_96b8_2de40d300ca7.slice/crio-be7ebfb245caf51f91de72cda8b1346e0af7edff4b413265e15d5d4fcb05f766 WatchSource:0}: Error finding container be7ebfb245caf51f91de72cda8b1346e0af7edff4b413265e15d5d4fcb05f766: Status 404 returned error can't find the container with id be7ebfb245caf51f91de72cda8b1346e0af7edff4b413265e15d5d4fcb05f766 Oct 08 21:02:00 crc kubenswrapper[4669]: I1008 21:02:00.307102 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e6cc85e-3a29-4567-96b8-2de40d300ca7","Type":"ContainerStarted","Data":"ef3ebfe38f71ff1958823c5c287522a17052c7de0c0dbb96a55de18b106d969a"} Oct 08 21:02:00 crc kubenswrapper[4669]: I1008 21:02:00.307430 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e6cc85e-3a29-4567-96b8-2de40d300ca7","Type":"ContainerStarted","Data":"be7ebfb245caf51f91de72cda8b1346e0af7edff4b413265e15d5d4fcb05f766"} Oct 08 21:02:00 crc kubenswrapper[4669]: I1008 21:02:00.311409 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"3873e946-b682-46b0-9b31-c34217bed686","Type":"ContainerStarted","Data":"7721ed4350e8f0e273018de183f1f4d733e1a8f2228fbb0d2336416028c03b16"} Oct 08 21:02:00 crc kubenswrapper[4669]: I1008 21:02:00.311462 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3873e946-b682-46b0-9b31-c34217bed686","Type":"ContainerStarted","Data":"a500eeb3d9b0f478c7b7690c958ee599ff384ebbb6b2689aa732ef1433c24388"} Oct 08 21:02:00 crc kubenswrapper[4669]: I1008 21:02:00.583166 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 21:02:00 crc kubenswrapper[4669]: I1008 21:02:00.583613 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c7d7bc9d-0611-4434-86a2-7c39ab8a86ab" containerName="glance-log" containerID="cri-o://bd10f33f850129cd398e2d48ffb557d3d17770c379af7dee219d67831f475921" gracePeriod=30 Oct 08 21:02:00 crc kubenswrapper[4669]: I1008 21:02:00.583752 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c7d7bc9d-0611-4434-86a2-7c39ab8a86ab" containerName="glance-httpd" containerID="cri-o://786145395f9c0a079cd597228d501a90e1877725dffaadb6a783c5e45825e93e" gracePeriod=30 Oct 08 21:02:01 crc kubenswrapper[4669]: I1008 21:02:01.323179 4669 generic.go:334] "Generic (PLEG): container finished" podID="c7d7bc9d-0611-4434-86a2-7c39ab8a86ab" containerID="bd10f33f850129cd398e2d48ffb557d3d17770c379af7dee219d67831f475921" exitCode=143 Oct 08 21:02:01 crc kubenswrapper[4669]: I1008 21:02:01.323289 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c7d7bc9d-0611-4434-86a2-7c39ab8a86ab","Type":"ContainerDied","Data":"bd10f33f850129cd398e2d48ffb557d3d17770c379af7dee219d67831f475921"} Oct 08 21:02:01 crc kubenswrapper[4669]: I1008 21:02:01.325430 4669 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3873e946-b682-46b0-9b31-c34217bed686","Type":"ContainerStarted","Data":"b54b0d82c136068329c83a8e44f4cdc7ec5fdc48b6acbfd1fd187e5798b4cf14"} Oct 08 21:02:01 crc kubenswrapper[4669]: I1008 21:02:01.327607 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e6cc85e-3a29-4567-96b8-2de40d300ca7","Type":"ContainerStarted","Data":"68200dd7ed67f9c07e099797a8e52944bd4a53f72bd1d77f3a7f8f8d4006599c"} Oct 08 21:02:01 crc kubenswrapper[4669]: I1008 21:02:01.361990 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.361970933 podStartE2EDuration="3.361970933s" podCreationTimestamp="2025-10-08 21:01:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:02:01.34885239 +0000 UTC m=+1041.041663073" watchObservedRunningTime="2025-10-08 21:02:01.361970933 +0000 UTC m=+1041.054781606" Oct 08 21:02:01 crc kubenswrapper[4669]: I1008 21:02:01.429428 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 21:02:02 crc kubenswrapper[4669]: I1008 21:02:02.342919 4669 generic.go:334] "Generic (PLEG): container finished" podID="1c34a743-b5df-4b29-847f-521f7086fa81" containerID="e2d48d5cc296d13b7707a3140b030582aa094b3374277399c2e138ec3fefa784" exitCode=0 Oct 08 21:02:02 crc kubenswrapper[4669]: I1008 21:02:02.343011 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1c34a743-b5df-4b29-847f-521f7086fa81","Type":"ContainerDied","Data":"e2d48d5cc296d13b7707a3140b030582aa094b3374277399c2e138ec3fefa784"} Oct 08 21:02:02 crc kubenswrapper[4669]: I1008 21:02:02.352945 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"3e6cc85e-3a29-4567-96b8-2de40d300ca7","Type":"ContainerStarted","Data":"a611dbe99b2b9a1ab05c3f8dbc11695bec31bb872eee5c9ac7e0e03f2a9bf968"} Oct 08 21:02:02 crc kubenswrapper[4669]: I1008 21:02:02.698755 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 21:02:02 crc kubenswrapper[4669]: I1008 21:02:02.831983 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"1c34a743-b5df-4b29-847f-521f7086fa81\" (UID: \"1c34a743-b5df-4b29-847f-521f7086fa81\") " Oct 08 21:02:02 crc kubenswrapper[4669]: I1008 21:02:02.832930 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c34a743-b5df-4b29-847f-521f7086fa81-logs\") pod \"1c34a743-b5df-4b29-847f-521f7086fa81\" (UID: \"1c34a743-b5df-4b29-847f-521f7086fa81\") " Oct 08 21:02:02 crc kubenswrapper[4669]: I1008 21:02:02.833081 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c34a743-b5df-4b29-847f-521f7086fa81-combined-ca-bundle\") pod \"1c34a743-b5df-4b29-847f-521f7086fa81\" (UID: \"1c34a743-b5df-4b29-847f-521f7086fa81\") " Oct 08 21:02:02 crc kubenswrapper[4669]: I1008 21:02:02.833125 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c34a743-b5df-4b29-847f-521f7086fa81-config-data\") pod \"1c34a743-b5df-4b29-847f-521f7086fa81\" (UID: \"1c34a743-b5df-4b29-847f-521f7086fa81\") " Oct 08 21:02:02 crc kubenswrapper[4669]: I1008 21:02:02.833189 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c34a743-b5df-4b29-847f-521f7086fa81-public-tls-certs\") pod 
\"1c34a743-b5df-4b29-847f-521f7086fa81\" (UID: \"1c34a743-b5df-4b29-847f-521f7086fa81\") " Oct 08 21:02:02 crc kubenswrapper[4669]: I1008 21:02:02.833211 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c34a743-b5df-4b29-847f-521f7086fa81-scripts\") pod \"1c34a743-b5df-4b29-847f-521f7086fa81\" (UID: \"1c34a743-b5df-4b29-847f-521f7086fa81\") " Oct 08 21:02:02 crc kubenswrapper[4669]: I1008 21:02:02.833276 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c34a743-b5df-4b29-847f-521f7086fa81-httpd-run\") pod \"1c34a743-b5df-4b29-847f-521f7086fa81\" (UID: \"1c34a743-b5df-4b29-847f-521f7086fa81\") " Oct 08 21:02:02 crc kubenswrapper[4669]: I1008 21:02:02.833283 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c34a743-b5df-4b29-847f-521f7086fa81-logs" (OuterVolumeSpecName: "logs") pod "1c34a743-b5df-4b29-847f-521f7086fa81" (UID: "1c34a743-b5df-4b29-847f-521f7086fa81"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:02:02 crc kubenswrapper[4669]: I1008 21:02:02.833330 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfsdm\" (UniqueName: \"kubernetes.io/projected/1c34a743-b5df-4b29-847f-521f7086fa81-kube-api-access-tfsdm\") pod \"1c34a743-b5df-4b29-847f-521f7086fa81\" (UID: \"1c34a743-b5df-4b29-847f-521f7086fa81\") " Oct 08 21:02:02 crc kubenswrapper[4669]: I1008 21:02:02.833800 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c34a743-b5df-4b29-847f-521f7086fa81-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1c34a743-b5df-4b29-847f-521f7086fa81" (UID: "1c34a743-b5df-4b29-847f-521f7086fa81"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:02:02 crc kubenswrapper[4669]: I1008 21:02:02.833996 4669 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c34a743-b5df-4b29-847f-521f7086fa81-logs\") on node \"crc\" DevicePath \"\"" Oct 08 21:02:02 crc kubenswrapper[4669]: I1008 21:02:02.834033 4669 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c34a743-b5df-4b29-847f-521f7086fa81-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 08 21:02:02 crc kubenswrapper[4669]: I1008 21:02:02.844778 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "1c34a743-b5df-4b29-847f-521f7086fa81" (UID: "1c34a743-b5df-4b29-847f-521f7086fa81"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 08 21:02:02 crc kubenswrapper[4669]: I1008 21:02:02.855727 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c34a743-b5df-4b29-847f-521f7086fa81-kube-api-access-tfsdm" (OuterVolumeSpecName: "kube-api-access-tfsdm") pod "1c34a743-b5df-4b29-847f-521f7086fa81" (UID: "1c34a743-b5df-4b29-847f-521f7086fa81"). InnerVolumeSpecName "kube-api-access-tfsdm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:02:02 crc kubenswrapper[4669]: I1008 21:02:02.858655 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c34a743-b5df-4b29-847f-521f7086fa81-scripts" (OuterVolumeSpecName: "scripts") pod "1c34a743-b5df-4b29-847f-521f7086fa81" (UID: "1c34a743-b5df-4b29-847f-521f7086fa81"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:02:02 crc kubenswrapper[4669]: I1008 21:02:02.878275 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c34a743-b5df-4b29-847f-521f7086fa81-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1c34a743-b5df-4b29-847f-521f7086fa81" (UID: "1c34a743-b5df-4b29-847f-521f7086fa81"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:02:02 crc kubenswrapper[4669]: I1008 21:02:02.925509 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c34a743-b5df-4b29-847f-521f7086fa81-config-data" (OuterVolumeSpecName: "config-data") pod "1c34a743-b5df-4b29-847f-521f7086fa81" (UID: "1c34a743-b5df-4b29-847f-521f7086fa81"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:02:02 crc kubenswrapper[4669]: I1008 21:02:02.928607 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c34a743-b5df-4b29-847f-521f7086fa81-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1c34a743-b5df-4b29-847f-521f7086fa81" (UID: "1c34a743-b5df-4b29-847f-521f7086fa81"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:02:02 crc kubenswrapper[4669]: I1008 21:02:02.935805 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c34a743-b5df-4b29-847f-521f7086fa81-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:02:02 crc kubenswrapper[4669]: I1008 21:02:02.935842 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c34a743-b5df-4b29-847f-521f7086fa81-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 21:02:02 crc kubenswrapper[4669]: I1008 21:02:02.935854 4669 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c34a743-b5df-4b29-847f-521f7086fa81-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 21:02:02 crc kubenswrapper[4669]: I1008 21:02:02.935867 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c34a743-b5df-4b29-847f-521f7086fa81-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 21:02:02 crc kubenswrapper[4669]: I1008 21:02:02.935879 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfsdm\" (UniqueName: \"kubernetes.io/projected/1c34a743-b5df-4b29-847f-521f7086fa81-kube-api-access-tfsdm\") on node \"crc\" DevicePath \"\"" Oct 08 21:02:02 crc kubenswrapper[4669]: I1008 21:02:02.935920 4669 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Oct 08 21:02:02 crc kubenswrapper[4669]: I1008 21:02:02.962678 4669 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Oct 08 21:02:03 crc kubenswrapper[4669]: I1008 21:02:03.037477 4669 reconciler_common.go:293] "Volume detached for volume 
\"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Oct 08 21:02:03 crc kubenswrapper[4669]: I1008 21:02:03.365672 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e6cc85e-3a29-4567-96b8-2de40d300ca7","Type":"ContainerStarted","Data":"c3a915140c1cf43a9b2bb277e52f405b7075617df2c45c7ecc4292749e86a822"} Oct 08 21:02:03 crc kubenswrapper[4669]: I1008 21:02:03.365905 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3e6cc85e-3a29-4567-96b8-2de40d300ca7" containerName="sg-core" containerID="cri-o://a611dbe99b2b9a1ab05c3f8dbc11695bec31bb872eee5c9ac7e0e03f2a9bf968" gracePeriod=30 Oct 08 21:02:03 crc kubenswrapper[4669]: I1008 21:02:03.365841 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3e6cc85e-3a29-4567-96b8-2de40d300ca7" containerName="ceilometer-central-agent" containerID="cri-o://ef3ebfe38f71ff1958823c5c287522a17052c7de0c0dbb96a55de18b106d969a" gracePeriod=30 Oct 08 21:02:03 crc kubenswrapper[4669]: I1008 21:02:03.366010 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3e6cc85e-3a29-4567-96b8-2de40d300ca7" containerName="ceilometer-notification-agent" containerID="cri-o://68200dd7ed67f9c07e099797a8e52944bd4a53f72bd1d77f3a7f8f8d4006599c" gracePeriod=30 Oct 08 21:02:03 crc kubenswrapper[4669]: I1008 21:02:03.365883 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3e6cc85e-3a29-4567-96b8-2de40d300ca7" containerName="proxy-httpd" containerID="cri-o://c3a915140c1cf43a9b2bb277e52f405b7075617df2c45c7ecc4292749e86a822" gracePeriod=30 Oct 08 21:02:03 crc kubenswrapper[4669]: I1008 21:02:03.366085 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 08 21:02:03 
crc kubenswrapper[4669]: I1008 21:02:03.377106 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1c34a743-b5df-4b29-847f-521f7086fa81","Type":"ContainerDied","Data":"9801dac8d0f73897d33b0c2b9f725769f2b58561e7b87c53152914b0524e9bbf"} Oct 08 21:02:03 crc kubenswrapper[4669]: I1008 21:02:03.377155 4669 scope.go:117] "RemoveContainer" containerID="e2d48d5cc296d13b7707a3140b030582aa094b3374277399c2e138ec3fefa784" Oct 08 21:02:03 crc kubenswrapper[4669]: I1008 21:02:03.377303 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 21:02:03 crc kubenswrapper[4669]: I1008 21:02:03.403643 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.6840306790000001 podStartE2EDuration="5.403617713s" podCreationTimestamp="2025-10-08 21:01:58 +0000 UTC" firstStartedPulling="2025-10-08 21:01:59.358510234 +0000 UTC m=+1039.051320907" lastFinishedPulling="2025-10-08 21:02:03.078097268 +0000 UTC m=+1042.770907941" observedRunningTime="2025-10-08 21:02:03.390620893 +0000 UTC m=+1043.083431576" watchObservedRunningTime="2025-10-08 21:02:03.403617713 +0000 UTC m=+1043.096428406" Oct 08 21:02:03 crc kubenswrapper[4669]: I1008 21:02:03.407615 4669 scope.go:117] "RemoveContainer" containerID="41a0dd29617e88f9d18a558336ce519f0f858cb475265b82d7475c198ff3b493" Oct 08 21:02:03 crc kubenswrapper[4669]: I1008 21:02:03.417461 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 21:02:03 crc kubenswrapper[4669]: I1008 21:02:03.428591 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 21:02:03 crc kubenswrapper[4669]: I1008 21:02:03.444463 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 21:02:03 crc kubenswrapper[4669]: 
E1008 21:02:03.444903 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c34a743-b5df-4b29-847f-521f7086fa81" containerName="glance-log" Oct 08 21:02:03 crc kubenswrapper[4669]: I1008 21:02:03.444924 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c34a743-b5df-4b29-847f-521f7086fa81" containerName="glance-log" Oct 08 21:02:03 crc kubenswrapper[4669]: E1008 21:02:03.444942 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ecc6d74-0b2e-4265-80ab-1229ce7427d2" containerName="mariadb-database-create" Oct 08 21:02:03 crc kubenswrapper[4669]: I1008 21:02:03.444952 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ecc6d74-0b2e-4265-80ab-1229ce7427d2" containerName="mariadb-database-create" Oct 08 21:02:03 crc kubenswrapper[4669]: E1008 21:02:03.444979 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c34a743-b5df-4b29-847f-521f7086fa81" containerName="glance-httpd" Oct 08 21:02:03 crc kubenswrapper[4669]: I1008 21:02:03.444987 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c34a743-b5df-4b29-847f-521f7086fa81" containerName="glance-httpd" Oct 08 21:02:03 crc kubenswrapper[4669]: E1008 21:02:03.445011 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbed2eec-1d0d-49aa-89b2-69b961ee76b8" containerName="mariadb-database-create" Oct 08 21:02:03 crc kubenswrapper[4669]: I1008 21:02:03.445019 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbed2eec-1d0d-49aa-89b2-69b961ee76b8" containerName="mariadb-database-create" Oct 08 21:02:03 crc kubenswrapper[4669]: E1008 21:02:03.445036 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f5f87b9-2c24-4a27-9a32-1258486274ac" containerName="mariadb-database-create" Oct 08 21:02:03 crc kubenswrapper[4669]: I1008 21:02:03.445044 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f5f87b9-2c24-4a27-9a32-1258486274ac" containerName="mariadb-database-create" Oct 08 21:02:03 crc 
kubenswrapper[4669]: I1008 21:02:03.445265 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c34a743-b5df-4b29-847f-521f7086fa81" containerName="glance-log" Oct 08 21:02:03 crc kubenswrapper[4669]: I1008 21:02:03.445293 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f5f87b9-2c24-4a27-9a32-1258486274ac" containerName="mariadb-database-create" Oct 08 21:02:03 crc kubenswrapper[4669]: I1008 21:02:03.445308 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c34a743-b5df-4b29-847f-521f7086fa81" containerName="glance-httpd" Oct 08 21:02:03 crc kubenswrapper[4669]: I1008 21:02:03.445328 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ecc6d74-0b2e-4265-80ab-1229ce7427d2" containerName="mariadb-database-create" Oct 08 21:02:03 crc kubenswrapper[4669]: I1008 21:02:03.445342 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbed2eec-1d0d-49aa-89b2-69b961ee76b8" containerName="mariadb-database-create" Oct 08 21:02:03 crc kubenswrapper[4669]: I1008 21:02:03.446453 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 21:02:03 crc kubenswrapper[4669]: I1008 21:02:03.451080 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 08 21:02:03 crc kubenswrapper[4669]: I1008 21:02:03.451392 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 08 21:02:03 crc kubenswrapper[4669]: I1008 21:02:03.453966 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 21:02:03 crc kubenswrapper[4669]: I1008 21:02:03.544210 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b41c8f6-5c23-40d7-9e16-d21a6faf4c0e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8b41c8f6-5c23-40d7-9e16-d21a6faf4c0e\") " pod="openstack/glance-default-external-api-0" Oct 08 21:02:03 crc kubenswrapper[4669]: I1008 21:02:03.544265 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvl72\" (UniqueName: \"kubernetes.io/projected/8b41c8f6-5c23-40d7-9e16-d21a6faf4c0e-kube-api-access-gvl72\") pod \"glance-default-external-api-0\" (UID: \"8b41c8f6-5c23-40d7-9e16-d21a6faf4c0e\") " pod="openstack/glance-default-external-api-0" Oct 08 21:02:03 crc kubenswrapper[4669]: I1008 21:02:03.544341 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b41c8f6-5c23-40d7-9e16-d21a6faf4c0e-scripts\") pod \"glance-default-external-api-0\" (UID: \"8b41c8f6-5c23-40d7-9e16-d21a6faf4c0e\") " pod="openstack/glance-default-external-api-0" Oct 08 21:02:03 crc kubenswrapper[4669]: I1008 21:02:03.544371 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/8b41c8f6-5c23-40d7-9e16-d21a6faf4c0e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8b41c8f6-5c23-40d7-9e16-d21a6faf4c0e\") " pod="openstack/glance-default-external-api-0" Oct 08 21:02:03 crc kubenswrapper[4669]: I1008 21:02:03.544396 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b41c8f6-5c23-40d7-9e16-d21a6faf4c0e-config-data\") pod \"glance-default-external-api-0\" (UID: \"8b41c8f6-5c23-40d7-9e16-d21a6faf4c0e\") " pod="openstack/glance-default-external-api-0" Oct 08 21:02:03 crc kubenswrapper[4669]: I1008 21:02:03.544509 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"8b41c8f6-5c23-40d7-9e16-d21a6faf4c0e\") " pod="openstack/glance-default-external-api-0" Oct 08 21:02:03 crc kubenswrapper[4669]: I1008 21:02:03.544567 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b41c8f6-5c23-40d7-9e16-d21a6faf4c0e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8b41c8f6-5c23-40d7-9e16-d21a6faf4c0e\") " pod="openstack/glance-default-external-api-0" Oct 08 21:02:03 crc kubenswrapper[4669]: I1008 21:02:03.544598 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b41c8f6-5c23-40d7-9e16-d21a6faf4c0e-logs\") pod \"glance-default-external-api-0\" (UID: \"8b41c8f6-5c23-40d7-9e16-d21a6faf4c0e\") " pod="openstack/glance-default-external-api-0" Oct 08 21:02:03 crc kubenswrapper[4669]: I1008 21:02:03.645815 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8b41c8f6-5c23-40d7-9e16-d21a6faf4c0e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8b41c8f6-5c23-40d7-9e16-d21a6faf4c0e\") " pod="openstack/glance-default-external-api-0" Oct 08 21:02:03 crc kubenswrapper[4669]: I1008 21:02:03.645866 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvl72\" (UniqueName: \"kubernetes.io/projected/8b41c8f6-5c23-40d7-9e16-d21a6faf4c0e-kube-api-access-gvl72\") pod \"glance-default-external-api-0\" (UID: \"8b41c8f6-5c23-40d7-9e16-d21a6faf4c0e\") " pod="openstack/glance-default-external-api-0" Oct 08 21:02:03 crc kubenswrapper[4669]: I1008 21:02:03.645911 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b41c8f6-5c23-40d7-9e16-d21a6faf4c0e-scripts\") pod \"glance-default-external-api-0\" (UID: \"8b41c8f6-5c23-40d7-9e16-d21a6faf4c0e\") " pod="openstack/glance-default-external-api-0" Oct 08 21:02:03 crc kubenswrapper[4669]: I1008 21:02:03.645928 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8b41c8f6-5c23-40d7-9e16-d21a6faf4c0e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8b41c8f6-5c23-40d7-9e16-d21a6faf4c0e\") " pod="openstack/glance-default-external-api-0" Oct 08 21:02:03 crc kubenswrapper[4669]: I1008 21:02:03.645948 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b41c8f6-5c23-40d7-9e16-d21a6faf4c0e-config-data\") pod \"glance-default-external-api-0\" (UID: \"8b41c8f6-5c23-40d7-9e16-d21a6faf4c0e\") " pod="openstack/glance-default-external-api-0" Oct 08 21:02:03 crc kubenswrapper[4669]: I1008 21:02:03.646045 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod 
\"glance-default-external-api-0\" (UID: \"8b41c8f6-5c23-40d7-9e16-d21a6faf4c0e\") " pod="openstack/glance-default-external-api-0" Oct 08 21:02:03 crc kubenswrapper[4669]: I1008 21:02:03.646074 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b41c8f6-5c23-40d7-9e16-d21a6faf4c0e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8b41c8f6-5c23-40d7-9e16-d21a6faf4c0e\") " pod="openstack/glance-default-external-api-0" Oct 08 21:02:03 crc kubenswrapper[4669]: I1008 21:02:03.646099 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b41c8f6-5c23-40d7-9e16-d21a6faf4c0e-logs\") pod \"glance-default-external-api-0\" (UID: \"8b41c8f6-5c23-40d7-9e16-d21a6faf4c0e\") " pod="openstack/glance-default-external-api-0" Oct 08 21:02:03 crc kubenswrapper[4669]: I1008 21:02:03.646460 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8b41c8f6-5c23-40d7-9e16-d21a6faf4c0e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8b41c8f6-5c23-40d7-9e16-d21a6faf4c0e\") " pod="openstack/glance-default-external-api-0" Oct 08 21:02:03 crc kubenswrapper[4669]: I1008 21:02:03.646564 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b41c8f6-5c23-40d7-9e16-d21a6faf4c0e-logs\") pod \"glance-default-external-api-0\" (UID: \"8b41c8f6-5c23-40d7-9e16-d21a6faf4c0e\") " pod="openstack/glance-default-external-api-0" Oct 08 21:02:03 crc kubenswrapper[4669]: I1008 21:02:03.647224 4669 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"8b41c8f6-5c23-40d7-9e16-d21a6faf4c0e\") device mount path \"/mnt/openstack/pv05\"" 
pod="openstack/glance-default-external-api-0" Oct 08 21:02:03 crc kubenswrapper[4669]: I1008 21:02:03.653849 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b41c8f6-5c23-40d7-9e16-d21a6faf4c0e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8b41c8f6-5c23-40d7-9e16-d21a6faf4c0e\") " pod="openstack/glance-default-external-api-0" Oct 08 21:02:03 crc kubenswrapper[4669]: I1008 21:02:03.655834 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b41c8f6-5c23-40d7-9e16-d21a6faf4c0e-scripts\") pod \"glance-default-external-api-0\" (UID: \"8b41c8f6-5c23-40d7-9e16-d21a6faf4c0e\") " pod="openstack/glance-default-external-api-0" Oct 08 21:02:03 crc kubenswrapper[4669]: I1008 21:02:03.657135 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b41c8f6-5c23-40d7-9e16-d21a6faf4c0e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8b41c8f6-5c23-40d7-9e16-d21a6faf4c0e\") " pod="openstack/glance-default-external-api-0" Oct 08 21:02:03 crc kubenswrapper[4669]: I1008 21:02:03.660657 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b41c8f6-5c23-40d7-9e16-d21a6faf4c0e-config-data\") pod \"glance-default-external-api-0\" (UID: \"8b41c8f6-5c23-40d7-9e16-d21a6faf4c0e\") " pod="openstack/glance-default-external-api-0" Oct 08 21:02:03 crc kubenswrapper[4669]: I1008 21:02:03.666734 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvl72\" (UniqueName: \"kubernetes.io/projected/8b41c8f6-5c23-40d7-9e16-d21a6faf4c0e-kube-api-access-gvl72\") pod \"glance-default-external-api-0\" (UID: \"8b41c8f6-5c23-40d7-9e16-d21a6faf4c0e\") " pod="openstack/glance-default-external-api-0" Oct 08 21:02:03 crc kubenswrapper[4669]: 
I1008 21:02:03.685802 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"8b41c8f6-5c23-40d7-9e16-d21a6faf4c0e\") " pod="openstack/glance-default-external-api-0" Oct 08 21:02:03 crc kubenswrapper[4669]: I1008 21:02:03.708407 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 08 21:02:03 crc kubenswrapper[4669]: I1008 21:02:03.771270 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.399718 4669 generic.go:334] "Generic (PLEG): container finished" podID="c7d7bc9d-0611-4434-86a2-7c39ab8a86ab" containerID="786145395f9c0a079cd597228d501a90e1877725dffaadb6a783c5e45825e93e" exitCode=0 Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.400096 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c7d7bc9d-0611-4434-86a2-7c39ab8a86ab","Type":"ContainerDied","Data":"786145395f9c0a079cd597228d501a90e1877725dffaadb6a783c5e45825e93e"} Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.414655 4669 generic.go:334] "Generic (PLEG): container finished" podID="3e6cc85e-3a29-4567-96b8-2de40d300ca7" containerID="c3a915140c1cf43a9b2bb277e52f405b7075617df2c45c7ecc4292749e86a822" exitCode=0 Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.414690 4669 generic.go:334] "Generic (PLEG): container finished" podID="3e6cc85e-3a29-4567-96b8-2de40d300ca7" containerID="a611dbe99b2b9a1ab05c3f8dbc11695bec31bb872eee5c9ac7e0e03f2a9bf968" exitCode=2 Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.414699 4669 generic.go:334] "Generic (PLEG): container finished" podID="3e6cc85e-3a29-4567-96b8-2de40d300ca7" 
containerID="68200dd7ed67f9c07e099797a8e52944bd4a53f72bd1d77f3a7f8f8d4006599c" exitCode=0 Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.414709 4669 generic.go:334] "Generic (PLEG): container finished" podID="3e6cc85e-3a29-4567-96b8-2de40d300ca7" containerID="ef3ebfe38f71ff1958823c5c287522a17052c7de0c0dbb96a55de18b106d969a" exitCode=0 Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.414730 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e6cc85e-3a29-4567-96b8-2de40d300ca7","Type":"ContainerDied","Data":"c3a915140c1cf43a9b2bb277e52f405b7075617df2c45c7ecc4292749e86a822"} Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.414759 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e6cc85e-3a29-4567-96b8-2de40d300ca7","Type":"ContainerDied","Data":"a611dbe99b2b9a1ab05c3f8dbc11695bec31bb872eee5c9ac7e0e03f2a9bf968"} Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.414772 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e6cc85e-3a29-4567-96b8-2de40d300ca7","Type":"ContainerDied","Data":"68200dd7ed67f9c07e099797a8e52944bd4a53f72bd1d77f3a7f8f8d4006599c"} Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.414784 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e6cc85e-3a29-4567-96b8-2de40d300ca7","Type":"ContainerDied","Data":"ef3ebfe38f71ff1958823c5c287522a17052c7de0c0dbb96a55de18b106d969a"} Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.414795 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e6cc85e-3a29-4567-96b8-2de40d300ca7","Type":"ContainerDied","Data":"be7ebfb245caf51f91de72cda8b1346e0af7edff4b413265e15d5d4fcb05f766"} Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.414806 4669 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="be7ebfb245caf51f91de72cda8b1346e0af7edff4b413265e15d5d4fcb05f766" Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.417179 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.471919 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.472463 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.569227 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7d7bc9d-0611-4434-86a2-7c39ab8a86ab-combined-ca-bundle\") pod \"c7d7bc9d-0611-4434-86a2-7c39ab8a86ab\" (UID: \"c7d7bc9d-0611-4434-86a2-7c39ab8a86ab\") " Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.569271 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7d7bc9d-0611-4434-86a2-7c39ab8a86ab-logs\") pod \"c7d7bc9d-0611-4434-86a2-7c39ab8a86ab\" (UID: \"c7d7bc9d-0611-4434-86a2-7c39ab8a86ab\") " Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.569294 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e6cc85e-3a29-4567-96b8-2de40d300ca7-log-httpd\") pod \"3e6cc85e-3a29-4567-96b8-2de40d300ca7\" (UID: \"3e6cc85e-3a29-4567-96b8-2de40d300ca7\") " Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.569317 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e6cc85e-3a29-4567-96b8-2de40d300ca7-scripts\") pod \"3e6cc85e-3a29-4567-96b8-2de40d300ca7\" (UID: \"3e6cc85e-3a29-4567-96b8-2de40d300ca7\") " Oct 08 21:02:04 
crc kubenswrapper[4669]: I1008 21:02:04.569364 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7d7bc9d-0611-4434-86a2-7c39ab8a86ab-config-data\") pod \"c7d7bc9d-0611-4434-86a2-7c39ab8a86ab\" (UID: \"c7d7bc9d-0611-4434-86a2-7c39ab8a86ab\") " Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.569385 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3e6cc85e-3a29-4567-96b8-2de40d300ca7-sg-core-conf-yaml\") pod \"3e6cc85e-3a29-4567-96b8-2de40d300ca7\" (UID: \"3e6cc85e-3a29-4567-96b8-2de40d300ca7\") " Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.569406 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e6cc85e-3a29-4567-96b8-2de40d300ca7-combined-ca-bundle\") pod \"3e6cc85e-3a29-4567-96b8-2de40d300ca7\" (UID: \"3e6cc85e-3a29-4567-96b8-2de40d300ca7\") " Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.569424 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdr9w\" (UniqueName: \"kubernetes.io/projected/3e6cc85e-3a29-4567-96b8-2de40d300ca7-kube-api-access-wdr9w\") pod \"3e6cc85e-3a29-4567-96b8-2de40d300ca7\" (UID: \"3e6cc85e-3a29-4567-96b8-2de40d300ca7\") " Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.569480 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"c7d7bc9d-0611-4434-86a2-7c39ab8a86ab\" (UID: \"c7d7bc9d-0611-4434-86a2-7c39ab8a86ab\") " Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.569510 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7d7bc9d-0611-4434-86a2-7c39ab8a86ab-httpd-run\") pod 
\"c7d7bc9d-0611-4434-86a2-7c39ab8a86ab\" (UID: \"c7d7bc9d-0611-4434-86a2-7c39ab8a86ab\") " Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.569624 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e6cc85e-3a29-4567-96b8-2de40d300ca7-config-data\") pod \"3e6cc85e-3a29-4567-96b8-2de40d300ca7\" (UID: \"3e6cc85e-3a29-4567-96b8-2de40d300ca7\") " Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.569662 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e6cc85e-3a29-4567-96b8-2de40d300ca7-run-httpd\") pod \"3e6cc85e-3a29-4567-96b8-2de40d300ca7\" (UID: \"3e6cc85e-3a29-4567-96b8-2de40d300ca7\") " Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.569695 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7d7bc9d-0611-4434-86a2-7c39ab8a86ab-scripts\") pod \"c7d7bc9d-0611-4434-86a2-7c39ab8a86ab\" (UID: \"c7d7bc9d-0611-4434-86a2-7c39ab8a86ab\") " Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.569711 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7d7bc9d-0611-4434-86a2-7c39ab8a86ab-internal-tls-certs\") pod \"c7d7bc9d-0611-4434-86a2-7c39ab8a86ab\" (UID: \"c7d7bc9d-0611-4434-86a2-7c39ab8a86ab\") " Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.569779 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zxmq\" (UniqueName: \"kubernetes.io/projected/c7d7bc9d-0611-4434-86a2-7c39ab8a86ab-kube-api-access-5zxmq\") pod \"c7d7bc9d-0611-4434-86a2-7c39ab8a86ab\" (UID: \"c7d7bc9d-0611-4434-86a2-7c39ab8a86ab\") " Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.573135 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/3e6cc85e-3a29-4567-96b8-2de40d300ca7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3e6cc85e-3a29-4567-96b8-2de40d300ca7" (UID: "3e6cc85e-3a29-4567-96b8-2de40d300ca7"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.573603 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7d7bc9d-0611-4434-86a2-7c39ab8a86ab-logs" (OuterVolumeSpecName: "logs") pod "c7d7bc9d-0611-4434-86a2-7c39ab8a86ab" (UID: "c7d7bc9d-0611-4434-86a2-7c39ab8a86ab"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.573914 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7d7bc9d-0611-4434-86a2-7c39ab8a86ab-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c7d7bc9d-0611-4434-86a2-7c39ab8a86ab" (UID: "c7d7bc9d-0611-4434-86a2-7c39ab8a86ab"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.573935 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e6cc85e-3a29-4567-96b8-2de40d300ca7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3e6cc85e-3a29-4567-96b8-2de40d300ca7" (UID: "3e6cc85e-3a29-4567-96b8-2de40d300ca7"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.592773 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7d7bc9d-0611-4434-86a2-7c39ab8a86ab-kube-api-access-5zxmq" (OuterVolumeSpecName: "kube-api-access-5zxmq") pod "c7d7bc9d-0611-4434-86a2-7c39ab8a86ab" (UID: "c7d7bc9d-0611-4434-86a2-7c39ab8a86ab"). InnerVolumeSpecName "kube-api-access-5zxmq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.592887 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7d7bc9d-0611-4434-86a2-7c39ab8a86ab-scripts" (OuterVolumeSpecName: "scripts") pod "c7d7bc9d-0611-4434-86a2-7c39ab8a86ab" (UID: "c7d7bc9d-0611-4434-86a2-7c39ab8a86ab"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.592944 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e6cc85e-3a29-4567-96b8-2de40d300ca7-kube-api-access-wdr9w" (OuterVolumeSpecName: "kube-api-access-wdr9w") pod "3e6cc85e-3a29-4567-96b8-2de40d300ca7" (UID: "3e6cc85e-3a29-4567-96b8-2de40d300ca7"). InnerVolumeSpecName "kube-api-access-wdr9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.592994 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "c7d7bc9d-0611-4434-86a2-7c39ab8a86ab" (UID: "c7d7bc9d-0611-4434-86a2-7c39ab8a86ab"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.622702 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e6cc85e-3a29-4567-96b8-2de40d300ca7-scripts" (OuterVolumeSpecName: "scripts") pod "3e6cc85e-3a29-4567-96b8-2de40d300ca7" (UID: "3e6cc85e-3a29-4567-96b8-2de40d300ca7"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.641155 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7d7bc9d-0611-4434-86a2-7c39ab8a86ab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c7d7bc9d-0611-4434-86a2-7c39ab8a86ab" (UID: "c7d7bc9d-0611-4434-86a2-7c39ab8a86ab"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.655271 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e6cc85e-3a29-4567-96b8-2de40d300ca7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3e6cc85e-3a29-4567-96b8-2de40d300ca7" (UID: "3e6cc85e-3a29-4567-96b8-2de40d300ca7"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.672847 4669 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.672885 4669 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7d7bc9d-0611-4434-86a2-7c39ab8a86ab-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.672899 4669 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e6cc85e-3a29-4567-96b8-2de40d300ca7-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.672911 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7d7bc9d-0611-4434-86a2-7c39ab8a86ab-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 21:02:04 crc 
kubenswrapper[4669]: I1008 21:02:04.672924 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zxmq\" (UniqueName: \"kubernetes.io/projected/c7d7bc9d-0611-4434-86a2-7c39ab8a86ab-kube-api-access-5zxmq\") on node \"crc\" DevicePath \"\"" Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.672937 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7d7bc9d-0611-4434-86a2-7c39ab8a86ab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.672945 4669 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7d7bc9d-0611-4434-86a2-7c39ab8a86ab-logs\") on node \"crc\" DevicePath \"\"" Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.672953 4669 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e6cc85e-3a29-4567-96b8-2de40d300ca7-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.672962 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e6cc85e-3a29-4567-96b8-2de40d300ca7-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.672972 4669 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3e6cc85e-3a29-4567-96b8-2de40d300ca7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.672983 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdr9w\" (UniqueName: \"kubernetes.io/projected/3e6cc85e-3a29-4567-96b8-2de40d300ca7-kube-api-access-wdr9w\") on node \"crc\" DevicePath \"\"" Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.711020 4669 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" 
(UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.720776 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7d7bc9d-0611-4434-86a2-7c39ab8a86ab-config-data" (OuterVolumeSpecName: "config-data") pod "c7d7bc9d-0611-4434-86a2-7c39ab8a86ab" (UID: "c7d7bc9d-0611-4434-86a2-7c39ab8a86ab"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.723789 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e6cc85e-3a29-4567-96b8-2de40d300ca7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e6cc85e-3a29-4567-96b8-2de40d300ca7" (UID: "3e6cc85e-3a29-4567-96b8-2de40d300ca7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.725297 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7d7bc9d-0611-4434-86a2-7c39ab8a86ab-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c7d7bc9d-0611-4434-86a2-7c39ab8a86ab" (UID: "c7d7bc9d-0611-4434-86a2-7c39ab8a86ab"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.769771 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e6cc85e-3a29-4567-96b8-2de40d300ca7-config-data" (OuterVolumeSpecName: "config-data") pod "3e6cc85e-3a29-4567-96b8-2de40d300ca7" (UID: "3e6cc85e-3a29-4567-96b8-2de40d300ca7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.777186 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e6cc85e-3a29-4567-96b8-2de40d300ca7-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.777228 4669 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7d7bc9d-0611-4434-86a2-7c39ab8a86ab-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.777243 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7d7bc9d-0611-4434-86a2-7c39ab8a86ab-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.777254 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e6cc85e-3a29-4567-96b8-2de40d300ca7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.777266 4669 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Oct 08 21:02:04 crc kubenswrapper[4669]: I1008 21:02:04.834697 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.341948 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c34a743-b5df-4b29-847f-521f7086fa81" path="/var/lib/kubelet/pods/1c34a743-b5df-4b29-847f-521f7086fa81/volumes" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.429147 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"c7d7bc9d-0611-4434-86a2-7c39ab8a86ab","Type":"ContainerDied","Data":"03792e8096b3f386204bfeffdbc75d0b11d19f791dd94fe7fdf6c350c124144e"} Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.429217 4669 scope.go:117] "RemoveContainer" containerID="786145395f9c0a079cd597228d501a90e1877725dffaadb6a783c5e45825e93e" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.429395 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.439957 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.440761 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8b41c8f6-5c23-40d7-9e16-d21a6faf4c0e","Type":"ContainerStarted","Data":"2b9f74b2f97378d588580616ab506b430a5871bd8ef3bf696b9d9f29f0285bc3"} Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.440806 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8b41c8f6-5c23-40d7-9e16-d21a6faf4c0e","Type":"ContainerStarted","Data":"07d53733f18682c8ba4107ce5a84bd0d47f61a00bf3dcd9522e18e9568a305e1"} Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.457781 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.467224 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.483949 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.500408 4669 scope.go:117] "RemoveContainer" containerID="bd10f33f850129cd398e2d48ffb557d3d17770c379af7dee219d67831f475921" Oct 08 21:02:05 
crc kubenswrapper[4669]: I1008 21:02:05.500853 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 21:02:05 crc kubenswrapper[4669]: E1008 21:02:05.501279 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e6cc85e-3a29-4567-96b8-2de40d300ca7" containerName="ceilometer-notification-agent" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.501303 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e6cc85e-3a29-4567-96b8-2de40d300ca7" containerName="ceilometer-notification-agent" Oct 08 21:02:05 crc kubenswrapper[4669]: E1008 21:02:05.501317 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7d7bc9d-0611-4434-86a2-7c39ab8a86ab" containerName="glance-log" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.501328 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7d7bc9d-0611-4434-86a2-7c39ab8a86ab" containerName="glance-log" Oct 08 21:02:05 crc kubenswrapper[4669]: E1008 21:02:05.501378 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7d7bc9d-0611-4434-86a2-7c39ab8a86ab" containerName="glance-httpd" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.501389 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7d7bc9d-0611-4434-86a2-7c39ab8a86ab" containerName="glance-httpd" Oct 08 21:02:05 crc kubenswrapper[4669]: E1008 21:02:05.501399 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e6cc85e-3a29-4567-96b8-2de40d300ca7" containerName="proxy-httpd" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.501409 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e6cc85e-3a29-4567-96b8-2de40d300ca7" containerName="proxy-httpd" Oct 08 21:02:05 crc kubenswrapper[4669]: E1008 21:02:05.501425 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e6cc85e-3a29-4567-96b8-2de40d300ca7" containerName="sg-core" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.501433 
4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e6cc85e-3a29-4567-96b8-2de40d300ca7" containerName="sg-core" Oct 08 21:02:05 crc kubenswrapper[4669]: E1008 21:02:05.501445 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e6cc85e-3a29-4567-96b8-2de40d300ca7" containerName="ceilometer-central-agent" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.501454 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e6cc85e-3a29-4567-96b8-2de40d300ca7" containerName="ceilometer-central-agent" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.501678 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7d7bc9d-0611-4434-86a2-7c39ab8a86ab" containerName="glance-httpd" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.501704 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e6cc85e-3a29-4567-96b8-2de40d300ca7" containerName="sg-core" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.501743 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e6cc85e-3a29-4567-96b8-2de40d300ca7" containerName="ceilometer-notification-agent" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.501762 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7d7bc9d-0611-4434-86a2-7c39ab8a86ab" containerName="glance-log" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.501778 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e6cc85e-3a29-4567-96b8-2de40d300ca7" containerName="proxy-httpd" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.501791 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e6cc85e-3a29-4567-96b8-2de40d300ca7" containerName="ceilometer-central-agent" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.503019 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.509872 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.509968 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.512634 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.525088 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.527850 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.530597 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.533819 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.538210 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.547949 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.602895 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aabd8e9e-a585-4043-87f2-e747b35723b4-scripts\") pod \"ceilometer-0\" (UID: \"aabd8e9e-a585-4043-87f2-e747b35723b4\") " pod="openstack/ceilometer-0" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.602959 4669 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7c63c33c-d91e-49b9-8b85-50960824149b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7c63c33c-d91e-49b9-8b85-50960824149b\") " pod="openstack/glance-default-internal-api-0" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.602984 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c63c33c-d91e-49b9-8b85-50960824149b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7c63c33c-d91e-49b9-8b85-50960824149b\") " pod="openstack/glance-default-internal-api-0" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.603003 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fv7l\" (UniqueName: \"kubernetes.io/projected/7c63c33c-d91e-49b9-8b85-50960824149b-kube-api-access-4fv7l\") pod \"glance-default-internal-api-0\" (UID: \"7c63c33c-d91e-49b9-8b85-50960824149b\") " pod="openstack/glance-default-internal-api-0" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.603068 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c63c33c-d91e-49b9-8b85-50960824149b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7c63c33c-d91e-49b9-8b85-50960824149b\") " pod="openstack/glance-default-internal-api-0" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.603094 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aabd8e9e-a585-4043-87f2-e747b35723b4-run-httpd\") pod \"ceilometer-0\" (UID: \"aabd8e9e-a585-4043-87f2-e747b35723b4\") " pod="openstack/ceilometer-0" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.603116 4669 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgsf6\" (UniqueName: \"kubernetes.io/projected/aabd8e9e-a585-4043-87f2-e747b35723b4-kube-api-access-kgsf6\") pod \"ceilometer-0\" (UID: \"aabd8e9e-a585-4043-87f2-e747b35723b4\") " pod="openstack/ceilometer-0" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.603135 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c63c33c-d91e-49b9-8b85-50960824149b-logs\") pod \"glance-default-internal-api-0\" (UID: \"7c63c33c-d91e-49b9-8b85-50960824149b\") " pod="openstack/glance-default-internal-api-0" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.603150 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c63c33c-d91e-49b9-8b85-50960824149b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7c63c33c-d91e-49b9-8b85-50960824149b\") " pod="openstack/glance-default-internal-api-0" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.603223 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"7c63c33c-d91e-49b9-8b85-50960824149b\") " pod="openstack/glance-default-internal-api-0" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.603246 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aabd8e9e-a585-4043-87f2-e747b35723b4-log-httpd\") pod \"ceilometer-0\" (UID: \"aabd8e9e-a585-4043-87f2-e747b35723b4\") " pod="openstack/ceilometer-0" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.603265 4669 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aabd8e9e-a585-4043-87f2-e747b35723b4-config-data\") pod \"ceilometer-0\" (UID: \"aabd8e9e-a585-4043-87f2-e747b35723b4\") " pod="openstack/ceilometer-0" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.603282 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aabd8e9e-a585-4043-87f2-e747b35723b4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aabd8e9e-a585-4043-87f2-e747b35723b4\") " pod="openstack/ceilometer-0" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.603302 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c63c33c-d91e-49b9-8b85-50960824149b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7c63c33c-d91e-49b9-8b85-50960824149b\") " pod="openstack/glance-default-internal-api-0" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.603335 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aabd8e9e-a585-4043-87f2-e747b35723b4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aabd8e9e-a585-4043-87f2-e747b35723b4\") " pod="openstack/ceilometer-0" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.704578 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c63c33c-d91e-49b9-8b85-50960824149b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7c63c33c-d91e-49b9-8b85-50960824149b\") " pod="openstack/glance-default-internal-api-0" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.704639 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/aabd8e9e-a585-4043-87f2-e747b35723b4-run-httpd\") pod \"ceilometer-0\" (UID: \"aabd8e9e-a585-4043-87f2-e747b35723b4\") " pod="openstack/ceilometer-0" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.704665 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgsf6\" (UniqueName: \"kubernetes.io/projected/aabd8e9e-a585-4043-87f2-e747b35723b4-kube-api-access-kgsf6\") pod \"ceilometer-0\" (UID: \"aabd8e9e-a585-4043-87f2-e747b35723b4\") " pod="openstack/ceilometer-0" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.704681 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c63c33c-d91e-49b9-8b85-50960824149b-logs\") pod \"glance-default-internal-api-0\" (UID: \"7c63c33c-d91e-49b9-8b85-50960824149b\") " pod="openstack/glance-default-internal-api-0" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.704697 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c63c33c-d91e-49b9-8b85-50960824149b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7c63c33c-d91e-49b9-8b85-50960824149b\") " pod="openstack/glance-default-internal-api-0" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.704717 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"7c63c33c-d91e-49b9-8b85-50960824149b\") " pod="openstack/glance-default-internal-api-0" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.704738 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aabd8e9e-a585-4043-87f2-e747b35723b4-log-httpd\") pod \"ceilometer-0\" (UID: \"aabd8e9e-a585-4043-87f2-e747b35723b4\") " 
pod="openstack/ceilometer-0" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.704758 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aabd8e9e-a585-4043-87f2-e747b35723b4-config-data\") pod \"ceilometer-0\" (UID: \"aabd8e9e-a585-4043-87f2-e747b35723b4\") " pod="openstack/ceilometer-0" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.704775 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aabd8e9e-a585-4043-87f2-e747b35723b4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aabd8e9e-a585-4043-87f2-e747b35723b4\") " pod="openstack/ceilometer-0" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.704794 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c63c33c-d91e-49b9-8b85-50960824149b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7c63c33c-d91e-49b9-8b85-50960824149b\") " pod="openstack/glance-default-internal-api-0" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.704826 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aabd8e9e-a585-4043-87f2-e747b35723b4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aabd8e9e-a585-4043-87f2-e747b35723b4\") " pod="openstack/ceilometer-0" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.704850 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aabd8e9e-a585-4043-87f2-e747b35723b4-scripts\") pod \"ceilometer-0\" (UID: \"aabd8e9e-a585-4043-87f2-e747b35723b4\") " pod="openstack/ceilometer-0" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.704886 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/7c63c33c-d91e-49b9-8b85-50960824149b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7c63c33c-d91e-49b9-8b85-50960824149b\") " pod="openstack/glance-default-internal-api-0" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.704906 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c63c33c-d91e-49b9-8b85-50960824149b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7c63c33c-d91e-49b9-8b85-50960824149b\") " pod="openstack/glance-default-internal-api-0" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.704923 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fv7l\" (UniqueName: \"kubernetes.io/projected/7c63c33c-d91e-49b9-8b85-50960824149b-kube-api-access-4fv7l\") pod \"glance-default-internal-api-0\" (UID: \"7c63c33c-d91e-49b9-8b85-50960824149b\") " pod="openstack/glance-default-internal-api-0" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.706433 4669 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"7c63c33c-d91e-49b9-8b85-50960824149b\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.706471 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7c63c33c-d91e-49b9-8b85-50960824149b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7c63c33c-d91e-49b9-8b85-50960824149b\") " pod="openstack/glance-default-internal-api-0" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.706766 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/7c63c33c-d91e-49b9-8b85-50960824149b-logs\") pod \"glance-default-internal-api-0\" (UID: \"7c63c33c-d91e-49b9-8b85-50960824149b\") " pod="openstack/glance-default-internal-api-0" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.707464 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aabd8e9e-a585-4043-87f2-e747b35723b4-run-httpd\") pod \"ceilometer-0\" (UID: \"aabd8e9e-a585-4043-87f2-e747b35723b4\") " pod="openstack/ceilometer-0" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.716318 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aabd8e9e-a585-4043-87f2-e747b35723b4-scripts\") pod \"ceilometer-0\" (UID: \"aabd8e9e-a585-4043-87f2-e747b35723b4\") " pod="openstack/ceilometer-0" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.718081 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c63c33c-d91e-49b9-8b85-50960824149b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7c63c33c-d91e-49b9-8b85-50960824149b\") " pod="openstack/glance-default-internal-api-0" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.718738 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aabd8e9e-a585-4043-87f2-e747b35723b4-log-httpd\") pod \"ceilometer-0\" (UID: \"aabd8e9e-a585-4043-87f2-e747b35723b4\") " pod="openstack/ceilometer-0" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.719332 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c63c33c-d91e-49b9-8b85-50960824149b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7c63c33c-d91e-49b9-8b85-50960824149b\") " pod="openstack/glance-default-internal-api-0" Oct 08 21:02:05 crc 
kubenswrapper[4669]: I1008 21:02:05.720288 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aabd8e9e-a585-4043-87f2-e747b35723b4-config-data\") pod \"ceilometer-0\" (UID: \"aabd8e9e-a585-4043-87f2-e747b35723b4\") " pod="openstack/ceilometer-0" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.720375 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c63c33c-d91e-49b9-8b85-50960824149b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7c63c33c-d91e-49b9-8b85-50960824149b\") " pod="openstack/glance-default-internal-api-0" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.730994 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c63c33c-d91e-49b9-8b85-50960824149b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7c63c33c-d91e-49b9-8b85-50960824149b\") " pod="openstack/glance-default-internal-api-0" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.733259 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aabd8e9e-a585-4043-87f2-e747b35723b4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aabd8e9e-a585-4043-87f2-e747b35723b4\") " pod="openstack/ceilometer-0" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.737748 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fv7l\" (UniqueName: \"kubernetes.io/projected/7c63c33c-d91e-49b9-8b85-50960824149b-kube-api-access-4fv7l\") pod \"glance-default-internal-api-0\" (UID: \"7c63c33c-d91e-49b9-8b85-50960824149b\") " pod="openstack/glance-default-internal-api-0" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.746786 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/aabd8e9e-a585-4043-87f2-e747b35723b4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aabd8e9e-a585-4043-87f2-e747b35723b4\") " pod="openstack/ceilometer-0" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.752857 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgsf6\" (UniqueName: \"kubernetes.io/projected/aabd8e9e-a585-4043-87f2-e747b35723b4-kube-api-access-kgsf6\") pod \"ceilometer-0\" (UID: \"aabd8e9e-a585-4043-87f2-e747b35723b4\") " pod="openstack/ceilometer-0" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.766799 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"7c63c33c-d91e-49b9-8b85-50960824149b\") " pod="openstack/glance-default-internal-api-0" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.835490 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 21:02:05 crc kubenswrapper[4669]: I1008 21:02:05.858496 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 21:02:06 crc kubenswrapper[4669]: I1008 21:02:06.350854 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 21:02:06 crc kubenswrapper[4669]: W1008 21:02:06.437393 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c63c33c_d91e_49b9_8b85_50960824149b.slice/crio-97506819b5c2960d503a80676d1763f553d3e4ba9d3ffeee5dbb39db67277f8d WatchSource:0}: Error finding container 97506819b5c2960d503a80676d1763f553d3e4ba9d3ffeee5dbb39db67277f8d: Status 404 returned error can't find the container with id 97506819b5c2960d503a80676d1763f553d3e4ba9d3ffeee5dbb39db67277f8d Oct 08 21:02:06 crc kubenswrapper[4669]: I1008 21:02:06.440767 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 21:02:06 crc kubenswrapper[4669]: I1008 21:02:06.452525 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7c63c33c-d91e-49b9-8b85-50960824149b","Type":"ContainerStarted","Data":"97506819b5c2960d503a80676d1763f553d3e4ba9d3ffeee5dbb39db67277f8d"} Oct 08 21:02:06 crc kubenswrapper[4669]: I1008 21:02:06.454559 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aabd8e9e-a585-4043-87f2-e747b35723b4","Type":"ContainerStarted","Data":"003f6da8fd1756fbbedc0f9b5559671e4fbba203fdc46df25cae3b4a05bed05d"} Oct 08 21:02:06 crc kubenswrapper[4669]: I1008 21:02:06.456838 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8b41c8f6-5c23-40d7-9e16-d21a6faf4c0e","Type":"ContainerStarted","Data":"e4b71e04240f90e1efaf3d38195469e17a683bdefe99883aab51afed57ed73f6"} Oct 08 21:02:06 crc kubenswrapper[4669]: I1008 21:02:06.489741 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/glance-default-external-api-0" podStartSLOduration=3.489720203 podStartE2EDuration="3.489720203s" podCreationTimestamp="2025-10-08 21:02:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:02:06.480123407 +0000 UTC m=+1046.172934080" watchObservedRunningTime="2025-10-08 21:02:06.489720203 +0000 UTC m=+1046.182530876" Oct 08 21:02:07 crc kubenswrapper[4669]: I1008 21:02:07.344320 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e6cc85e-3a29-4567-96b8-2de40d300ca7" path="/var/lib/kubelet/pods/3e6cc85e-3a29-4567-96b8-2de40d300ca7/volumes" Oct 08 21:02:07 crc kubenswrapper[4669]: I1008 21:02:07.345727 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7d7bc9d-0611-4434-86a2-7c39ab8a86ab" path="/var/lib/kubelet/pods/c7d7bc9d-0611-4434-86a2-7c39ab8a86ab/volumes" Oct 08 21:02:07 crc kubenswrapper[4669]: I1008 21:02:07.466006 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7c63c33c-d91e-49b9-8b85-50960824149b","Type":"ContainerStarted","Data":"775a85ffc9b286dd26aad364bbebf1f441b3c7b5c362a37aee635db866c1d36c"} Oct 08 21:02:07 crc kubenswrapper[4669]: I1008 21:02:07.469230 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aabd8e9e-a585-4043-87f2-e747b35723b4","Type":"ContainerStarted","Data":"6e50bbbac38826a7aee43f2bb616f0d2a8c1cb84be24dc63817dbadf45863206"} Oct 08 21:02:08 crc kubenswrapper[4669]: I1008 21:02:08.483414 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7c63c33c-d91e-49b9-8b85-50960824149b","Type":"ContainerStarted","Data":"511c0f025870103f1170af920c71930855918aca030f4f0761f91ea3cd228e71"} Oct 08 21:02:08 crc kubenswrapper[4669]: I1008 21:02:08.489796 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"aabd8e9e-a585-4043-87f2-e747b35723b4","Type":"ContainerStarted","Data":"a73f2fffe99134b40960b10ee2b822283db8da803022e690feeeefc274dc11d8"} Oct 08 21:02:08 crc kubenswrapper[4669]: I1008 21:02:08.519702 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.519684378 podStartE2EDuration="3.519684378s" podCreationTimestamp="2025-10-08 21:02:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:02:08.505005071 +0000 UTC m=+1048.197815744" watchObservedRunningTime="2025-10-08 21:02:08.519684378 +0000 UTC m=+1048.212495051" Oct 08 21:02:08 crc kubenswrapper[4669]: I1008 21:02:08.906098 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 08 21:02:08 crc kubenswrapper[4669]: E1008 21:02:08.974636 4669 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14b56cd8_5692_4a65_b8ad_1e39bf253846.slice/crio-c7d1cc9386ab3b92a4931c13ef8cae530e99ac0462fd182d311d2bf28a18131a\": RecentStats: unable to find data in memory cache]" Oct 08 21:02:09 crc kubenswrapper[4669]: I1008 21:02:09.504817 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aabd8e9e-a585-4043-87f2-e747b35723b4","Type":"ContainerStarted","Data":"966814740aebb7b366b3fe5cf9aee1ff465356457aec64c6047dc3d152daca59"} Oct 08 21:02:10 crc kubenswrapper[4669]: I1008 21:02:10.515880 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aabd8e9e-a585-4043-87f2-e747b35723b4","Type":"ContainerStarted","Data":"d54d1bda4929d0a2dca119d4bdbd776c17134257d63ed9cfe4e4716e52d654d8"} Oct 08 21:02:10 crc kubenswrapper[4669]: I1008 21:02:10.516183 4669 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 08 21:02:10 crc kubenswrapper[4669]: I1008 21:02:10.541403 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.9488770469999999 podStartE2EDuration="5.541371684s" podCreationTimestamp="2025-10-08 21:02:05 +0000 UTC" firstStartedPulling="2025-10-08 21:02:06.35992335 +0000 UTC m=+1046.052734023" lastFinishedPulling="2025-10-08 21:02:09.952417977 +0000 UTC m=+1049.645228660" observedRunningTime="2025-10-08 21:02:10.539637546 +0000 UTC m=+1050.232448229" watchObservedRunningTime="2025-10-08 21:02:10.541371684 +0000 UTC m=+1050.234182407" Oct 08 21:02:13 crc kubenswrapper[4669]: I1008 21:02:13.186030 4669 patch_prober.go:28] interesting pod/machine-config-daemon-hw2kf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 21:02:13 crc kubenswrapper[4669]: I1008 21:02:13.186409 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 21:02:13 crc kubenswrapper[4669]: I1008 21:02:13.772318 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 08 21:02:13 crc kubenswrapper[4669]: I1008 21:02:13.772377 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 08 21:02:13 crc kubenswrapper[4669]: I1008 21:02:13.801543 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/glance-default-external-api-0" Oct 08 21:02:13 crc kubenswrapper[4669]: I1008 21:02:13.822943 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 08 21:02:14 crc kubenswrapper[4669]: I1008 21:02:14.293921 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-9048-account-create-lpfml"] Oct 08 21:02:14 crc kubenswrapper[4669]: I1008 21:02:14.295582 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-9048-account-create-lpfml" Oct 08 21:02:14 crc kubenswrapper[4669]: I1008 21:02:14.297893 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 08 21:02:14 crc kubenswrapper[4669]: I1008 21:02:14.307624 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-9048-account-create-lpfml"] Oct 08 21:02:14 crc kubenswrapper[4669]: I1008 21:02:14.475853 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq2v6\" (UniqueName: \"kubernetes.io/projected/3975f8d5-05d5-43bf-a697-7592cae00f76-kube-api-access-lq2v6\") pod \"nova-api-9048-account-create-lpfml\" (UID: \"3975f8d5-05d5-43bf-a697-7592cae00f76\") " pod="openstack/nova-api-9048-account-create-lpfml" Oct 08 21:02:14 crc kubenswrapper[4669]: I1008 21:02:14.491771 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-8c82-account-create-l8gj7"] Oct 08 21:02:14 crc kubenswrapper[4669]: I1008 21:02:14.492888 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-8c82-account-create-l8gj7" Oct 08 21:02:14 crc kubenswrapper[4669]: I1008 21:02:14.500745 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 08 21:02:14 crc kubenswrapper[4669]: I1008 21:02:14.515055 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-8c82-account-create-l8gj7"] Oct 08 21:02:14 crc kubenswrapper[4669]: I1008 21:02:14.553489 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 08 21:02:14 crc kubenswrapper[4669]: I1008 21:02:14.553565 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 08 21:02:14 crc kubenswrapper[4669]: I1008 21:02:14.577642 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq2v6\" (UniqueName: \"kubernetes.io/projected/3975f8d5-05d5-43bf-a697-7592cae00f76-kube-api-access-lq2v6\") pod \"nova-api-9048-account-create-lpfml\" (UID: \"3975f8d5-05d5-43bf-a697-7592cae00f76\") " pod="openstack/nova-api-9048-account-create-lpfml" Oct 08 21:02:14 crc kubenswrapper[4669]: I1008 21:02:14.608884 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq2v6\" (UniqueName: \"kubernetes.io/projected/3975f8d5-05d5-43bf-a697-7592cae00f76-kube-api-access-lq2v6\") pod \"nova-api-9048-account-create-lpfml\" (UID: \"3975f8d5-05d5-43bf-a697-7592cae00f76\") " pod="openstack/nova-api-9048-account-create-lpfml" Oct 08 21:02:14 crc kubenswrapper[4669]: I1008 21:02:14.616409 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-9048-account-create-lpfml" Oct 08 21:02:14 crc kubenswrapper[4669]: I1008 21:02:14.679227 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xh7sc\" (UniqueName: \"kubernetes.io/projected/d2e0af6a-a86e-4f7f-b00a-74f84b2eabae-kube-api-access-xh7sc\") pod \"nova-cell0-8c82-account-create-l8gj7\" (UID: \"d2e0af6a-a86e-4f7f-b00a-74f84b2eabae\") " pod="openstack/nova-cell0-8c82-account-create-l8gj7" Oct 08 21:02:14 crc kubenswrapper[4669]: I1008 21:02:14.700782 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-eed3-account-create-cxmm8"] Oct 08 21:02:14 crc kubenswrapper[4669]: I1008 21:02:14.702122 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-eed3-account-create-cxmm8" Oct 08 21:02:14 crc kubenswrapper[4669]: I1008 21:02:14.704372 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 08 21:02:14 crc kubenswrapper[4669]: I1008 21:02:14.711274 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-eed3-account-create-cxmm8"] Oct 08 21:02:14 crc kubenswrapper[4669]: I1008 21:02:14.781463 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xh7sc\" (UniqueName: \"kubernetes.io/projected/d2e0af6a-a86e-4f7f-b00a-74f84b2eabae-kube-api-access-xh7sc\") pod \"nova-cell0-8c82-account-create-l8gj7\" (UID: \"d2e0af6a-a86e-4f7f-b00a-74f84b2eabae\") " pod="openstack/nova-cell0-8c82-account-create-l8gj7" Oct 08 21:02:14 crc kubenswrapper[4669]: I1008 21:02:14.797891 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xh7sc\" (UniqueName: \"kubernetes.io/projected/d2e0af6a-a86e-4f7f-b00a-74f84b2eabae-kube-api-access-xh7sc\") pod \"nova-cell0-8c82-account-create-l8gj7\" (UID: \"d2e0af6a-a86e-4f7f-b00a-74f84b2eabae\") " 
pod="openstack/nova-cell0-8c82-account-create-l8gj7" Oct 08 21:02:14 crc kubenswrapper[4669]: I1008 21:02:14.810748 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8c82-account-create-l8gj7" Oct 08 21:02:14 crc kubenswrapper[4669]: I1008 21:02:14.883170 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnz9n\" (UniqueName: \"kubernetes.io/projected/600b70ca-92ae-481c-96a7-e1ad051b1a1a-kube-api-access-fnz9n\") pod \"nova-cell1-eed3-account-create-cxmm8\" (UID: \"600b70ca-92ae-481c-96a7-e1ad051b1a1a\") " pod="openstack/nova-cell1-eed3-account-create-cxmm8" Oct 08 21:02:14 crc kubenswrapper[4669]: I1008 21:02:14.984862 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnz9n\" (UniqueName: \"kubernetes.io/projected/600b70ca-92ae-481c-96a7-e1ad051b1a1a-kube-api-access-fnz9n\") pod \"nova-cell1-eed3-account-create-cxmm8\" (UID: \"600b70ca-92ae-481c-96a7-e1ad051b1a1a\") " pod="openstack/nova-cell1-eed3-account-create-cxmm8" Oct 08 21:02:15 crc kubenswrapper[4669]: I1008 21:02:15.006160 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnz9n\" (UniqueName: \"kubernetes.io/projected/600b70ca-92ae-481c-96a7-e1ad051b1a1a-kube-api-access-fnz9n\") pod \"nova-cell1-eed3-account-create-cxmm8\" (UID: \"600b70ca-92ae-481c-96a7-e1ad051b1a1a\") " pod="openstack/nova-cell1-eed3-account-create-cxmm8" Oct 08 21:02:15 crc kubenswrapper[4669]: I1008 21:02:15.020660 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-eed3-account-create-cxmm8" Oct 08 21:02:15 crc kubenswrapper[4669]: I1008 21:02:15.160952 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-9048-account-create-lpfml"] Oct 08 21:02:15 crc kubenswrapper[4669]: W1008 21:02:15.177290 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3975f8d5_05d5_43bf_a697_7592cae00f76.slice/crio-2f6deccb6cd8ed0a5d8e685a0f89c90deac7f25898ee8fa279129f2dbd641c0a WatchSource:0}: Error finding container 2f6deccb6cd8ed0a5d8e685a0f89c90deac7f25898ee8fa279129f2dbd641c0a: Status 404 returned error can't find the container with id 2f6deccb6cd8ed0a5d8e685a0f89c90deac7f25898ee8fa279129f2dbd641c0a Oct 08 21:02:15 crc kubenswrapper[4669]: I1008 21:02:15.280487 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-8c82-account-create-l8gj7"] Oct 08 21:02:15 crc kubenswrapper[4669]: I1008 21:02:15.509857 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-eed3-account-create-cxmm8"] Oct 08 21:02:15 crc kubenswrapper[4669]: I1008 21:02:15.565866 4669 generic.go:334] "Generic (PLEG): container finished" podID="3975f8d5-05d5-43bf-a697-7592cae00f76" containerID="c5f8501003b90fb31b4d9162158d1b12d19c733da1a1439711057c3d576096f0" exitCode=0 Oct 08 21:02:15 crc kubenswrapper[4669]: I1008 21:02:15.565954 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-9048-account-create-lpfml" event={"ID":"3975f8d5-05d5-43bf-a697-7592cae00f76","Type":"ContainerDied","Data":"c5f8501003b90fb31b4d9162158d1b12d19c733da1a1439711057c3d576096f0"} Oct 08 21:02:15 crc kubenswrapper[4669]: I1008 21:02:15.566022 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-9048-account-create-lpfml" 
event={"ID":"3975f8d5-05d5-43bf-a697-7592cae00f76","Type":"ContainerStarted","Data":"2f6deccb6cd8ed0a5d8e685a0f89c90deac7f25898ee8fa279129f2dbd641c0a"} Oct 08 21:02:15 crc kubenswrapper[4669]: W1008 21:02:15.567685 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod600b70ca_92ae_481c_96a7_e1ad051b1a1a.slice/crio-41f782a39cc5a3ebe981e79cf60c045e76906d2283200d5352542f530a786719 WatchSource:0}: Error finding container 41f782a39cc5a3ebe981e79cf60c045e76906d2283200d5352542f530a786719: Status 404 returned error can't find the container with id 41f782a39cc5a3ebe981e79cf60c045e76906d2283200d5352542f530a786719 Oct 08 21:02:15 crc kubenswrapper[4669]: I1008 21:02:15.572062 4669 generic.go:334] "Generic (PLEG): container finished" podID="d2e0af6a-a86e-4f7f-b00a-74f84b2eabae" containerID="edca01ffd40f9f0e1d482317359c5dd3913d1fd06fe915116a3297756d7edac1" exitCode=0 Oct 08 21:02:15 crc kubenswrapper[4669]: I1008 21:02:15.572141 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8c82-account-create-l8gj7" event={"ID":"d2e0af6a-a86e-4f7f-b00a-74f84b2eabae","Type":"ContainerDied","Data":"edca01ffd40f9f0e1d482317359c5dd3913d1fd06fe915116a3297756d7edac1"} Oct 08 21:02:15 crc kubenswrapper[4669]: I1008 21:02:15.572180 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8c82-account-create-l8gj7" event={"ID":"d2e0af6a-a86e-4f7f-b00a-74f84b2eabae","Type":"ContainerStarted","Data":"fe1272331f0b59679e4691e2a12e67db7cfc73715f59b317231e168254eaf03a"} Oct 08 21:02:15 crc kubenswrapper[4669]: I1008 21:02:15.836328 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 08 21:02:15 crc kubenswrapper[4669]: I1008 21:02:15.836590 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 08 21:02:15 crc 
kubenswrapper[4669]: I1008 21:02:15.868858 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 08 21:02:15 crc kubenswrapper[4669]: I1008 21:02:15.883217 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 08 21:02:16 crc kubenswrapper[4669]: I1008 21:02:16.445019 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 08 21:02:16 crc kubenswrapper[4669]: I1008 21:02:16.582656 4669 generic.go:334] "Generic (PLEG): container finished" podID="600b70ca-92ae-481c-96a7-e1ad051b1a1a" containerID="619fbe96a4ba98a3d3b55c3917a60b81c986aa0d85cbe55de0dcccf3ed1aafe6" exitCode=0 Oct 08 21:02:16 crc kubenswrapper[4669]: I1008 21:02:16.582786 4669 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 08 21:02:16 crc kubenswrapper[4669]: I1008 21:02:16.583009 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-eed3-account-create-cxmm8" event={"ID":"600b70ca-92ae-481c-96a7-e1ad051b1a1a","Type":"ContainerDied","Data":"619fbe96a4ba98a3d3b55c3917a60b81c986aa0d85cbe55de0dcccf3ed1aafe6"} Oct 08 21:02:16 crc kubenswrapper[4669]: I1008 21:02:16.583060 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-eed3-account-create-cxmm8" event={"ID":"600b70ca-92ae-481c-96a7-e1ad051b1a1a","Type":"ContainerStarted","Data":"41f782a39cc5a3ebe981e79cf60c045e76906d2283200d5352542f530a786719"} Oct 08 21:02:16 crc kubenswrapper[4669]: I1008 21:02:16.583769 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 08 21:02:16 crc kubenswrapper[4669]: I1008 21:02:16.583797 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 08 21:02:16 crc kubenswrapper[4669]: I1008 21:02:16.593572 4669 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 08 21:02:17 crc kubenswrapper[4669]: I1008 21:02:17.032722 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-9048-account-create-lpfml" Oct 08 21:02:17 crc kubenswrapper[4669]: I1008 21:02:17.120086 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lq2v6\" (UniqueName: \"kubernetes.io/projected/3975f8d5-05d5-43bf-a697-7592cae00f76-kube-api-access-lq2v6\") pod \"3975f8d5-05d5-43bf-a697-7592cae00f76\" (UID: \"3975f8d5-05d5-43bf-a697-7592cae00f76\") " Oct 08 21:02:17 crc kubenswrapper[4669]: I1008 21:02:17.131733 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3975f8d5-05d5-43bf-a697-7592cae00f76-kube-api-access-lq2v6" (OuterVolumeSpecName: "kube-api-access-lq2v6") pod "3975f8d5-05d5-43bf-a697-7592cae00f76" (UID: "3975f8d5-05d5-43bf-a697-7592cae00f76"). InnerVolumeSpecName "kube-api-access-lq2v6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:02:17 crc kubenswrapper[4669]: I1008 21:02:17.187226 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-8c82-account-create-l8gj7" Oct 08 21:02:17 crc kubenswrapper[4669]: I1008 21:02:17.222871 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lq2v6\" (UniqueName: \"kubernetes.io/projected/3975f8d5-05d5-43bf-a697-7592cae00f76-kube-api-access-lq2v6\") on node \"crc\" DevicePath \"\"" Oct 08 21:02:17 crc kubenswrapper[4669]: I1008 21:02:17.324127 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xh7sc\" (UniqueName: \"kubernetes.io/projected/d2e0af6a-a86e-4f7f-b00a-74f84b2eabae-kube-api-access-xh7sc\") pod \"d2e0af6a-a86e-4f7f-b00a-74f84b2eabae\" (UID: \"d2e0af6a-a86e-4f7f-b00a-74f84b2eabae\") " Oct 08 21:02:17 crc kubenswrapper[4669]: I1008 21:02:17.329751 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2e0af6a-a86e-4f7f-b00a-74f84b2eabae-kube-api-access-xh7sc" (OuterVolumeSpecName: "kube-api-access-xh7sc") pod "d2e0af6a-a86e-4f7f-b00a-74f84b2eabae" (UID: "d2e0af6a-a86e-4f7f-b00a-74f84b2eabae"). InnerVolumeSpecName "kube-api-access-xh7sc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:02:17 crc kubenswrapper[4669]: I1008 21:02:17.426487 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xh7sc\" (UniqueName: \"kubernetes.io/projected/d2e0af6a-a86e-4f7f-b00a-74f84b2eabae-kube-api-access-xh7sc\") on node \"crc\" DevicePath \"\"" Oct 08 21:02:17 crc kubenswrapper[4669]: I1008 21:02:17.555220 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 21:02:17 crc kubenswrapper[4669]: I1008 21:02:17.555610 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="aabd8e9e-a585-4043-87f2-e747b35723b4" containerName="ceilometer-central-agent" containerID="cri-o://6e50bbbac38826a7aee43f2bb616f0d2a8c1cb84be24dc63817dbadf45863206" gracePeriod=30 Oct 08 21:02:17 crc kubenswrapper[4669]: I1008 21:02:17.555698 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="aabd8e9e-a585-4043-87f2-e747b35723b4" containerName="ceilometer-notification-agent" containerID="cri-o://a73f2fffe99134b40960b10ee2b822283db8da803022e690feeeefc274dc11d8" gracePeriod=30 Oct 08 21:02:17 crc kubenswrapper[4669]: I1008 21:02:17.555693 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="aabd8e9e-a585-4043-87f2-e747b35723b4" containerName="sg-core" containerID="cri-o://966814740aebb7b366b3fe5cf9aee1ff465356457aec64c6047dc3d152daca59" gracePeriod=30 Oct 08 21:02:17 crc kubenswrapper[4669]: I1008 21:02:17.555707 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="aabd8e9e-a585-4043-87f2-e747b35723b4" containerName="proxy-httpd" containerID="cri-o://d54d1bda4929d0a2dca119d4bdbd776c17134257d63ed9cfe4e4716e52d654d8" gracePeriod=30 Oct 08 21:02:17 crc kubenswrapper[4669]: I1008 21:02:17.592123 4669 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/nova-cell0-8c82-account-create-l8gj7" event={"ID":"d2e0af6a-a86e-4f7f-b00a-74f84b2eabae","Type":"ContainerDied","Data":"fe1272331f0b59679e4691e2a12e67db7cfc73715f59b317231e168254eaf03a"} Oct 08 21:02:17 crc kubenswrapper[4669]: I1008 21:02:17.592164 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe1272331f0b59679e4691e2a12e67db7cfc73715f59b317231e168254eaf03a" Oct 08 21:02:17 crc kubenswrapper[4669]: I1008 21:02:17.592141 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8c82-account-create-l8gj7" Oct 08 21:02:17 crc kubenswrapper[4669]: I1008 21:02:17.593821 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-9048-account-create-lpfml" event={"ID":"3975f8d5-05d5-43bf-a697-7592cae00f76","Type":"ContainerDied","Data":"2f6deccb6cd8ed0a5d8e685a0f89c90deac7f25898ee8fa279129f2dbd641c0a"} Oct 08 21:02:17 crc kubenswrapper[4669]: I1008 21:02:17.593863 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f6deccb6cd8ed0a5d8e685a0f89c90deac7f25898ee8fa279129f2dbd641c0a" Oct 08 21:02:17 crc kubenswrapper[4669]: I1008 21:02:17.594036 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-9048-account-create-lpfml" Oct 08 21:02:17 crc kubenswrapper[4669]: I1008 21:02:17.861676 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-eed3-account-create-cxmm8" Oct 08 21:02:17 crc kubenswrapper[4669]: I1008 21:02:17.937181 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnz9n\" (UniqueName: \"kubernetes.io/projected/600b70ca-92ae-481c-96a7-e1ad051b1a1a-kube-api-access-fnz9n\") pod \"600b70ca-92ae-481c-96a7-e1ad051b1a1a\" (UID: \"600b70ca-92ae-481c-96a7-e1ad051b1a1a\") " Oct 08 21:02:17 crc kubenswrapper[4669]: I1008 21:02:17.941659 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/600b70ca-92ae-481c-96a7-e1ad051b1a1a-kube-api-access-fnz9n" (OuterVolumeSpecName: "kube-api-access-fnz9n") pod "600b70ca-92ae-481c-96a7-e1ad051b1a1a" (UID: "600b70ca-92ae-481c-96a7-e1ad051b1a1a"). InnerVolumeSpecName "kube-api-access-fnz9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:02:18 crc kubenswrapper[4669]: I1008 21:02:18.040674 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnz9n\" (UniqueName: \"kubernetes.io/projected/600b70ca-92ae-481c-96a7-e1ad051b1a1a-kube-api-access-fnz9n\") on node \"crc\" DevicePath \"\"" Oct 08 21:02:18 crc kubenswrapper[4669]: I1008 21:02:18.604673 4669 generic.go:334] "Generic (PLEG): container finished" podID="aabd8e9e-a585-4043-87f2-e747b35723b4" containerID="d54d1bda4929d0a2dca119d4bdbd776c17134257d63ed9cfe4e4716e52d654d8" exitCode=0 Oct 08 21:02:18 crc kubenswrapper[4669]: I1008 21:02:18.604711 4669 generic.go:334] "Generic (PLEG): container finished" podID="aabd8e9e-a585-4043-87f2-e747b35723b4" containerID="966814740aebb7b366b3fe5cf9aee1ff465356457aec64c6047dc3d152daca59" exitCode=2 Oct 08 21:02:18 crc kubenswrapper[4669]: I1008 21:02:18.604724 4669 generic.go:334] "Generic (PLEG): container finished" podID="aabd8e9e-a585-4043-87f2-e747b35723b4" containerID="6e50bbbac38826a7aee43f2bb616f0d2a8c1cb84be24dc63817dbadf45863206" exitCode=0 Oct 08 21:02:18 crc kubenswrapper[4669]: 
I1008 21:02:18.604723 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aabd8e9e-a585-4043-87f2-e747b35723b4","Type":"ContainerDied","Data":"d54d1bda4929d0a2dca119d4bdbd776c17134257d63ed9cfe4e4716e52d654d8"} Oct 08 21:02:18 crc kubenswrapper[4669]: I1008 21:02:18.604774 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aabd8e9e-a585-4043-87f2-e747b35723b4","Type":"ContainerDied","Data":"966814740aebb7b366b3fe5cf9aee1ff465356457aec64c6047dc3d152daca59"} Oct 08 21:02:18 crc kubenswrapper[4669]: I1008 21:02:18.604797 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aabd8e9e-a585-4043-87f2-e747b35723b4","Type":"ContainerDied","Data":"6e50bbbac38826a7aee43f2bb616f0d2a8c1cb84be24dc63817dbadf45863206"} Oct 08 21:02:18 crc kubenswrapper[4669]: I1008 21:02:18.606800 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-eed3-account-create-cxmm8" event={"ID":"600b70ca-92ae-481c-96a7-e1ad051b1a1a","Type":"ContainerDied","Data":"41f782a39cc5a3ebe981e79cf60c045e76906d2283200d5352542f530a786719"} Oct 08 21:02:18 crc kubenswrapper[4669]: I1008 21:02:18.606832 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41f782a39cc5a3ebe981e79cf60c045e76906d2283200d5352542f530a786719" Oct 08 21:02:18 crc kubenswrapper[4669]: I1008 21:02:18.606863 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-eed3-account-create-cxmm8" Oct 08 21:02:18 crc kubenswrapper[4669]: I1008 21:02:18.814040 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 08 21:02:18 crc kubenswrapper[4669]: I1008 21:02:18.814192 4669 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 08 21:02:18 crc kubenswrapper[4669]: I1008 21:02:18.816352 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 08 21:02:19 crc kubenswrapper[4669]: E1008 21:02:19.248379 4669 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14b56cd8_5692_4a65_b8ad_1e39bf253846.slice/crio-c7d1cc9386ab3b92a4931c13ef8cae530e99ac0462fd182d311d2bf28a18131a\": RecentStats: unable to find data in memory cache]" Oct 08 21:02:19 crc kubenswrapper[4669]: I1008 21:02:19.620257 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-28wfx"] Oct 08 21:02:19 crc kubenswrapper[4669]: E1008 21:02:19.621314 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="600b70ca-92ae-481c-96a7-e1ad051b1a1a" containerName="mariadb-account-create" Oct 08 21:02:19 crc kubenswrapper[4669]: I1008 21:02:19.621418 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="600b70ca-92ae-481c-96a7-e1ad051b1a1a" containerName="mariadb-account-create" Oct 08 21:02:19 crc kubenswrapper[4669]: E1008 21:02:19.621658 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2e0af6a-a86e-4f7f-b00a-74f84b2eabae" containerName="mariadb-account-create" Oct 08 21:02:19 crc kubenswrapper[4669]: I1008 21:02:19.621773 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2e0af6a-a86e-4f7f-b00a-74f84b2eabae" containerName="mariadb-account-create" Oct 08 21:02:19 
crc kubenswrapper[4669]: E1008 21:02:19.621857 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3975f8d5-05d5-43bf-a697-7592cae00f76" containerName="mariadb-account-create" Oct 08 21:02:19 crc kubenswrapper[4669]: I1008 21:02:19.621933 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="3975f8d5-05d5-43bf-a697-7592cae00f76" containerName="mariadb-account-create" Oct 08 21:02:19 crc kubenswrapper[4669]: I1008 21:02:19.622239 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="3975f8d5-05d5-43bf-a697-7592cae00f76" containerName="mariadb-account-create" Oct 08 21:02:19 crc kubenswrapper[4669]: I1008 21:02:19.622336 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="600b70ca-92ae-481c-96a7-e1ad051b1a1a" containerName="mariadb-account-create" Oct 08 21:02:19 crc kubenswrapper[4669]: I1008 21:02:19.622431 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2e0af6a-a86e-4f7f-b00a-74f84b2eabae" containerName="mariadb-account-create" Oct 08 21:02:19 crc kubenswrapper[4669]: I1008 21:02:19.623189 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-28wfx" Oct 08 21:02:19 crc kubenswrapper[4669]: I1008 21:02:19.626514 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-swhg6" Oct 08 21:02:19 crc kubenswrapper[4669]: I1008 21:02:19.626565 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 08 21:02:19 crc kubenswrapper[4669]: I1008 21:02:19.626761 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 08 21:02:19 crc kubenswrapper[4669]: I1008 21:02:19.642166 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-28wfx"] Oct 08 21:02:19 crc kubenswrapper[4669]: I1008 21:02:19.671478 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kz7mn\" (UniqueName: \"kubernetes.io/projected/1fe4f9a2-f7a7-433e-a6e9-4811d2424259-kube-api-access-kz7mn\") pod \"nova-cell0-conductor-db-sync-28wfx\" (UID: \"1fe4f9a2-f7a7-433e-a6e9-4811d2424259\") " pod="openstack/nova-cell0-conductor-db-sync-28wfx" Oct 08 21:02:19 crc kubenswrapper[4669]: I1008 21:02:19.671544 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fe4f9a2-f7a7-433e-a6e9-4811d2424259-scripts\") pod \"nova-cell0-conductor-db-sync-28wfx\" (UID: \"1fe4f9a2-f7a7-433e-a6e9-4811d2424259\") " pod="openstack/nova-cell0-conductor-db-sync-28wfx" Oct 08 21:02:19 crc kubenswrapper[4669]: I1008 21:02:19.671565 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fe4f9a2-f7a7-433e-a6e9-4811d2424259-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-28wfx\" (UID: \"1fe4f9a2-f7a7-433e-a6e9-4811d2424259\") " 
pod="openstack/nova-cell0-conductor-db-sync-28wfx" Oct 08 21:02:19 crc kubenswrapper[4669]: I1008 21:02:19.671662 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fe4f9a2-f7a7-433e-a6e9-4811d2424259-config-data\") pod \"nova-cell0-conductor-db-sync-28wfx\" (UID: \"1fe4f9a2-f7a7-433e-a6e9-4811d2424259\") " pod="openstack/nova-cell0-conductor-db-sync-28wfx" Oct 08 21:02:19 crc kubenswrapper[4669]: I1008 21:02:19.772940 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fe4f9a2-f7a7-433e-a6e9-4811d2424259-config-data\") pod \"nova-cell0-conductor-db-sync-28wfx\" (UID: \"1fe4f9a2-f7a7-433e-a6e9-4811d2424259\") " pod="openstack/nova-cell0-conductor-db-sync-28wfx" Oct 08 21:02:19 crc kubenswrapper[4669]: I1008 21:02:19.773075 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kz7mn\" (UniqueName: \"kubernetes.io/projected/1fe4f9a2-f7a7-433e-a6e9-4811d2424259-kube-api-access-kz7mn\") pod \"nova-cell0-conductor-db-sync-28wfx\" (UID: \"1fe4f9a2-f7a7-433e-a6e9-4811d2424259\") " pod="openstack/nova-cell0-conductor-db-sync-28wfx" Oct 08 21:02:19 crc kubenswrapper[4669]: I1008 21:02:19.773114 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fe4f9a2-f7a7-433e-a6e9-4811d2424259-scripts\") pod \"nova-cell0-conductor-db-sync-28wfx\" (UID: \"1fe4f9a2-f7a7-433e-a6e9-4811d2424259\") " pod="openstack/nova-cell0-conductor-db-sync-28wfx" Oct 08 21:02:19 crc kubenswrapper[4669]: I1008 21:02:19.773137 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fe4f9a2-f7a7-433e-a6e9-4811d2424259-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-28wfx\" (UID: 
\"1fe4f9a2-f7a7-433e-a6e9-4811d2424259\") " pod="openstack/nova-cell0-conductor-db-sync-28wfx" Oct 08 21:02:19 crc kubenswrapper[4669]: I1008 21:02:19.781122 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fe4f9a2-f7a7-433e-a6e9-4811d2424259-scripts\") pod \"nova-cell0-conductor-db-sync-28wfx\" (UID: \"1fe4f9a2-f7a7-433e-a6e9-4811d2424259\") " pod="openstack/nova-cell0-conductor-db-sync-28wfx" Oct 08 21:02:19 crc kubenswrapper[4669]: I1008 21:02:19.781316 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fe4f9a2-f7a7-433e-a6e9-4811d2424259-config-data\") pod \"nova-cell0-conductor-db-sync-28wfx\" (UID: \"1fe4f9a2-f7a7-433e-a6e9-4811d2424259\") " pod="openstack/nova-cell0-conductor-db-sync-28wfx" Oct 08 21:02:19 crc kubenswrapper[4669]: I1008 21:02:19.781669 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fe4f9a2-f7a7-433e-a6e9-4811d2424259-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-28wfx\" (UID: \"1fe4f9a2-f7a7-433e-a6e9-4811d2424259\") " pod="openstack/nova-cell0-conductor-db-sync-28wfx" Oct 08 21:02:19 crc kubenswrapper[4669]: I1008 21:02:19.791215 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kz7mn\" (UniqueName: \"kubernetes.io/projected/1fe4f9a2-f7a7-433e-a6e9-4811d2424259-kube-api-access-kz7mn\") pod \"nova-cell0-conductor-db-sync-28wfx\" (UID: \"1fe4f9a2-f7a7-433e-a6e9-4811d2424259\") " pod="openstack/nova-cell0-conductor-db-sync-28wfx" Oct 08 21:02:19 crc kubenswrapper[4669]: I1008 21:02:19.939547 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-28wfx" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.348368 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.384367 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgsf6\" (UniqueName: \"kubernetes.io/projected/aabd8e9e-a585-4043-87f2-e747b35723b4-kube-api-access-kgsf6\") pod \"aabd8e9e-a585-4043-87f2-e747b35723b4\" (UID: \"aabd8e9e-a585-4043-87f2-e747b35723b4\") " Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.384427 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aabd8e9e-a585-4043-87f2-e747b35723b4-run-httpd\") pod \"aabd8e9e-a585-4043-87f2-e747b35723b4\" (UID: \"aabd8e9e-a585-4043-87f2-e747b35723b4\") " Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.384474 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aabd8e9e-a585-4043-87f2-e747b35723b4-scripts\") pod \"aabd8e9e-a585-4043-87f2-e747b35723b4\" (UID: \"aabd8e9e-a585-4043-87f2-e747b35723b4\") " Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.384560 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aabd8e9e-a585-4043-87f2-e747b35723b4-sg-core-conf-yaml\") pod \"aabd8e9e-a585-4043-87f2-e747b35723b4\" (UID: \"aabd8e9e-a585-4043-87f2-e747b35723b4\") " Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.384606 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aabd8e9e-a585-4043-87f2-e747b35723b4-combined-ca-bundle\") pod \"aabd8e9e-a585-4043-87f2-e747b35723b4\" (UID: \"aabd8e9e-a585-4043-87f2-e747b35723b4\") " Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.384641 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/aabd8e9e-a585-4043-87f2-e747b35723b4-log-httpd\") pod \"aabd8e9e-a585-4043-87f2-e747b35723b4\" (UID: \"aabd8e9e-a585-4043-87f2-e747b35723b4\") " Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.384671 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aabd8e9e-a585-4043-87f2-e747b35723b4-config-data\") pod \"aabd8e9e-a585-4043-87f2-e747b35723b4\" (UID: \"aabd8e9e-a585-4043-87f2-e747b35723b4\") " Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.385626 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aabd8e9e-a585-4043-87f2-e747b35723b4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "aabd8e9e-a585-4043-87f2-e747b35723b4" (UID: "aabd8e9e-a585-4043-87f2-e747b35723b4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.385777 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aabd8e9e-a585-4043-87f2-e747b35723b4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "aabd8e9e-a585-4043-87f2-e747b35723b4" (UID: "aabd8e9e-a585-4043-87f2-e747b35723b4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.389261 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aabd8e9e-a585-4043-87f2-e747b35723b4-kube-api-access-kgsf6" (OuterVolumeSpecName: "kube-api-access-kgsf6") pod "aabd8e9e-a585-4043-87f2-e747b35723b4" (UID: "aabd8e9e-a585-4043-87f2-e747b35723b4"). InnerVolumeSpecName "kube-api-access-kgsf6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.389759 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aabd8e9e-a585-4043-87f2-e747b35723b4-scripts" (OuterVolumeSpecName: "scripts") pod "aabd8e9e-a585-4043-87f2-e747b35723b4" (UID: "aabd8e9e-a585-4043-87f2-e747b35723b4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.409909 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aabd8e9e-a585-4043-87f2-e747b35723b4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "aabd8e9e-a585-4043-87f2-e747b35723b4" (UID: "aabd8e9e-a585-4043-87f2-e747b35723b4"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.459852 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aabd8e9e-a585-4043-87f2-e747b35723b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aabd8e9e-a585-4043-87f2-e747b35723b4" (UID: "aabd8e9e-a585-4043-87f2-e747b35723b4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.478263 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aabd8e9e-a585-4043-87f2-e747b35723b4-config-data" (OuterVolumeSpecName: "config-data") pod "aabd8e9e-a585-4043-87f2-e747b35723b4" (UID: "aabd8e9e-a585-4043-87f2-e747b35723b4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.486329 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgsf6\" (UniqueName: \"kubernetes.io/projected/aabd8e9e-a585-4043-87f2-e747b35723b4-kube-api-access-kgsf6\") on node \"crc\" DevicePath \"\"" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.486356 4669 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aabd8e9e-a585-4043-87f2-e747b35723b4-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.486365 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aabd8e9e-a585-4043-87f2-e747b35723b4-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.486374 4669 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aabd8e9e-a585-4043-87f2-e747b35723b4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.486381 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aabd8e9e-a585-4043-87f2-e747b35723b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.486389 4669 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aabd8e9e-a585-4043-87f2-e747b35723b4-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.486397 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aabd8e9e-a585-4043-87f2-e747b35723b4-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.526660 4669 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-28wfx"] Oct 08 21:02:20 crc kubenswrapper[4669]: W1008 21:02:20.530205 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fe4f9a2_f7a7_433e_a6e9_4811d2424259.slice/crio-ba0505a1b3602ded95af0febcfe88cefa696eeb7824cd230ba4da49d77e960f5 WatchSource:0}: Error finding container ba0505a1b3602ded95af0febcfe88cefa696eeb7824cd230ba4da49d77e960f5: Status 404 returned error can't find the container with id ba0505a1b3602ded95af0febcfe88cefa696eeb7824cd230ba4da49d77e960f5 Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.634923 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-28wfx" event={"ID":"1fe4f9a2-f7a7-433e-a6e9-4811d2424259","Type":"ContainerStarted","Data":"ba0505a1b3602ded95af0febcfe88cefa696eeb7824cd230ba4da49d77e960f5"} Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.638211 4669 generic.go:334] "Generic (PLEG): container finished" podID="aabd8e9e-a585-4043-87f2-e747b35723b4" containerID="a73f2fffe99134b40960b10ee2b822283db8da803022e690feeeefc274dc11d8" exitCode=0 Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.638247 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aabd8e9e-a585-4043-87f2-e747b35723b4","Type":"ContainerDied","Data":"a73f2fffe99134b40960b10ee2b822283db8da803022e690feeeefc274dc11d8"} Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.638275 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aabd8e9e-a585-4043-87f2-e747b35723b4","Type":"ContainerDied","Data":"003f6da8fd1756fbbedc0f9b5559671e4fbba203fdc46df25cae3b4a05bed05d"} Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.638295 4669 scope.go:117] "RemoveContainer" containerID="d54d1bda4929d0a2dca119d4bdbd776c17134257d63ed9cfe4e4716e52d654d8" Oct 08 21:02:20 crc 
kubenswrapper[4669]: I1008 21:02:20.638514 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.674230 4669 scope.go:117] "RemoveContainer" containerID="966814740aebb7b366b3fe5cf9aee1ff465356457aec64c6047dc3d152daca59" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.674410 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.687948 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.705597 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.705939 4669 scope.go:117] "RemoveContainer" containerID="a73f2fffe99134b40960b10ee2b822283db8da803022e690feeeefc274dc11d8" Oct 08 21:02:20 crc kubenswrapper[4669]: E1008 21:02:20.705968 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aabd8e9e-a585-4043-87f2-e747b35723b4" containerName="proxy-httpd" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.706164 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="aabd8e9e-a585-4043-87f2-e747b35723b4" containerName="proxy-httpd" Oct 08 21:02:20 crc kubenswrapper[4669]: E1008 21:02:20.706227 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aabd8e9e-a585-4043-87f2-e747b35723b4" containerName="ceilometer-central-agent" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.706278 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="aabd8e9e-a585-4043-87f2-e747b35723b4" containerName="ceilometer-central-agent" Oct 08 21:02:20 crc kubenswrapper[4669]: E1008 21:02:20.706341 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aabd8e9e-a585-4043-87f2-e747b35723b4" containerName="sg-core" Oct 08 21:02:20 crc kubenswrapper[4669]: 
I1008 21:02:20.706428 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="aabd8e9e-a585-4043-87f2-e747b35723b4" containerName="sg-core" Oct 08 21:02:20 crc kubenswrapper[4669]: E1008 21:02:20.706514 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aabd8e9e-a585-4043-87f2-e747b35723b4" containerName="ceilometer-notification-agent" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.706608 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="aabd8e9e-a585-4043-87f2-e747b35723b4" containerName="ceilometer-notification-agent" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.706873 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="aabd8e9e-a585-4043-87f2-e747b35723b4" containerName="proxy-httpd" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.706962 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="aabd8e9e-a585-4043-87f2-e747b35723b4" containerName="ceilometer-central-agent" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.707044 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="aabd8e9e-a585-4043-87f2-e747b35723b4" containerName="sg-core" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.707105 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="aabd8e9e-a585-4043-87f2-e747b35723b4" containerName="ceilometer-notification-agent" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.709047 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.713085 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.713798 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.722078 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.732247 4669 scope.go:117] "RemoveContainer" containerID="6e50bbbac38826a7aee43f2bb616f0d2a8c1cb84be24dc63817dbadf45863206" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.751055 4669 scope.go:117] "RemoveContainer" containerID="d54d1bda4929d0a2dca119d4bdbd776c17134257d63ed9cfe4e4716e52d654d8" Oct 08 21:02:20 crc kubenswrapper[4669]: E1008 21:02:20.751416 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d54d1bda4929d0a2dca119d4bdbd776c17134257d63ed9cfe4e4716e52d654d8\": container with ID starting with d54d1bda4929d0a2dca119d4bdbd776c17134257d63ed9cfe4e4716e52d654d8 not found: ID does not exist" containerID="d54d1bda4929d0a2dca119d4bdbd776c17134257d63ed9cfe4e4716e52d654d8" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.751448 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d54d1bda4929d0a2dca119d4bdbd776c17134257d63ed9cfe4e4716e52d654d8"} err="failed to get container status \"d54d1bda4929d0a2dca119d4bdbd776c17134257d63ed9cfe4e4716e52d654d8\": rpc error: code = NotFound desc = could not find container \"d54d1bda4929d0a2dca119d4bdbd776c17134257d63ed9cfe4e4716e52d654d8\": container with ID starting with d54d1bda4929d0a2dca119d4bdbd776c17134257d63ed9cfe4e4716e52d654d8 not found: ID does not exist" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 
21:02:20.751472 4669 scope.go:117] "RemoveContainer" containerID="966814740aebb7b366b3fe5cf9aee1ff465356457aec64c6047dc3d152daca59" Oct 08 21:02:20 crc kubenswrapper[4669]: E1008 21:02:20.751685 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"966814740aebb7b366b3fe5cf9aee1ff465356457aec64c6047dc3d152daca59\": container with ID starting with 966814740aebb7b366b3fe5cf9aee1ff465356457aec64c6047dc3d152daca59 not found: ID does not exist" containerID="966814740aebb7b366b3fe5cf9aee1ff465356457aec64c6047dc3d152daca59" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.751766 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"966814740aebb7b366b3fe5cf9aee1ff465356457aec64c6047dc3d152daca59"} err="failed to get container status \"966814740aebb7b366b3fe5cf9aee1ff465356457aec64c6047dc3d152daca59\": rpc error: code = NotFound desc = could not find container \"966814740aebb7b366b3fe5cf9aee1ff465356457aec64c6047dc3d152daca59\": container with ID starting with 966814740aebb7b366b3fe5cf9aee1ff465356457aec64c6047dc3d152daca59 not found: ID does not exist" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.751803 4669 scope.go:117] "RemoveContainer" containerID="a73f2fffe99134b40960b10ee2b822283db8da803022e690feeeefc274dc11d8" Oct 08 21:02:20 crc kubenswrapper[4669]: E1008 21:02:20.752313 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a73f2fffe99134b40960b10ee2b822283db8da803022e690feeeefc274dc11d8\": container with ID starting with a73f2fffe99134b40960b10ee2b822283db8da803022e690feeeefc274dc11d8 not found: ID does not exist" containerID="a73f2fffe99134b40960b10ee2b822283db8da803022e690feeeefc274dc11d8" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.752334 4669 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a73f2fffe99134b40960b10ee2b822283db8da803022e690feeeefc274dc11d8"} err="failed to get container status \"a73f2fffe99134b40960b10ee2b822283db8da803022e690feeeefc274dc11d8\": rpc error: code = NotFound desc = could not find container \"a73f2fffe99134b40960b10ee2b822283db8da803022e690feeeefc274dc11d8\": container with ID starting with a73f2fffe99134b40960b10ee2b822283db8da803022e690feeeefc274dc11d8 not found: ID does not exist" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.752348 4669 scope.go:117] "RemoveContainer" containerID="6e50bbbac38826a7aee43f2bb616f0d2a8c1cb84be24dc63817dbadf45863206" Oct 08 21:02:20 crc kubenswrapper[4669]: E1008 21:02:20.752561 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e50bbbac38826a7aee43f2bb616f0d2a8c1cb84be24dc63817dbadf45863206\": container with ID starting with 6e50bbbac38826a7aee43f2bb616f0d2a8c1cb84be24dc63817dbadf45863206 not found: ID does not exist" containerID="6e50bbbac38826a7aee43f2bb616f0d2a8c1cb84be24dc63817dbadf45863206" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.752580 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e50bbbac38826a7aee43f2bb616f0d2a8c1cb84be24dc63817dbadf45863206"} err="failed to get container status \"6e50bbbac38826a7aee43f2bb616f0d2a8c1cb84be24dc63817dbadf45863206\": rpc error: code = NotFound desc = could not find container \"6e50bbbac38826a7aee43f2bb616f0d2a8c1cb84be24dc63817dbadf45863206\": container with ID starting with 6e50bbbac38826a7aee43f2bb616f0d2a8c1cb84be24dc63817dbadf45863206 not found: ID does not exist" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.791210 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/493de98d-6819-413a-8f90-130f2f482cf5-log-httpd\") pod \"ceilometer-0\" (UID: 
\"493de98d-6819-413a-8f90-130f2f482cf5\") " pod="openstack/ceilometer-0" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.791289 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/493de98d-6819-413a-8f90-130f2f482cf5-run-httpd\") pod \"ceilometer-0\" (UID: \"493de98d-6819-413a-8f90-130f2f482cf5\") " pod="openstack/ceilometer-0" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.791316 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/493de98d-6819-413a-8f90-130f2f482cf5-scripts\") pod \"ceilometer-0\" (UID: \"493de98d-6819-413a-8f90-130f2f482cf5\") " pod="openstack/ceilometer-0" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.791374 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/493de98d-6819-413a-8f90-130f2f482cf5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"493de98d-6819-413a-8f90-130f2f482cf5\") " pod="openstack/ceilometer-0" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.791403 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/493de98d-6819-413a-8f90-130f2f482cf5-config-data\") pod \"ceilometer-0\" (UID: \"493de98d-6819-413a-8f90-130f2f482cf5\") " pod="openstack/ceilometer-0" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.791446 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/493de98d-6819-413a-8f90-130f2f482cf5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"493de98d-6819-413a-8f90-130f2f482cf5\") " pod="openstack/ceilometer-0" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.791480 4669 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkkgc\" (UniqueName: \"kubernetes.io/projected/493de98d-6819-413a-8f90-130f2f482cf5-kube-api-access-tkkgc\") pod \"ceilometer-0\" (UID: \"493de98d-6819-413a-8f90-130f2f482cf5\") " pod="openstack/ceilometer-0" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.892803 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/493de98d-6819-413a-8f90-130f2f482cf5-log-httpd\") pod \"ceilometer-0\" (UID: \"493de98d-6819-413a-8f90-130f2f482cf5\") " pod="openstack/ceilometer-0" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.892858 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/493de98d-6819-413a-8f90-130f2f482cf5-run-httpd\") pod \"ceilometer-0\" (UID: \"493de98d-6819-413a-8f90-130f2f482cf5\") " pod="openstack/ceilometer-0" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.892874 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/493de98d-6819-413a-8f90-130f2f482cf5-scripts\") pod \"ceilometer-0\" (UID: \"493de98d-6819-413a-8f90-130f2f482cf5\") " pod="openstack/ceilometer-0" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.892915 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/493de98d-6819-413a-8f90-130f2f482cf5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"493de98d-6819-413a-8f90-130f2f482cf5\") " pod="openstack/ceilometer-0" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.892937 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/493de98d-6819-413a-8f90-130f2f482cf5-config-data\") pod \"ceilometer-0\" (UID: 
\"493de98d-6819-413a-8f90-130f2f482cf5\") " pod="openstack/ceilometer-0" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.892966 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/493de98d-6819-413a-8f90-130f2f482cf5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"493de98d-6819-413a-8f90-130f2f482cf5\") " pod="openstack/ceilometer-0" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.892991 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkkgc\" (UniqueName: \"kubernetes.io/projected/493de98d-6819-413a-8f90-130f2f482cf5-kube-api-access-tkkgc\") pod \"ceilometer-0\" (UID: \"493de98d-6819-413a-8f90-130f2f482cf5\") " pod="openstack/ceilometer-0" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.893632 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/493de98d-6819-413a-8f90-130f2f482cf5-run-httpd\") pod \"ceilometer-0\" (UID: \"493de98d-6819-413a-8f90-130f2f482cf5\") " pod="openstack/ceilometer-0" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.893839 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/493de98d-6819-413a-8f90-130f2f482cf5-log-httpd\") pod \"ceilometer-0\" (UID: \"493de98d-6819-413a-8f90-130f2f482cf5\") " pod="openstack/ceilometer-0" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.898555 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/493de98d-6819-413a-8f90-130f2f482cf5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"493de98d-6819-413a-8f90-130f2f482cf5\") " pod="openstack/ceilometer-0" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.898654 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/493de98d-6819-413a-8f90-130f2f482cf5-scripts\") pod \"ceilometer-0\" (UID: \"493de98d-6819-413a-8f90-130f2f482cf5\") " pod="openstack/ceilometer-0" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.898795 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/493de98d-6819-413a-8f90-130f2f482cf5-config-data\") pod \"ceilometer-0\" (UID: \"493de98d-6819-413a-8f90-130f2f482cf5\") " pod="openstack/ceilometer-0" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.899018 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/493de98d-6819-413a-8f90-130f2f482cf5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"493de98d-6819-413a-8f90-130f2f482cf5\") " pod="openstack/ceilometer-0" Oct 08 21:02:20 crc kubenswrapper[4669]: I1008 21:02:20.915913 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkkgc\" (UniqueName: \"kubernetes.io/projected/493de98d-6819-413a-8f90-130f2f482cf5-kube-api-access-tkkgc\") pod \"ceilometer-0\" (UID: \"493de98d-6819-413a-8f90-130f2f482cf5\") " pod="openstack/ceilometer-0" Oct 08 21:02:21 crc kubenswrapper[4669]: I1008 21:02:21.028485 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 21:02:21 crc kubenswrapper[4669]: I1008 21:02:21.344461 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aabd8e9e-a585-4043-87f2-e747b35723b4" path="/var/lib/kubelet/pods/aabd8e9e-a585-4043-87f2-e747b35723b4/volumes" Oct 08 21:02:21 crc kubenswrapper[4669]: I1008 21:02:21.515185 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 21:02:21 crc kubenswrapper[4669]: W1008 21:02:21.518668 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod493de98d_6819_413a_8f90_130f2f482cf5.slice/crio-ea9adf949b02b86a4c24484db4293ac9178f158a9808c2746d57929d11b4b0a8 WatchSource:0}: Error finding container ea9adf949b02b86a4c24484db4293ac9178f158a9808c2746d57929d11b4b0a8: Status 404 returned error can't find the container with id ea9adf949b02b86a4c24484db4293ac9178f158a9808c2746d57929d11b4b0a8 Oct 08 21:02:21 crc kubenswrapper[4669]: I1008 21:02:21.561100 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 21:02:21 crc kubenswrapper[4669]: I1008 21:02:21.653569 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"493de98d-6819-413a-8f90-130f2f482cf5","Type":"ContainerStarted","Data":"ea9adf949b02b86a4c24484db4293ac9178f158a9808c2746d57929d11b4b0a8"} Oct 08 21:02:22 crc kubenswrapper[4669]: I1008 21:02:22.664041 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"493de98d-6819-413a-8f90-130f2f482cf5","Type":"ContainerStarted","Data":"7e95f94a5849d422de38d75ca69f81fa0e42ba57ce3b8e3939fbd1c6cac38176"} Oct 08 21:02:24 crc kubenswrapper[4669]: I1008 21:02:24.682009 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"493de98d-6819-413a-8f90-130f2f482cf5","Type":"ContainerStarted","Data":"e2f727ddbf338a127e5e006ab13bb62dbd7688201bbb2390868c2451cb3e511f"} Oct 08 21:02:29 crc kubenswrapper[4669]: E1008 21:02:29.488058 4669 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14b56cd8_5692_4a65_b8ad_1e39bf253846.slice/crio-c7d1cc9386ab3b92a4931c13ef8cae530e99ac0462fd182d311d2bf28a18131a\": RecentStats: unable to find data in memory cache]" Oct 08 21:02:29 crc kubenswrapper[4669]: I1008 21:02:29.743927 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"493de98d-6819-413a-8f90-130f2f482cf5","Type":"ContainerStarted","Data":"bac96369e13e726f7096199645baab99639142b8467621c3ff452fe8f64ded93"} Oct 08 21:02:29 crc kubenswrapper[4669]: I1008 21:02:29.746933 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-28wfx" event={"ID":"1fe4f9a2-f7a7-433e-a6e9-4811d2424259","Type":"ContainerStarted","Data":"0b3ddbeb7c3054f12a71af868748cddd1dc97afa361e7047de8b21e5eab29011"} Oct 08 21:02:29 crc kubenswrapper[4669]: I1008 21:02:29.768481 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-28wfx" podStartSLOduration=2.352212229 podStartE2EDuration="10.768464417s" podCreationTimestamp="2025-10-08 21:02:19 +0000 UTC" firstStartedPulling="2025-10-08 21:02:20.533121833 +0000 UTC m=+1060.225932506" lastFinishedPulling="2025-10-08 21:02:28.949374011 +0000 UTC m=+1068.642184694" observedRunningTime="2025-10-08 21:02:29.761703809 +0000 UTC m=+1069.454514482" watchObservedRunningTime="2025-10-08 21:02:29.768464417 +0000 UTC m=+1069.461275090" Oct 08 21:02:30 crc kubenswrapper[4669]: I1008 21:02:30.762254 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="493de98d-6819-413a-8f90-130f2f482cf5" containerName="proxy-httpd" containerID="cri-o://d97984db882c1dcedeb6276c4a7ee58301e7e15e74a659d56b99560a70196438" gracePeriod=30 Oct 08 21:02:30 crc kubenswrapper[4669]: I1008 21:02:30.762317 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="493de98d-6819-413a-8f90-130f2f482cf5" containerName="sg-core" containerID="cri-o://bac96369e13e726f7096199645baab99639142b8467621c3ff452fe8f64ded93" gracePeriod=30 Oct 08 21:02:30 crc kubenswrapper[4669]: I1008 21:02:30.762441 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="493de98d-6819-413a-8f90-130f2f482cf5" containerName="ceilometer-notification-agent" containerID="cri-o://e2f727ddbf338a127e5e006ab13bb62dbd7688201bbb2390868c2451cb3e511f" gracePeriod=30 Oct 08 21:02:30 crc kubenswrapper[4669]: I1008 21:02:30.762894 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="493de98d-6819-413a-8f90-130f2f482cf5" containerName="ceilometer-central-agent" containerID="cri-o://7e95f94a5849d422de38d75ca69f81fa0e42ba57ce3b8e3939fbd1c6cac38176" gracePeriod=30 Oct 08 21:02:30 crc kubenswrapper[4669]: I1008 21:02:30.762170 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"493de98d-6819-413a-8f90-130f2f482cf5","Type":"ContainerStarted","Data":"d97984db882c1dcedeb6276c4a7ee58301e7e15e74a659d56b99560a70196438"} Oct 08 21:02:30 crc kubenswrapper[4669]: I1008 21:02:30.764604 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 08 21:02:30 crc kubenswrapper[4669]: I1008 21:02:30.795368 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.873399065 podStartE2EDuration="10.79535152s" podCreationTimestamp="2025-10-08 21:02:20 +0000 UTC" 
firstStartedPulling="2025-10-08 21:02:21.52087259 +0000 UTC m=+1061.213683273" lastFinishedPulling="2025-10-08 21:02:30.442825015 +0000 UTC m=+1070.135635728" observedRunningTime="2025-10-08 21:02:30.792424439 +0000 UTC m=+1070.485235102" watchObservedRunningTime="2025-10-08 21:02:30.79535152 +0000 UTC m=+1070.488162193" Oct 08 21:02:31 crc kubenswrapper[4669]: I1008 21:02:31.778065 4669 generic.go:334] "Generic (PLEG): container finished" podID="493de98d-6819-413a-8f90-130f2f482cf5" containerID="d97984db882c1dcedeb6276c4a7ee58301e7e15e74a659d56b99560a70196438" exitCode=0 Oct 08 21:02:31 crc kubenswrapper[4669]: I1008 21:02:31.778105 4669 generic.go:334] "Generic (PLEG): container finished" podID="493de98d-6819-413a-8f90-130f2f482cf5" containerID="bac96369e13e726f7096199645baab99639142b8467621c3ff452fe8f64ded93" exitCode=2 Oct 08 21:02:31 crc kubenswrapper[4669]: I1008 21:02:31.778122 4669 generic.go:334] "Generic (PLEG): container finished" podID="493de98d-6819-413a-8f90-130f2f482cf5" containerID="e2f727ddbf338a127e5e006ab13bb62dbd7688201bbb2390868c2451cb3e511f" exitCode=0 Oct 08 21:02:31 crc kubenswrapper[4669]: I1008 21:02:31.778135 4669 generic.go:334] "Generic (PLEG): container finished" podID="493de98d-6819-413a-8f90-130f2f482cf5" containerID="7e95f94a5849d422de38d75ca69f81fa0e42ba57ce3b8e3939fbd1c6cac38176" exitCode=0 Oct 08 21:02:31 crc kubenswrapper[4669]: I1008 21:02:31.778145 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"493de98d-6819-413a-8f90-130f2f482cf5","Type":"ContainerDied","Data":"d97984db882c1dcedeb6276c4a7ee58301e7e15e74a659d56b99560a70196438"} Oct 08 21:02:31 crc kubenswrapper[4669]: I1008 21:02:31.778208 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"493de98d-6819-413a-8f90-130f2f482cf5","Type":"ContainerDied","Data":"bac96369e13e726f7096199645baab99639142b8467621c3ff452fe8f64ded93"} Oct 08 21:02:31 crc kubenswrapper[4669]: I1008 
21:02:31.778228 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"493de98d-6819-413a-8f90-130f2f482cf5","Type":"ContainerDied","Data":"e2f727ddbf338a127e5e006ab13bb62dbd7688201bbb2390868c2451cb3e511f"} Oct 08 21:02:31 crc kubenswrapper[4669]: I1008 21:02:31.778246 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"493de98d-6819-413a-8f90-130f2f482cf5","Type":"ContainerDied","Data":"7e95f94a5849d422de38d75ca69f81fa0e42ba57ce3b8e3939fbd1c6cac38176"} Oct 08 21:02:32 crc kubenswrapper[4669]: I1008 21:02:32.279126 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 21:02:32 crc kubenswrapper[4669]: I1008 21:02:32.441172 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/493de98d-6819-413a-8f90-130f2f482cf5-log-httpd\") pod \"493de98d-6819-413a-8f90-130f2f482cf5\" (UID: \"493de98d-6819-413a-8f90-130f2f482cf5\") " Oct 08 21:02:32 crc kubenswrapper[4669]: I1008 21:02:32.441240 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/493de98d-6819-413a-8f90-130f2f482cf5-scripts\") pod \"493de98d-6819-413a-8f90-130f2f482cf5\" (UID: \"493de98d-6819-413a-8f90-130f2f482cf5\") " Oct 08 21:02:32 crc kubenswrapper[4669]: I1008 21:02:32.441274 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkkgc\" (UniqueName: \"kubernetes.io/projected/493de98d-6819-413a-8f90-130f2f482cf5-kube-api-access-tkkgc\") pod \"493de98d-6819-413a-8f90-130f2f482cf5\" (UID: \"493de98d-6819-413a-8f90-130f2f482cf5\") " Oct 08 21:02:32 crc kubenswrapper[4669]: I1008 21:02:32.441319 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/493de98d-6819-413a-8f90-130f2f482cf5-config-data\") pod \"493de98d-6819-413a-8f90-130f2f482cf5\" (UID: \"493de98d-6819-413a-8f90-130f2f482cf5\") " Oct 08 21:02:32 crc kubenswrapper[4669]: I1008 21:02:32.441374 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/493de98d-6819-413a-8f90-130f2f482cf5-combined-ca-bundle\") pod \"493de98d-6819-413a-8f90-130f2f482cf5\" (UID: \"493de98d-6819-413a-8f90-130f2f482cf5\") " Oct 08 21:02:32 crc kubenswrapper[4669]: I1008 21:02:32.441456 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/493de98d-6819-413a-8f90-130f2f482cf5-sg-core-conf-yaml\") pod \"493de98d-6819-413a-8f90-130f2f482cf5\" (UID: \"493de98d-6819-413a-8f90-130f2f482cf5\") " Oct 08 21:02:32 crc kubenswrapper[4669]: I1008 21:02:32.441501 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/493de98d-6819-413a-8f90-130f2f482cf5-run-httpd\") pod \"493de98d-6819-413a-8f90-130f2f482cf5\" (UID: \"493de98d-6819-413a-8f90-130f2f482cf5\") " Oct 08 21:02:32 crc kubenswrapper[4669]: I1008 21:02:32.443276 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/493de98d-6819-413a-8f90-130f2f482cf5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "493de98d-6819-413a-8f90-130f2f482cf5" (UID: "493de98d-6819-413a-8f90-130f2f482cf5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:02:32 crc kubenswrapper[4669]: I1008 21:02:32.443556 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/493de98d-6819-413a-8f90-130f2f482cf5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "493de98d-6819-413a-8f90-130f2f482cf5" (UID: "493de98d-6819-413a-8f90-130f2f482cf5"). 
InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:02:32 crc kubenswrapper[4669]: I1008 21:02:32.447591 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/493de98d-6819-413a-8f90-130f2f482cf5-scripts" (OuterVolumeSpecName: "scripts") pod "493de98d-6819-413a-8f90-130f2f482cf5" (UID: "493de98d-6819-413a-8f90-130f2f482cf5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:02:32 crc kubenswrapper[4669]: I1008 21:02:32.449213 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/493de98d-6819-413a-8f90-130f2f482cf5-kube-api-access-tkkgc" (OuterVolumeSpecName: "kube-api-access-tkkgc") pod "493de98d-6819-413a-8f90-130f2f482cf5" (UID: "493de98d-6819-413a-8f90-130f2f482cf5"). InnerVolumeSpecName "kube-api-access-tkkgc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:02:32 crc kubenswrapper[4669]: I1008 21:02:32.468567 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/493de98d-6819-413a-8f90-130f2f482cf5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "493de98d-6819-413a-8f90-130f2f482cf5" (UID: "493de98d-6819-413a-8f90-130f2f482cf5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:02:32 crc kubenswrapper[4669]: I1008 21:02:32.516067 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/493de98d-6819-413a-8f90-130f2f482cf5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "493de98d-6819-413a-8f90-130f2f482cf5" (UID: "493de98d-6819-413a-8f90-130f2f482cf5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:02:32 crc kubenswrapper[4669]: I1008 21:02:32.528975 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/493de98d-6819-413a-8f90-130f2f482cf5-config-data" (OuterVolumeSpecName: "config-data") pod "493de98d-6819-413a-8f90-130f2f482cf5" (UID: "493de98d-6819-413a-8f90-130f2f482cf5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:02:32 crc kubenswrapper[4669]: I1008 21:02:32.544768 4669 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/493de98d-6819-413a-8f90-130f2f482cf5-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 21:02:32 crc kubenswrapper[4669]: I1008 21:02:32.544815 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/493de98d-6819-413a-8f90-130f2f482cf5-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 21:02:32 crc kubenswrapper[4669]: I1008 21:02:32.544828 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkkgc\" (UniqueName: \"kubernetes.io/projected/493de98d-6819-413a-8f90-130f2f482cf5-kube-api-access-tkkgc\") on node \"crc\" DevicePath \"\"" Oct 08 21:02:32 crc kubenswrapper[4669]: I1008 21:02:32.544842 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/493de98d-6819-413a-8f90-130f2f482cf5-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 21:02:32 crc kubenswrapper[4669]: I1008 21:02:32.544853 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/493de98d-6819-413a-8f90-130f2f482cf5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:02:32 crc kubenswrapper[4669]: I1008 21:02:32.544864 4669 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/493de98d-6819-413a-8f90-130f2f482cf5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 08 21:02:32 crc kubenswrapper[4669]: I1008 21:02:32.544876 4669 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/493de98d-6819-413a-8f90-130f2f482cf5-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 21:02:32 crc kubenswrapper[4669]: I1008 21:02:32.797069 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"493de98d-6819-413a-8f90-130f2f482cf5","Type":"ContainerDied","Data":"ea9adf949b02b86a4c24484db4293ac9178f158a9808c2746d57929d11b4b0a8"} Oct 08 21:02:32 crc kubenswrapper[4669]: I1008 21:02:32.797147 4669 scope.go:117] "RemoveContainer" containerID="d97984db882c1dcedeb6276c4a7ee58301e7e15e74a659d56b99560a70196438" Oct 08 21:02:32 crc kubenswrapper[4669]: I1008 21:02:32.797154 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 21:02:32 crc kubenswrapper[4669]: I1008 21:02:32.848510 4669 scope.go:117] "RemoveContainer" containerID="bac96369e13e726f7096199645baab99639142b8467621c3ff452fe8f64ded93" Oct 08 21:02:32 crc kubenswrapper[4669]: I1008 21:02:32.866081 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 21:02:32 crc kubenswrapper[4669]: I1008 21:02:32.878138 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 08 21:02:32 crc kubenswrapper[4669]: I1008 21:02:32.881169 4669 scope.go:117] "RemoveContainer" containerID="e2f727ddbf338a127e5e006ab13bb62dbd7688201bbb2390868c2451cb3e511f" Oct 08 21:02:32 crc kubenswrapper[4669]: I1008 21:02:32.895743 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 08 21:02:32 crc kubenswrapper[4669]: E1008 21:02:32.896486 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="493de98d-6819-413a-8f90-130f2f482cf5" 
containerName="proxy-httpd" Oct 08 21:02:32 crc kubenswrapper[4669]: I1008 21:02:32.896680 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="493de98d-6819-413a-8f90-130f2f482cf5" containerName="proxy-httpd" Oct 08 21:02:32 crc kubenswrapper[4669]: E1008 21:02:32.896802 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="493de98d-6819-413a-8f90-130f2f482cf5" containerName="sg-core" Oct 08 21:02:32 crc kubenswrapper[4669]: I1008 21:02:32.896879 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="493de98d-6819-413a-8f90-130f2f482cf5" containerName="sg-core" Oct 08 21:02:32 crc kubenswrapper[4669]: E1008 21:02:32.896970 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="493de98d-6819-413a-8f90-130f2f482cf5" containerName="ceilometer-notification-agent" Oct 08 21:02:32 crc kubenswrapper[4669]: I1008 21:02:32.897058 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="493de98d-6819-413a-8f90-130f2f482cf5" containerName="ceilometer-notification-agent" Oct 08 21:02:32 crc kubenswrapper[4669]: E1008 21:02:32.897189 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="493de98d-6819-413a-8f90-130f2f482cf5" containerName="ceilometer-central-agent" Oct 08 21:02:32 crc kubenswrapper[4669]: I1008 21:02:32.897386 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="493de98d-6819-413a-8f90-130f2f482cf5" containerName="ceilometer-central-agent" Oct 08 21:02:32 crc kubenswrapper[4669]: I1008 21:02:32.897732 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="493de98d-6819-413a-8f90-130f2f482cf5" containerName="ceilometer-notification-agent" Oct 08 21:02:32 crc kubenswrapper[4669]: I1008 21:02:32.898866 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="493de98d-6819-413a-8f90-130f2f482cf5" containerName="sg-core" Oct 08 21:02:32 crc kubenswrapper[4669]: I1008 21:02:32.899061 4669 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="493de98d-6819-413a-8f90-130f2f482cf5" containerName="ceilometer-central-agent" Oct 08 21:02:32 crc kubenswrapper[4669]: I1008 21:02:32.899243 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="493de98d-6819-413a-8f90-130f2f482cf5" containerName="proxy-httpd" Oct 08 21:02:32 crc kubenswrapper[4669]: I1008 21:02:32.902040 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 21:02:32 crc kubenswrapper[4669]: I1008 21:02:32.905141 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 21:02:32 crc kubenswrapper[4669]: I1008 21:02:32.907575 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 08 21:02:32 crc kubenswrapper[4669]: I1008 21:02:32.907894 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 08 21:02:32 crc kubenswrapper[4669]: I1008 21:02:32.911598 4669 scope.go:117] "RemoveContainer" containerID="7e95f94a5849d422de38d75ca69f81fa0e42ba57ce3b8e3939fbd1c6cac38176" Oct 08 21:02:33 crc kubenswrapper[4669]: I1008 21:02:33.055011 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d0475f8-aca1-4fd8-9453-f5272ff2e52c-run-httpd\") pod \"ceilometer-0\" (UID: \"9d0475f8-aca1-4fd8-9453-f5272ff2e52c\") " pod="openstack/ceilometer-0" Oct 08 21:02:33 crc kubenswrapper[4669]: I1008 21:02:33.055082 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d0475f8-aca1-4fd8-9453-f5272ff2e52c-log-httpd\") pod \"ceilometer-0\" (UID: \"9d0475f8-aca1-4fd8-9453-f5272ff2e52c\") " pod="openstack/ceilometer-0" Oct 08 21:02:33 crc kubenswrapper[4669]: I1008 21:02:33.055174 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-f2t9k\" (UniqueName: \"kubernetes.io/projected/9d0475f8-aca1-4fd8-9453-f5272ff2e52c-kube-api-access-f2t9k\") pod \"ceilometer-0\" (UID: \"9d0475f8-aca1-4fd8-9453-f5272ff2e52c\") " pod="openstack/ceilometer-0" Oct 08 21:02:33 crc kubenswrapper[4669]: I1008 21:02:33.055983 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d0475f8-aca1-4fd8-9453-f5272ff2e52c-scripts\") pod \"ceilometer-0\" (UID: \"9d0475f8-aca1-4fd8-9453-f5272ff2e52c\") " pod="openstack/ceilometer-0" Oct 08 21:02:33 crc kubenswrapper[4669]: I1008 21:02:33.056016 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d0475f8-aca1-4fd8-9453-f5272ff2e52c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9d0475f8-aca1-4fd8-9453-f5272ff2e52c\") " pod="openstack/ceilometer-0" Oct 08 21:02:33 crc kubenswrapper[4669]: I1008 21:02:33.056040 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9d0475f8-aca1-4fd8-9453-f5272ff2e52c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9d0475f8-aca1-4fd8-9453-f5272ff2e52c\") " pod="openstack/ceilometer-0" Oct 08 21:02:33 crc kubenswrapper[4669]: I1008 21:02:33.056057 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d0475f8-aca1-4fd8-9453-f5272ff2e52c-config-data\") pod \"ceilometer-0\" (UID: \"9d0475f8-aca1-4fd8-9453-f5272ff2e52c\") " pod="openstack/ceilometer-0" Oct 08 21:02:33 crc kubenswrapper[4669]: I1008 21:02:33.158114 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2t9k\" (UniqueName: \"kubernetes.io/projected/9d0475f8-aca1-4fd8-9453-f5272ff2e52c-kube-api-access-f2t9k\") pod 
\"ceilometer-0\" (UID: \"9d0475f8-aca1-4fd8-9453-f5272ff2e52c\") " pod="openstack/ceilometer-0" Oct 08 21:02:33 crc kubenswrapper[4669]: I1008 21:02:33.158217 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d0475f8-aca1-4fd8-9453-f5272ff2e52c-scripts\") pod \"ceilometer-0\" (UID: \"9d0475f8-aca1-4fd8-9453-f5272ff2e52c\") " pod="openstack/ceilometer-0" Oct 08 21:02:33 crc kubenswrapper[4669]: I1008 21:02:33.158696 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d0475f8-aca1-4fd8-9453-f5272ff2e52c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9d0475f8-aca1-4fd8-9453-f5272ff2e52c\") " pod="openstack/ceilometer-0" Oct 08 21:02:33 crc kubenswrapper[4669]: I1008 21:02:33.158732 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9d0475f8-aca1-4fd8-9453-f5272ff2e52c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9d0475f8-aca1-4fd8-9453-f5272ff2e52c\") " pod="openstack/ceilometer-0" Oct 08 21:02:33 crc kubenswrapper[4669]: I1008 21:02:33.158757 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d0475f8-aca1-4fd8-9453-f5272ff2e52c-config-data\") pod \"ceilometer-0\" (UID: \"9d0475f8-aca1-4fd8-9453-f5272ff2e52c\") " pod="openstack/ceilometer-0" Oct 08 21:02:33 crc kubenswrapper[4669]: I1008 21:02:33.158802 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d0475f8-aca1-4fd8-9453-f5272ff2e52c-run-httpd\") pod \"ceilometer-0\" (UID: \"9d0475f8-aca1-4fd8-9453-f5272ff2e52c\") " pod="openstack/ceilometer-0" Oct 08 21:02:33 crc kubenswrapper[4669]: I1008 21:02:33.158862 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d0475f8-aca1-4fd8-9453-f5272ff2e52c-log-httpd\") pod \"ceilometer-0\" (UID: \"9d0475f8-aca1-4fd8-9453-f5272ff2e52c\") " pod="openstack/ceilometer-0" Oct 08 21:02:33 crc kubenswrapper[4669]: I1008 21:02:33.159742 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d0475f8-aca1-4fd8-9453-f5272ff2e52c-log-httpd\") pod \"ceilometer-0\" (UID: \"9d0475f8-aca1-4fd8-9453-f5272ff2e52c\") " pod="openstack/ceilometer-0" Oct 08 21:02:33 crc kubenswrapper[4669]: I1008 21:02:33.159762 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d0475f8-aca1-4fd8-9453-f5272ff2e52c-run-httpd\") pod \"ceilometer-0\" (UID: \"9d0475f8-aca1-4fd8-9453-f5272ff2e52c\") " pod="openstack/ceilometer-0" Oct 08 21:02:33 crc kubenswrapper[4669]: I1008 21:02:33.163400 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9d0475f8-aca1-4fd8-9453-f5272ff2e52c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9d0475f8-aca1-4fd8-9453-f5272ff2e52c\") " pod="openstack/ceilometer-0" Oct 08 21:02:33 crc kubenswrapper[4669]: I1008 21:02:33.165004 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d0475f8-aca1-4fd8-9453-f5272ff2e52c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9d0475f8-aca1-4fd8-9453-f5272ff2e52c\") " pod="openstack/ceilometer-0" Oct 08 21:02:33 crc kubenswrapper[4669]: I1008 21:02:33.173789 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d0475f8-aca1-4fd8-9453-f5272ff2e52c-config-data\") pod \"ceilometer-0\" (UID: \"9d0475f8-aca1-4fd8-9453-f5272ff2e52c\") " pod="openstack/ceilometer-0" Oct 08 21:02:33 crc kubenswrapper[4669]: I1008 21:02:33.177856 4669 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d0475f8-aca1-4fd8-9453-f5272ff2e52c-scripts\") pod \"ceilometer-0\" (UID: \"9d0475f8-aca1-4fd8-9453-f5272ff2e52c\") " pod="openstack/ceilometer-0" Oct 08 21:02:33 crc kubenswrapper[4669]: I1008 21:02:33.183454 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2t9k\" (UniqueName: \"kubernetes.io/projected/9d0475f8-aca1-4fd8-9453-f5272ff2e52c-kube-api-access-f2t9k\") pod \"ceilometer-0\" (UID: \"9d0475f8-aca1-4fd8-9453-f5272ff2e52c\") " pod="openstack/ceilometer-0" Oct 08 21:02:33 crc kubenswrapper[4669]: I1008 21:02:33.237997 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 21:02:33 crc kubenswrapper[4669]: I1008 21:02:33.342859 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="493de98d-6819-413a-8f90-130f2f482cf5" path="/var/lib/kubelet/pods/493de98d-6819-413a-8f90-130f2f482cf5/volumes" Oct 08 21:02:33 crc kubenswrapper[4669]: I1008 21:02:33.674442 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 21:02:33 crc kubenswrapper[4669]: W1008 21:02:33.678038 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d0475f8_aca1_4fd8_9453_f5272ff2e52c.slice/crio-13d6f2b03f4a6ac3ef64b66ca6de644a341eb62ad71a3da89d2df2ccf1588ad6 WatchSource:0}: Error finding container 13d6f2b03f4a6ac3ef64b66ca6de644a341eb62ad71a3da89d2df2ccf1588ad6: Status 404 returned error can't find the container with id 13d6f2b03f4a6ac3ef64b66ca6de644a341eb62ad71a3da89d2df2ccf1588ad6 Oct 08 21:02:33 crc kubenswrapper[4669]: I1008 21:02:33.807644 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"9d0475f8-aca1-4fd8-9453-f5272ff2e52c","Type":"ContainerStarted","Data":"13d6f2b03f4a6ac3ef64b66ca6de644a341eb62ad71a3da89d2df2ccf1588ad6"} Oct 08 21:02:34 crc kubenswrapper[4669]: I1008 21:02:34.819596 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9d0475f8-aca1-4fd8-9453-f5272ff2e52c","Type":"ContainerStarted","Data":"601b68b4ce37a6e44d05e2e033973f9f08aa0da54764e1c1aa0c4b063999cc25"} Oct 08 21:02:35 crc kubenswrapper[4669]: I1008 21:02:35.829767 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9d0475f8-aca1-4fd8-9453-f5272ff2e52c","Type":"ContainerStarted","Data":"baee525e89992e6f05af170df9569e53a91d6bc34ea705c3811d26a11fdada21"} Oct 08 21:02:36 crc kubenswrapper[4669]: I1008 21:02:36.847105 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9d0475f8-aca1-4fd8-9453-f5272ff2e52c","Type":"ContainerStarted","Data":"69ae486352cddefd10f53088ff25854a19a012ea5dcd638cc24799f7da454c0a"} Oct 08 21:02:37 crc kubenswrapper[4669]: I1008 21:02:37.871470 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9d0475f8-aca1-4fd8-9453-f5272ff2e52c","Type":"ContainerStarted","Data":"03768bffd87f223de2649b541be671cdd0b4f329f831c9ba1825487f1dcbcabd"} Oct 08 21:02:37 crc kubenswrapper[4669]: I1008 21:02:37.871833 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 08 21:02:37 crc kubenswrapper[4669]: I1008 21:02:37.897580 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.148251536 podStartE2EDuration="5.897509823s" podCreationTimestamp="2025-10-08 21:02:32 +0000 UTC" firstStartedPulling="2025-10-08 21:02:33.680224085 +0000 UTC m=+1073.373034758" lastFinishedPulling="2025-10-08 21:02:37.429482362 +0000 UTC m=+1077.122293045" observedRunningTime="2025-10-08 
21:02:37.89595468 +0000 UTC m=+1077.588765403" watchObservedRunningTime="2025-10-08 21:02:37.897509823 +0000 UTC m=+1077.590320506" Oct 08 21:02:39 crc kubenswrapper[4669]: E1008 21:02:39.716768 4669 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14b56cd8_5692_4a65_b8ad_1e39bf253846.slice/crio-c7d1cc9386ab3b92a4931c13ef8cae530e99ac0462fd182d311d2bf28a18131a\": RecentStats: unable to find data in memory cache]" Oct 08 21:02:39 crc kubenswrapper[4669]: I1008 21:02:39.897335 4669 generic.go:334] "Generic (PLEG): container finished" podID="1fe4f9a2-f7a7-433e-a6e9-4811d2424259" containerID="0b3ddbeb7c3054f12a71af868748cddd1dc97afa361e7047de8b21e5eab29011" exitCode=0 Oct 08 21:02:39 crc kubenswrapper[4669]: I1008 21:02:39.897378 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-28wfx" event={"ID":"1fe4f9a2-f7a7-433e-a6e9-4811d2424259","Type":"ContainerDied","Data":"0b3ddbeb7c3054f12a71af868748cddd1dc97afa361e7047de8b21e5eab29011"} Oct 08 21:02:41 crc kubenswrapper[4669]: I1008 21:02:41.286418 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-28wfx" Oct 08 21:02:41 crc kubenswrapper[4669]: I1008 21:02:41.438772 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fe4f9a2-f7a7-433e-a6e9-4811d2424259-scripts\") pod \"1fe4f9a2-f7a7-433e-a6e9-4811d2424259\" (UID: \"1fe4f9a2-f7a7-433e-a6e9-4811d2424259\") " Oct 08 21:02:41 crc kubenswrapper[4669]: I1008 21:02:41.439355 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fe4f9a2-f7a7-433e-a6e9-4811d2424259-config-data\") pod \"1fe4f9a2-f7a7-433e-a6e9-4811d2424259\" (UID: \"1fe4f9a2-f7a7-433e-a6e9-4811d2424259\") " Oct 08 21:02:41 crc kubenswrapper[4669]: I1008 21:02:41.439416 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kz7mn\" (UniqueName: \"kubernetes.io/projected/1fe4f9a2-f7a7-433e-a6e9-4811d2424259-kube-api-access-kz7mn\") pod \"1fe4f9a2-f7a7-433e-a6e9-4811d2424259\" (UID: \"1fe4f9a2-f7a7-433e-a6e9-4811d2424259\") " Oct 08 21:02:41 crc kubenswrapper[4669]: I1008 21:02:41.439452 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fe4f9a2-f7a7-433e-a6e9-4811d2424259-combined-ca-bundle\") pod \"1fe4f9a2-f7a7-433e-a6e9-4811d2424259\" (UID: \"1fe4f9a2-f7a7-433e-a6e9-4811d2424259\") " Oct 08 21:02:41 crc kubenswrapper[4669]: I1008 21:02:41.444489 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fe4f9a2-f7a7-433e-a6e9-4811d2424259-kube-api-access-kz7mn" (OuterVolumeSpecName: "kube-api-access-kz7mn") pod "1fe4f9a2-f7a7-433e-a6e9-4811d2424259" (UID: "1fe4f9a2-f7a7-433e-a6e9-4811d2424259"). InnerVolumeSpecName "kube-api-access-kz7mn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:02:41 crc kubenswrapper[4669]: I1008 21:02:41.446760 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fe4f9a2-f7a7-433e-a6e9-4811d2424259-scripts" (OuterVolumeSpecName: "scripts") pod "1fe4f9a2-f7a7-433e-a6e9-4811d2424259" (UID: "1fe4f9a2-f7a7-433e-a6e9-4811d2424259"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:02:41 crc kubenswrapper[4669]: I1008 21:02:41.471289 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fe4f9a2-f7a7-433e-a6e9-4811d2424259-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1fe4f9a2-f7a7-433e-a6e9-4811d2424259" (UID: "1fe4f9a2-f7a7-433e-a6e9-4811d2424259"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:02:41 crc kubenswrapper[4669]: I1008 21:02:41.481685 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fe4f9a2-f7a7-433e-a6e9-4811d2424259-config-data" (OuterVolumeSpecName: "config-data") pod "1fe4f9a2-f7a7-433e-a6e9-4811d2424259" (UID: "1fe4f9a2-f7a7-433e-a6e9-4811d2424259"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:02:41 crc kubenswrapper[4669]: I1008 21:02:41.541945 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fe4f9a2-f7a7-433e-a6e9-4811d2424259-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 21:02:41 crc kubenswrapper[4669]: I1008 21:02:41.541973 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fe4f9a2-f7a7-433e-a6e9-4811d2424259-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 21:02:41 crc kubenswrapper[4669]: I1008 21:02:41.541982 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kz7mn\" (UniqueName: \"kubernetes.io/projected/1fe4f9a2-f7a7-433e-a6e9-4811d2424259-kube-api-access-kz7mn\") on node \"crc\" DevicePath \"\"" Oct 08 21:02:41 crc kubenswrapper[4669]: I1008 21:02:41.541993 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fe4f9a2-f7a7-433e-a6e9-4811d2424259-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:02:41 crc kubenswrapper[4669]: I1008 21:02:41.916066 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-28wfx" event={"ID":"1fe4f9a2-f7a7-433e-a6e9-4811d2424259","Type":"ContainerDied","Data":"ba0505a1b3602ded95af0febcfe88cefa696eeb7824cd230ba4da49d77e960f5"} Oct 08 21:02:41 crc kubenswrapper[4669]: I1008 21:02:41.916327 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba0505a1b3602ded95af0febcfe88cefa696eeb7824cd230ba4da49d77e960f5" Oct 08 21:02:41 crc kubenswrapper[4669]: I1008 21:02:41.916138 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-28wfx" Oct 08 21:02:42 crc kubenswrapper[4669]: I1008 21:02:42.008962 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 08 21:02:42 crc kubenswrapper[4669]: E1008 21:02:42.009426 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fe4f9a2-f7a7-433e-a6e9-4811d2424259" containerName="nova-cell0-conductor-db-sync" Oct 08 21:02:42 crc kubenswrapper[4669]: I1008 21:02:42.009443 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fe4f9a2-f7a7-433e-a6e9-4811d2424259" containerName="nova-cell0-conductor-db-sync" Oct 08 21:02:42 crc kubenswrapper[4669]: I1008 21:02:42.009871 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fe4f9a2-f7a7-433e-a6e9-4811d2424259" containerName="nova-cell0-conductor-db-sync" Oct 08 21:02:42 crc kubenswrapper[4669]: I1008 21:02:42.010833 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 08 21:02:42 crc kubenswrapper[4669]: I1008 21:02:42.012827 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 08 21:02:42 crc kubenswrapper[4669]: I1008 21:02:42.013155 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-swhg6" Oct 08 21:02:42 crc kubenswrapper[4669]: I1008 21:02:42.019040 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 08 21:02:42 crc kubenswrapper[4669]: I1008 21:02:42.185087 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hf25\" (UniqueName: \"kubernetes.io/projected/eb3e439c-741b-4d40-a85e-1f6da93a485c-kube-api-access-9hf25\") pod \"nova-cell0-conductor-0\" (UID: \"eb3e439c-741b-4d40-a85e-1f6da93a485c\") " pod="openstack/nova-cell0-conductor-0" Oct 08 21:02:42 crc 
kubenswrapper[4669]: I1008 21:02:42.185356 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb3e439c-741b-4d40-a85e-1f6da93a485c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"eb3e439c-741b-4d40-a85e-1f6da93a485c\") " pod="openstack/nova-cell0-conductor-0" Oct 08 21:02:42 crc kubenswrapper[4669]: I1008 21:02:42.185482 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb3e439c-741b-4d40-a85e-1f6da93a485c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"eb3e439c-741b-4d40-a85e-1f6da93a485c\") " pod="openstack/nova-cell0-conductor-0" Oct 08 21:02:42 crc kubenswrapper[4669]: I1008 21:02:42.287403 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hf25\" (UniqueName: \"kubernetes.io/projected/eb3e439c-741b-4d40-a85e-1f6da93a485c-kube-api-access-9hf25\") pod \"nova-cell0-conductor-0\" (UID: \"eb3e439c-741b-4d40-a85e-1f6da93a485c\") " pod="openstack/nova-cell0-conductor-0" Oct 08 21:02:42 crc kubenswrapper[4669]: I1008 21:02:42.287441 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb3e439c-741b-4d40-a85e-1f6da93a485c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"eb3e439c-741b-4d40-a85e-1f6da93a485c\") " pod="openstack/nova-cell0-conductor-0" Oct 08 21:02:42 crc kubenswrapper[4669]: I1008 21:02:42.287522 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb3e439c-741b-4d40-a85e-1f6da93a485c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"eb3e439c-741b-4d40-a85e-1f6da93a485c\") " pod="openstack/nova-cell0-conductor-0" Oct 08 21:02:42 crc kubenswrapper[4669]: I1008 21:02:42.291547 4669 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb3e439c-741b-4d40-a85e-1f6da93a485c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"eb3e439c-741b-4d40-a85e-1f6da93a485c\") " pod="openstack/nova-cell0-conductor-0" Oct 08 21:02:42 crc kubenswrapper[4669]: I1008 21:02:42.293224 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb3e439c-741b-4d40-a85e-1f6da93a485c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"eb3e439c-741b-4d40-a85e-1f6da93a485c\") " pod="openstack/nova-cell0-conductor-0" Oct 08 21:02:42 crc kubenswrapper[4669]: I1008 21:02:42.321605 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hf25\" (UniqueName: \"kubernetes.io/projected/eb3e439c-741b-4d40-a85e-1f6da93a485c-kube-api-access-9hf25\") pod \"nova-cell0-conductor-0\" (UID: \"eb3e439c-741b-4d40-a85e-1f6da93a485c\") " pod="openstack/nova-cell0-conductor-0" Oct 08 21:02:42 crc kubenswrapper[4669]: I1008 21:02:42.400516 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 08 21:02:42 crc kubenswrapper[4669]: I1008 21:02:42.827514 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 08 21:02:42 crc kubenswrapper[4669]: I1008 21:02:42.926640 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"eb3e439c-741b-4d40-a85e-1f6da93a485c","Type":"ContainerStarted","Data":"32a41a540e78ee3683c02f7fe2564ed89268827c117383bfadc5d7f0b77398a0"} Oct 08 21:02:43 crc kubenswrapper[4669]: I1008 21:02:43.185281 4669 patch_prober.go:28] interesting pod/machine-config-daemon-hw2kf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 21:02:43 crc kubenswrapper[4669]: I1008 21:02:43.185660 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 21:02:43 crc kubenswrapper[4669]: I1008 21:02:43.937075 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"eb3e439c-741b-4d40-a85e-1f6da93a485c","Type":"ContainerStarted","Data":"985267ecec915a83317ed561d66fbe42d81d0301025e592700481f551232a3ab"} Oct 08 21:02:43 crc kubenswrapper[4669]: I1008 21:02:43.938121 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 08 21:02:43 crc kubenswrapper[4669]: I1008 21:02:43.957866 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.957850519 podStartE2EDuration="2.957850519s" 
podCreationTimestamp="2025-10-08 21:02:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:02:43.951040329 +0000 UTC m=+1083.643851002" watchObservedRunningTime="2025-10-08 21:02:43.957850519 +0000 UTC m=+1083.650661192" Oct 08 21:02:52 crc kubenswrapper[4669]: I1008 21:02:52.429342 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 08 21:02:52 crc kubenswrapper[4669]: I1008 21:02:52.891727 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-x6p9j"] Oct 08 21:02:52 crc kubenswrapper[4669]: I1008 21:02:52.893159 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-x6p9j" Oct 08 21:02:52 crc kubenswrapper[4669]: I1008 21:02:52.895564 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 08 21:02:52 crc kubenswrapper[4669]: I1008 21:02:52.901755 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 08 21:02:52 crc kubenswrapper[4669]: I1008 21:02:52.901997 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-x6p9j"] Oct 08 21:02:52 crc kubenswrapper[4669]: I1008 21:02:52.993457 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w87qf\" (UniqueName: \"kubernetes.io/projected/f51ff62e-2d14-4205-acb5-1ae440525941-kube-api-access-w87qf\") pod \"nova-cell0-cell-mapping-x6p9j\" (UID: \"f51ff62e-2d14-4205-acb5-1ae440525941\") " pod="openstack/nova-cell0-cell-mapping-x6p9j" Oct 08 21:02:52 crc kubenswrapper[4669]: I1008 21:02:52.996049 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f51ff62e-2d14-4205-acb5-1ae440525941-config-data\") pod \"nova-cell0-cell-mapping-x6p9j\" (UID: \"f51ff62e-2d14-4205-acb5-1ae440525941\") " pod="openstack/nova-cell0-cell-mapping-x6p9j" Oct 08 21:02:52 crc kubenswrapper[4669]: I1008 21:02:52.996267 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f51ff62e-2d14-4205-acb5-1ae440525941-scripts\") pod \"nova-cell0-cell-mapping-x6p9j\" (UID: \"f51ff62e-2d14-4205-acb5-1ae440525941\") " pod="openstack/nova-cell0-cell-mapping-x6p9j" Oct 08 21:02:52 crc kubenswrapper[4669]: I1008 21:02:52.996464 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f51ff62e-2d14-4205-acb5-1ae440525941-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-x6p9j\" (UID: \"f51ff62e-2d14-4205-acb5-1ae440525941\") " pod="openstack/nova-cell0-cell-mapping-x6p9j" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.064793 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.074338 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.080842 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.086428 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.098510 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w87qf\" (UniqueName: \"kubernetes.io/projected/f51ff62e-2d14-4205-acb5-1ae440525941-kube-api-access-w87qf\") pod \"nova-cell0-cell-mapping-x6p9j\" (UID: \"f51ff62e-2d14-4205-acb5-1ae440525941\") " pod="openstack/nova-cell0-cell-mapping-x6p9j" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.098593 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f51ff62e-2d14-4205-acb5-1ae440525941-config-data\") pod \"nova-cell0-cell-mapping-x6p9j\" (UID: \"f51ff62e-2d14-4205-acb5-1ae440525941\") " pod="openstack/nova-cell0-cell-mapping-x6p9j" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.098651 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f51ff62e-2d14-4205-acb5-1ae440525941-scripts\") pod \"nova-cell0-cell-mapping-x6p9j\" (UID: \"f51ff62e-2d14-4205-acb5-1ae440525941\") " pod="openstack/nova-cell0-cell-mapping-x6p9j" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.098716 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f51ff62e-2d14-4205-acb5-1ae440525941-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-x6p9j\" (UID: \"f51ff62e-2d14-4205-acb5-1ae440525941\") " pod="openstack/nova-cell0-cell-mapping-x6p9j" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.115575 4669 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f51ff62e-2d14-4205-acb5-1ae440525941-scripts\") pod \"nova-cell0-cell-mapping-x6p9j\" (UID: \"f51ff62e-2d14-4205-acb5-1ae440525941\") " pod="openstack/nova-cell0-cell-mapping-x6p9j" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.121222 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f51ff62e-2d14-4205-acb5-1ae440525941-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-x6p9j\" (UID: \"f51ff62e-2d14-4205-acb5-1ae440525941\") " pod="openstack/nova-cell0-cell-mapping-x6p9j" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.128132 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f51ff62e-2d14-4205-acb5-1ae440525941-config-data\") pod \"nova-cell0-cell-mapping-x6p9j\" (UID: \"f51ff62e-2d14-4205-acb5-1ae440525941\") " pod="openstack/nova-cell0-cell-mapping-x6p9j" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.137468 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w87qf\" (UniqueName: \"kubernetes.io/projected/f51ff62e-2d14-4205-acb5-1ae440525941-kube-api-access-w87qf\") pod \"nova-cell0-cell-mapping-x6p9j\" (UID: \"f51ff62e-2d14-4205-acb5-1ae440525941\") " pod="openstack/nova-cell0-cell-mapping-x6p9j" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.201485 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a74fd913-c6c2-48c2-b283-e0fd296a24e8-config-data\") pod \"nova-api-0\" (UID: \"a74fd913-c6c2-48c2-b283-e0fd296a24e8\") " pod="openstack/nova-api-0" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.201615 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/a74fd913-c6c2-48c2-b283-e0fd296a24e8-logs\") pod \"nova-api-0\" (UID: \"a74fd913-c6c2-48c2-b283-e0fd296a24e8\") " pod="openstack/nova-api-0" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.201671 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a74fd913-c6c2-48c2-b283-e0fd296a24e8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a74fd913-c6c2-48c2-b283-e0fd296a24e8\") " pod="openstack/nova-api-0" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.201690 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sntwv\" (UniqueName: \"kubernetes.io/projected/a74fd913-c6c2-48c2-b283-e0fd296a24e8-kube-api-access-sntwv\") pod \"nova-api-0\" (UID: \"a74fd913-c6c2-48c2-b283-e0fd296a24e8\") " pod="openstack/nova-api-0" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.209604 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.211843 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.212265 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-x6p9j" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.216022 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.246858 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.305119 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bb5d744-260a-49cc-89bd-bc34cceb4f2e-config-data\") pod \"nova-scheduler-0\" (UID: \"4bb5d744-260a-49cc-89bd-bc34cceb4f2e\") " pod="openstack/nova-scheduler-0" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.305162 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a74fd913-c6c2-48c2-b283-e0fd296a24e8-config-data\") pod \"nova-api-0\" (UID: \"a74fd913-c6c2-48c2-b283-e0fd296a24e8\") " pod="openstack/nova-api-0" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.305245 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a74fd913-c6c2-48c2-b283-e0fd296a24e8-logs\") pod \"nova-api-0\" (UID: \"a74fd913-c6c2-48c2-b283-e0fd296a24e8\") " pod="openstack/nova-api-0" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.305285 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bb5d744-260a-49cc-89bd-bc34cceb4f2e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4bb5d744-260a-49cc-89bd-bc34cceb4f2e\") " pod="openstack/nova-scheduler-0" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.305313 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a74fd913-c6c2-48c2-b283-e0fd296a24e8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a74fd913-c6c2-48c2-b283-e0fd296a24e8\") " pod="openstack/nova-api-0" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.305331 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sntwv\" (UniqueName: \"kubernetes.io/projected/a74fd913-c6c2-48c2-b283-e0fd296a24e8-kube-api-access-sntwv\") pod \"nova-api-0\" (UID: \"a74fd913-c6c2-48c2-b283-e0fd296a24e8\") " pod="openstack/nova-api-0" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.305358 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvc5x\" (UniqueName: \"kubernetes.io/projected/4bb5d744-260a-49cc-89bd-bc34cceb4f2e-kube-api-access-zvc5x\") pod \"nova-scheduler-0\" (UID: \"4bb5d744-260a-49cc-89bd-bc34cceb4f2e\") " pod="openstack/nova-scheduler-0" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.312645 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a74fd913-c6c2-48c2-b283-e0fd296a24e8-logs\") pod \"nova-api-0\" (UID: \"a74fd913-c6c2-48c2-b283-e0fd296a24e8\") " pod="openstack/nova-api-0" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.326192 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a74fd913-c6c2-48c2-b283-e0fd296a24e8-config-data\") pod \"nova-api-0\" (UID: \"a74fd913-c6c2-48c2-b283-e0fd296a24e8\") " pod="openstack/nova-api-0" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.326259 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.327751 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.339766 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a74fd913-c6c2-48c2-b283-e0fd296a24e8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a74fd913-c6c2-48c2-b283-e0fd296a24e8\") " pod="openstack/nova-api-0" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.356002 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.378284 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sntwv\" (UniqueName: \"kubernetes.io/projected/a74fd913-c6c2-48c2-b283-e0fd296a24e8-kube-api-access-sntwv\") pod \"nova-api-0\" (UID: \"a74fd913-c6c2-48c2-b283-e0fd296a24e8\") " pod="openstack/nova-api-0" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.380290 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.391767 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.407683 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39ed1e2f-a51f-45ee-b28f-a80242b7dbc7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"39ed1e2f-a51f-45ee-b28f-a80242b7dbc7\") " pod="openstack/nova-metadata-0" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.407739 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bb5d744-260a-49cc-89bd-bc34cceb4f2e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4bb5d744-260a-49cc-89bd-bc34cceb4f2e\") " pod="openstack/nova-scheduler-0" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.407787 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39ed1e2f-a51f-45ee-b28f-a80242b7dbc7-config-data\") pod \"nova-metadata-0\" (UID: \"39ed1e2f-a51f-45ee-b28f-a80242b7dbc7\") " pod="openstack/nova-metadata-0" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.407807 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvc5x\" (UniqueName: \"kubernetes.io/projected/4bb5d744-260a-49cc-89bd-bc34cceb4f2e-kube-api-access-zvc5x\") pod \"nova-scheduler-0\" (UID: \"4bb5d744-260a-49cc-89bd-bc34cceb4f2e\") " pod="openstack/nova-scheduler-0" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.407847 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bb5d744-260a-49cc-89bd-bc34cceb4f2e-config-data\") pod \"nova-scheduler-0\" (UID: \"4bb5d744-260a-49cc-89bd-bc34cceb4f2e\") " pod="openstack/nova-scheduler-0" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.407876 4669 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39ed1e2f-a51f-45ee-b28f-a80242b7dbc7-logs\") pod \"nova-metadata-0\" (UID: \"39ed1e2f-a51f-45ee-b28f-a80242b7dbc7\") " pod="openstack/nova-metadata-0" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.407891 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbpvw\" (UniqueName: \"kubernetes.io/projected/39ed1e2f-a51f-45ee-b28f-a80242b7dbc7-kube-api-access-bbpvw\") pod \"nova-metadata-0\" (UID: \"39ed1e2f-a51f-45ee-b28f-a80242b7dbc7\") " pod="openstack/nova-metadata-0" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.417844 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bb5d744-260a-49cc-89bd-bc34cceb4f2e-config-data\") pod \"nova-scheduler-0\" (UID: \"4bb5d744-260a-49cc-89bd-bc34cceb4f2e\") " pod="openstack/nova-scheduler-0" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.417904 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-mvqwc"] Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.419366 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-mvqwc" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.446133 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-mvqwc"] Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.465854 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bb5d744-260a-49cc-89bd-bc34cceb4f2e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4bb5d744-260a-49cc-89bd-bc34cceb4f2e\") " pod="openstack/nova-scheduler-0" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.474653 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvc5x\" (UniqueName: \"kubernetes.io/projected/4bb5d744-260a-49cc-89bd-bc34cceb4f2e-kube-api-access-zvc5x\") pod \"nova-scheduler-0\" (UID: \"4bb5d744-260a-49cc-89bd-bc34cceb4f2e\") " pod="openstack/nova-scheduler-0" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.508265 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.509728 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.515466 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.516218 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b0e2762f-41e9-48e0-aa45-0827f176c311-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-mvqwc\" (UID: \"b0e2762f-41e9-48e0-aa45-0827f176c311\") " pod="openstack/dnsmasq-dns-865f5d856f-mvqwc" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.516270 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39ed1e2f-a51f-45ee-b28f-a80242b7dbc7-config-data\") pod \"nova-metadata-0\" (UID: \"39ed1e2f-a51f-45ee-b28f-a80242b7dbc7\") " pod="openstack/nova-metadata-0" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.516295 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0e2762f-41e9-48e0-aa45-0827f176c311-config\") pod \"dnsmasq-dns-865f5d856f-mvqwc\" (UID: \"b0e2762f-41e9-48e0-aa45-0827f176c311\") " pod="openstack/dnsmasq-dns-865f5d856f-mvqwc" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.516318 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0e2762f-41e9-48e0-aa45-0827f176c311-dns-svc\") pod \"dnsmasq-dns-865f5d856f-mvqwc\" (UID: \"b0e2762f-41e9-48e0-aa45-0827f176c311\") " pod="openstack/dnsmasq-dns-865f5d856f-mvqwc" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.516386 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/39ed1e2f-a51f-45ee-b28f-a80242b7dbc7-logs\") pod \"nova-metadata-0\" (UID: \"39ed1e2f-a51f-45ee-b28f-a80242b7dbc7\") " pod="openstack/nova-metadata-0" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.516408 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbpvw\" (UniqueName: \"kubernetes.io/projected/39ed1e2f-a51f-45ee-b28f-a80242b7dbc7-kube-api-access-bbpvw\") pod \"nova-metadata-0\" (UID: \"39ed1e2f-a51f-45ee-b28f-a80242b7dbc7\") " pod="openstack/nova-metadata-0" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.516426 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0e2762f-41e9-48e0-aa45-0827f176c311-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-mvqwc\" (UID: \"b0e2762f-41e9-48e0-aa45-0827f176c311\") " pod="openstack/dnsmasq-dns-865f5d856f-mvqwc" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.516459 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b0e2762f-41e9-48e0-aa45-0827f176c311-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-mvqwc\" (UID: \"b0e2762f-41e9-48e0-aa45-0827f176c311\") " pod="openstack/dnsmasq-dns-865f5d856f-mvqwc" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.516484 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87kq6\" (UniqueName: \"kubernetes.io/projected/b0e2762f-41e9-48e0-aa45-0827f176c311-kube-api-access-87kq6\") pod \"dnsmasq-dns-865f5d856f-mvqwc\" (UID: \"b0e2762f-41e9-48e0-aa45-0827f176c311\") " pod="openstack/dnsmasq-dns-865f5d856f-mvqwc" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.516566 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/39ed1e2f-a51f-45ee-b28f-a80242b7dbc7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"39ed1e2f-a51f-45ee-b28f-a80242b7dbc7\") " pod="openstack/nova-metadata-0" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.517031 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.517782 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39ed1e2f-a51f-45ee-b28f-a80242b7dbc7-logs\") pod \"nova-metadata-0\" (UID: \"39ed1e2f-a51f-45ee-b28f-a80242b7dbc7\") " pod="openstack/nova-metadata-0" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.522691 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39ed1e2f-a51f-45ee-b28f-a80242b7dbc7-config-data\") pod \"nova-metadata-0\" (UID: \"39ed1e2f-a51f-45ee-b28f-a80242b7dbc7\") " pod="openstack/nova-metadata-0" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.532104 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39ed1e2f-a51f-45ee-b28f-a80242b7dbc7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"39ed1e2f-a51f-45ee-b28f-a80242b7dbc7\") " pod="openstack/nova-metadata-0" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.540065 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbpvw\" (UniqueName: \"kubernetes.io/projected/39ed1e2f-a51f-45ee-b28f-a80242b7dbc7-kube-api-access-bbpvw\") pod \"nova-metadata-0\" (UID: \"39ed1e2f-a51f-45ee-b28f-a80242b7dbc7\") " pod="openstack/nova-metadata-0" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.618051 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqjgx\" (UniqueName: 
\"kubernetes.io/projected/35a11691-df3b-4aeb-ad42-a8d4347eeb3f-kube-api-access-dqjgx\") pod \"nova-cell1-novncproxy-0\" (UID: \"35a11691-df3b-4aeb-ad42-a8d4347eeb3f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.618484 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0e2762f-41e9-48e0-aa45-0827f176c311-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-mvqwc\" (UID: \"b0e2762f-41e9-48e0-aa45-0827f176c311\") " pod="openstack/dnsmasq-dns-865f5d856f-mvqwc" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.619523 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0e2762f-41e9-48e0-aa45-0827f176c311-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-mvqwc\" (UID: \"b0e2762f-41e9-48e0-aa45-0827f176c311\") " pod="openstack/dnsmasq-dns-865f5d856f-mvqwc" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.618512 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35a11691-df3b-4aeb-ad42-a8d4347eeb3f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"35a11691-df3b-4aeb-ad42-a8d4347eeb3f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.619654 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b0e2762f-41e9-48e0-aa45-0827f176c311-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-mvqwc\" (UID: \"b0e2762f-41e9-48e0-aa45-0827f176c311\") " pod="openstack/dnsmasq-dns-865f5d856f-mvqwc" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.620673 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/b0e2762f-41e9-48e0-aa45-0827f176c311-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-mvqwc\" (UID: \"b0e2762f-41e9-48e0-aa45-0827f176c311\") " pod="openstack/dnsmasq-dns-865f5d856f-mvqwc" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.620747 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87kq6\" (UniqueName: \"kubernetes.io/projected/b0e2762f-41e9-48e0-aa45-0827f176c311-kube-api-access-87kq6\") pod \"dnsmasq-dns-865f5d856f-mvqwc\" (UID: \"b0e2762f-41e9-48e0-aa45-0827f176c311\") " pod="openstack/dnsmasq-dns-865f5d856f-mvqwc" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.620888 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35a11691-df3b-4aeb-ad42-a8d4347eeb3f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"35a11691-df3b-4aeb-ad42-a8d4347eeb3f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.621284 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b0e2762f-41e9-48e0-aa45-0827f176c311-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-mvqwc\" (UID: \"b0e2762f-41e9-48e0-aa45-0827f176c311\") " pod="openstack/dnsmasq-dns-865f5d856f-mvqwc" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.621371 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0e2762f-41e9-48e0-aa45-0827f176c311-config\") pod \"dnsmasq-dns-865f5d856f-mvqwc\" (UID: \"b0e2762f-41e9-48e0-aa45-0827f176c311\") " pod="openstack/dnsmasq-dns-865f5d856f-mvqwc" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.622278 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b0e2762f-41e9-48e0-aa45-0827f176c311-config\") pod \"dnsmasq-dns-865f5d856f-mvqwc\" (UID: \"b0e2762f-41e9-48e0-aa45-0827f176c311\") " pod="openstack/dnsmasq-dns-865f5d856f-mvqwc" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.622433 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b0e2762f-41e9-48e0-aa45-0827f176c311-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-mvqwc\" (UID: \"b0e2762f-41e9-48e0-aa45-0827f176c311\") " pod="openstack/dnsmasq-dns-865f5d856f-mvqwc" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.621416 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0e2762f-41e9-48e0-aa45-0827f176c311-dns-svc\") pod \"dnsmasq-dns-865f5d856f-mvqwc\" (UID: \"b0e2762f-41e9-48e0-aa45-0827f176c311\") " pod="openstack/dnsmasq-dns-865f5d856f-mvqwc" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.622635 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0e2762f-41e9-48e0-aa45-0827f176c311-dns-svc\") pod \"dnsmasq-dns-865f5d856f-mvqwc\" (UID: \"b0e2762f-41e9-48e0-aa45-0827f176c311\") " pod="openstack/dnsmasq-dns-865f5d856f-mvqwc" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.643472 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87kq6\" (UniqueName: \"kubernetes.io/projected/b0e2762f-41e9-48e0-aa45-0827f176c311-kube-api-access-87kq6\") pod \"dnsmasq-dns-865f5d856f-mvqwc\" (UID: \"b0e2762f-41e9-48e0-aa45-0827f176c311\") " pod="openstack/dnsmasq-dns-865f5d856f-mvqwc" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.683023 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.722384 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.728478 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35a11691-df3b-4aeb-ad42-a8d4347eeb3f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"35a11691-df3b-4aeb-ad42-a8d4347eeb3f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.728590 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqjgx\" (UniqueName: \"kubernetes.io/projected/35a11691-df3b-4aeb-ad42-a8d4347eeb3f-kube-api-access-dqjgx\") pod \"nova-cell1-novncproxy-0\" (UID: \"35a11691-df3b-4aeb-ad42-a8d4347eeb3f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.728649 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35a11691-df3b-4aeb-ad42-a8d4347eeb3f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"35a11691-df3b-4aeb-ad42-a8d4347eeb3f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.736233 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35a11691-df3b-4aeb-ad42-a8d4347eeb3f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"35a11691-df3b-4aeb-ad42-a8d4347eeb3f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.738176 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35a11691-df3b-4aeb-ad42-a8d4347eeb3f-combined-ca-bundle\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"35a11691-df3b-4aeb-ad42-a8d4347eeb3f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.747920 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqjgx\" (UniqueName: \"kubernetes.io/projected/35a11691-df3b-4aeb-ad42-a8d4347eeb3f-kube-api-access-dqjgx\") pod \"nova-cell1-novncproxy-0\" (UID: \"35a11691-df3b-4aeb-ad42-a8d4347eeb3f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.796440 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-mvqwc" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.831655 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.937252 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-x6p9j"] Oct 08 21:02:53 crc kubenswrapper[4669]: I1008 21:02:53.947707 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 08 21:02:54 crc kubenswrapper[4669]: I1008 21:02:54.060277 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-x6p9j" event={"ID":"f51ff62e-2d14-4205-acb5-1ae440525941","Type":"ContainerStarted","Data":"f7996610840aa0b628613c03f78817170619a965aee7f09f630a1fee0b638a90"} Oct 08 21:02:54 crc kubenswrapper[4669]: I1008 21:02:54.061181 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a74fd913-c6c2-48c2-b283-e0fd296a24e8","Type":"ContainerStarted","Data":"c1c4f4750edabb9d53a12eaa647ccf11d97ce11860219680475e9a96d055bd38"} Oct 08 21:02:54 crc kubenswrapper[4669]: I1008 21:02:54.171136 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-295kq"] Oct 08 21:02:54 crc kubenswrapper[4669]: 
I1008 21:02:54.172506 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-295kq" Oct 08 21:02:54 crc kubenswrapper[4669]: I1008 21:02:54.175226 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 08 21:02:54 crc kubenswrapper[4669]: I1008 21:02:54.176504 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 08 21:02:54 crc kubenswrapper[4669]: I1008 21:02:54.182843 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-295kq"] Oct 08 21:02:54 crc kubenswrapper[4669]: I1008 21:02:54.227643 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 21:02:54 crc kubenswrapper[4669]: I1008 21:02:54.236693 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lf7wp\" (UniqueName: \"kubernetes.io/projected/38420235-f71d-4b0a-95ed-d4c86a23e44b-kube-api-access-lf7wp\") pod \"nova-cell1-conductor-db-sync-295kq\" (UID: \"38420235-f71d-4b0a-95ed-d4c86a23e44b\") " pod="openstack/nova-cell1-conductor-db-sync-295kq" Oct 08 21:02:54 crc kubenswrapper[4669]: I1008 21:02:54.236863 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38420235-f71d-4b0a-95ed-d4c86a23e44b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-295kq\" (UID: \"38420235-f71d-4b0a-95ed-d4c86a23e44b\") " pod="openstack/nova-cell1-conductor-db-sync-295kq" Oct 08 21:02:54 crc kubenswrapper[4669]: I1008 21:02:54.237016 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38420235-f71d-4b0a-95ed-d4c86a23e44b-config-data\") pod \"nova-cell1-conductor-db-sync-295kq\" (UID: 
\"38420235-f71d-4b0a-95ed-d4c86a23e44b\") " pod="openstack/nova-cell1-conductor-db-sync-295kq" Oct 08 21:02:54 crc kubenswrapper[4669]: I1008 21:02:54.237132 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38420235-f71d-4b0a-95ed-d4c86a23e44b-scripts\") pod \"nova-cell1-conductor-db-sync-295kq\" (UID: \"38420235-f71d-4b0a-95ed-d4c86a23e44b\") " pod="openstack/nova-cell1-conductor-db-sync-295kq" Oct 08 21:02:54 crc kubenswrapper[4669]: I1008 21:02:54.240174 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 21:02:54 crc kubenswrapper[4669]: I1008 21:02:54.338867 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38420235-f71d-4b0a-95ed-d4c86a23e44b-scripts\") pod \"nova-cell1-conductor-db-sync-295kq\" (UID: \"38420235-f71d-4b0a-95ed-d4c86a23e44b\") " pod="openstack/nova-cell1-conductor-db-sync-295kq" Oct 08 21:02:54 crc kubenswrapper[4669]: I1008 21:02:54.338995 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lf7wp\" (UniqueName: \"kubernetes.io/projected/38420235-f71d-4b0a-95ed-d4c86a23e44b-kube-api-access-lf7wp\") pod \"nova-cell1-conductor-db-sync-295kq\" (UID: \"38420235-f71d-4b0a-95ed-d4c86a23e44b\") " pod="openstack/nova-cell1-conductor-db-sync-295kq" Oct 08 21:02:54 crc kubenswrapper[4669]: I1008 21:02:54.339038 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38420235-f71d-4b0a-95ed-d4c86a23e44b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-295kq\" (UID: \"38420235-f71d-4b0a-95ed-d4c86a23e44b\") " pod="openstack/nova-cell1-conductor-db-sync-295kq" Oct 08 21:02:54 crc kubenswrapper[4669]: I1008 21:02:54.339113 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/38420235-f71d-4b0a-95ed-d4c86a23e44b-config-data\") pod \"nova-cell1-conductor-db-sync-295kq\" (UID: \"38420235-f71d-4b0a-95ed-d4c86a23e44b\") " pod="openstack/nova-cell1-conductor-db-sync-295kq" Oct 08 21:02:54 crc kubenswrapper[4669]: I1008 21:02:54.351418 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38420235-f71d-4b0a-95ed-d4c86a23e44b-scripts\") pod \"nova-cell1-conductor-db-sync-295kq\" (UID: \"38420235-f71d-4b0a-95ed-d4c86a23e44b\") " pod="openstack/nova-cell1-conductor-db-sync-295kq" Oct 08 21:02:54 crc kubenswrapper[4669]: I1008 21:02:54.351727 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38420235-f71d-4b0a-95ed-d4c86a23e44b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-295kq\" (UID: \"38420235-f71d-4b0a-95ed-d4c86a23e44b\") " pod="openstack/nova-cell1-conductor-db-sync-295kq" Oct 08 21:02:54 crc kubenswrapper[4669]: I1008 21:02:54.355283 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38420235-f71d-4b0a-95ed-d4c86a23e44b-config-data\") pod \"nova-cell1-conductor-db-sync-295kq\" (UID: \"38420235-f71d-4b0a-95ed-d4c86a23e44b\") " pod="openstack/nova-cell1-conductor-db-sync-295kq" Oct 08 21:02:54 crc kubenswrapper[4669]: I1008 21:02:54.363087 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lf7wp\" (UniqueName: \"kubernetes.io/projected/38420235-f71d-4b0a-95ed-d4c86a23e44b-kube-api-access-lf7wp\") pod \"nova-cell1-conductor-db-sync-295kq\" (UID: \"38420235-f71d-4b0a-95ed-d4c86a23e44b\") " pod="openstack/nova-cell1-conductor-db-sync-295kq" Oct 08 21:02:54 crc kubenswrapper[4669]: I1008 21:02:54.416469 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-mvqwc"] Oct 08 21:02:54 crc 
kubenswrapper[4669]: I1008 21:02:54.456927 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 21:02:54 crc kubenswrapper[4669]: I1008 21:02:54.624193 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-295kq" Oct 08 21:02:55 crc kubenswrapper[4669]: I1008 21:02:55.091613 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-x6p9j" event={"ID":"f51ff62e-2d14-4205-acb5-1ae440525941","Type":"ContainerStarted","Data":"83b49b92310d005ab3c938c263f2aa57b56a0ef10348756cd3d16e14bc67e1cc"} Oct 08 21:02:55 crc kubenswrapper[4669]: I1008 21:02:55.094455 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"35a11691-df3b-4aeb-ad42-a8d4347eeb3f","Type":"ContainerStarted","Data":"1475137c06069be97af4207086e0e07eb65b0c869c256014427d9b48e1bf0806"} Oct 08 21:02:55 crc kubenswrapper[4669]: I1008 21:02:55.098019 4669 generic.go:334] "Generic (PLEG): container finished" podID="b0e2762f-41e9-48e0-aa45-0827f176c311" containerID="21d1f11f5a343c6701b32a406c30e8e15e5828d8d73781ff65c3a435f38aa818" exitCode=0 Oct 08 21:02:55 crc kubenswrapper[4669]: I1008 21:02:55.098112 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-mvqwc" event={"ID":"b0e2762f-41e9-48e0-aa45-0827f176c311","Type":"ContainerDied","Data":"21d1f11f5a343c6701b32a406c30e8e15e5828d8d73781ff65c3a435f38aa818"} Oct 08 21:02:55 crc kubenswrapper[4669]: I1008 21:02:55.098138 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-mvqwc" event={"ID":"b0e2762f-41e9-48e0-aa45-0827f176c311","Type":"ContainerStarted","Data":"79f80c4428b31e99cdedc2068a8bfa06dbe8550b7bde2eede660a1e3989a39d5"} Oct 08 21:02:55 crc kubenswrapper[4669]: I1008 21:02:55.109928 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"39ed1e2f-a51f-45ee-b28f-a80242b7dbc7","Type":"ContainerStarted","Data":"b38d7e9984ac41f25a680973333aab55220e5205ec5be26dd3b64fe06e998618"} Oct 08 21:02:55 crc kubenswrapper[4669]: I1008 21:02:55.113546 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4bb5d744-260a-49cc-89bd-bc34cceb4f2e","Type":"ContainerStarted","Data":"2380a04673b8dcdf5ace8c226785c46388868bf6e177a1d68c2d5cece3b12350"} Oct 08 21:02:55 crc kubenswrapper[4669]: I1008 21:02:55.117629 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-x6p9j" podStartSLOduration=3.117607397 podStartE2EDuration="3.117607397s" podCreationTimestamp="2025-10-08 21:02:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:02:55.103890596 +0000 UTC m=+1094.796701269" watchObservedRunningTime="2025-10-08 21:02:55.117607397 +0000 UTC m=+1094.810418060" Oct 08 21:02:55 crc kubenswrapper[4669]: W1008 21:02:55.183650 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod38420235_f71d_4b0a_95ed_d4c86a23e44b.slice/crio-c3190c963142a0afded6c8110f08a1361005fbcf0ad140e267aafd31c10168b2 WatchSource:0}: Error finding container c3190c963142a0afded6c8110f08a1361005fbcf0ad140e267aafd31c10168b2: Status 404 returned error can't find the container with id c3190c963142a0afded6c8110f08a1361005fbcf0ad140e267aafd31c10168b2 Oct 08 21:02:55 crc kubenswrapper[4669]: I1008 21:02:55.239767 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-295kq"] Oct 08 21:02:56 crc kubenswrapper[4669]: I1008 21:02:56.142766 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-295kq" 
event={"ID":"38420235-f71d-4b0a-95ed-d4c86a23e44b","Type":"ContainerStarted","Data":"85829c24cdb7bf9fb518f37474e69d1fd3d00723a5bbeb1c7426c0e30629c33d"} Oct 08 21:02:56 crc kubenswrapper[4669]: I1008 21:02:56.144095 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-295kq" event={"ID":"38420235-f71d-4b0a-95ed-d4c86a23e44b","Type":"ContainerStarted","Data":"c3190c963142a0afded6c8110f08a1361005fbcf0ad140e267aafd31c10168b2"} Oct 08 21:02:56 crc kubenswrapper[4669]: I1008 21:02:56.148753 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-mvqwc" event={"ID":"b0e2762f-41e9-48e0-aa45-0827f176c311","Type":"ContainerStarted","Data":"f1cbd5195b48c4af0ade1df1306ba404182668d34f81fe72cca2d900d63cff75"} Oct 08 21:02:56 crc kubenswrapper[4669]: I1008 21:02:56.149160 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-865f5d856f-mvqwc" Oct 08 21:02:56 crc kubenswrapper[4669]: I1008 21:02:56.166074 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-295kq" podStartSLOduration=2.166056029 podStartE2EDuration="2.166056029s" podCreationTimestamp="2025-10-08 21:02:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:02:56.159548278 +0000 UTC m=+1095.852358961" watchObservedRunningTime="2025-10-08 21:02:56.166056029 +0000 UTC m=+1095.858866702" Oct 08 21:02:56 crc kubenswrapper[4669]: I1008 21:02:56.179895 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-865f5d856f-mvqwc" podStartSLOduration=3.179880512 podStartE2EDuration="3.179880512s" podCreationTimestamp="2025-10-08 21:02:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:02:56.178101893 
+0000 UTC m=+1095.870912566" watchObservedRunningTime="2025-10-08 21:02:56.179880512 +0000 UTC m=+1095.872691185" Oct 08 21:02:56 crc kubenswrapper[4669]: I1008 21:02:56.621580 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 21:02:56 crc kubenswrapper[4669]: I1008 21:02:56.629612 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 21:02:58 crc kubenswrapper[4669]: I1008 21:02:58.172296 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"35a11691-df3b-4aeb-ad42-a8d4347eeb3f","Type":"ContainerStarted","Data":"a41bb82cbaebdb51c1b4cf75ebbb094acb0c283db000b45fc3ed68446deaf71f"} Oct 08 21:02:58 crc kubenswrapper[4669]: I1008 21:02:58.172417 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="35a11691-df3b-4aeb-ad42-a8d4347eeb3f" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://a41bb82cbaebdb51c1b4cf75ebbb094acb0c283db000b45fc3ed68446deaf71f" gracePeriod=30 Oct 08 21:02:58 crc kubenswrapper[4669]: I1008 21:02:58.175504 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a74fd913-c6c2-48c2-b283-e0fd296a24e8","Type":"ContainerStarted","Data":"2dbb75cd344cb4aa3fe867031ae6babdf228f3410009c526ca0aeb408565eb0d"} Oct 08 21:02:58 crc kubenswrapper[4669]: I1008 21:02:58.183160 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"39ed1e2f-a51f-45ee-b28f-a80242b7dbc7","Type":"ContainerStarted","Data":"37a976d1b998c34d333fe67f27188e30204e0b7a534c6ec98b61f3a955f0db38"} Oct 08 21:02:58 crc kubenswrapper[4669]: I1008 21:02:58.186023 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"4bb5d744-260a-49cc-89bd-bc34cceb4f2e","Type":"ContainerStarted","Data":"ae8922e342b678e419de4842d7a38114a2b1572ace3b4887df4746409fae5744"} Oct 08 21:02:58 crc kubenswrapper[4669]: I1008 21:02:58.195192 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.957285328 podStartE2EDuration="5.195171801s" podCreationTimestamp="2025-10-08 21:02:53 +0000 UTC" firstStartedPulling="2025-10-08 21:02:54.460355264 +0000 UTC m=+1094.153165937" lastFinishedPulling="2025-10-08 21:02:57.698241737 +0000 UTC m=+1097.391052410" observedRunningTime="2025-10-08 21:02:58.189913775 +0000 UTC m=+1097.882724468" watchObservedRunningTime="2025-10-08 21:02:58.195171801 +0000 UTC m=+1097.887982474" Oct 08 21:02:58 crc kubenswrapper[4669]: I1008 21:02:58.210388 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.742570686 podStartE2EDuration="5.210370642s" podCreationTimestamp="2025-10-08 21:02:53 +0000 UTC" firstStartedPulling="2025-10-08 21:02:54.228678893 +0000 UTC m=+1093.921489566" lastFinishedPulling="2025-10-08 21:02:57.696478849 +0000 UTC m=+1097.389289522" observedRunningTime="2025-10-08 21:02:58.207912564 +0000 UTC m=+1097.900723237" watchObservedRunningTime="2025-10-08 21:02:58.210370642 +0000 UTC m=+1097.903181315" Oct 08 21:02:58 crc kubenswrapper[4669]: I1008 21:02:58.684084 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 08 21:02:58 crc kubenswrapper[4669]: I1008 21:02:58.832467 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 08 21:02:59 crc kubenswrapper[4669]: I1008 21:02:59.199661 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"39ed1e2f-a51f-45ee-b28f-a80242b7dbc7","Type":"ContainerStarted","Data":"cd7d9be57cbf6aefa1dd5987cfda2032e3e6453a0b1de085ce8f810118aacc8d"} Oct 08 21:02:59 crc kubenswrapper[4669]: I1008 21:02:59.199908 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="39ed1e2f-a51f-45ee-b28f-a80242b7dbc7" containerName="nova-metadata-metadata" containerID="cri-o://cd7d9be57cbf6aefa1dd5987cfda2032e3e6453a0b1de085ce8f810118aacc8d" gracePeriod=30 Oct 08 21:02:59 crc kubenswrapper[4669]: I1008 21:02:59.199927 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="39ed1e2f-a51f-45ee-b28f-a80242b7dbc7" containerName="nova-metadata-log" containerID="cri-o://37a976d1b998c34d333fe67f27188e30204e0b7a534c6ec98b61f3a955f0db38" gracePeriod=30 Oct 08 21:02:59 crc kubenswrapper[4669]: I1008 21:02:59.213140 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a74fd913-c6c2-48c2-b283-e0fd296a24e8","Type":"ContainerStarted","Data":"23dd940b20e6b6e16b5eaf9273b656bc4dd6d52c8229780a2e9fd3118726bbca"} Oct 08 21:02:59 crc kubenswrapper[4669]: I1008 21:02:59.233370 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.748707594 podStartE2EDuration="6.233343117s" podCreationTimestamp="2025-10-08 21:02:53 +0000 UTC" firstStartedPulling="2025-10-08 21:02:54.221109623 +0000 UTC m=+1093.913920296" lastFinishedPulling="2025-10-08 21:02:57.705745146 +0000 UTC m=+1097.398555819" observedRunningTime="2025-10-08 21:02:59.230311943 +0000 UTC m=+1098.923122626" watchObservedRunningTime="2025-10-08 21:02:59.233343117 +0000 UTC m=+1098.926153830" Oct 08 21:02:59 crc kubenswrapper[4669]: I1008 21:02:59.254617 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.515013208 
podStartE2EDuration="6.254593497s" podCreationTimestamp="2025-10-08 21:02:53 +0000 UTC" firstStartedPulling="2025-10-08 21:02:53.95763105 +0000 UTC m=+1093.650441723" lastFinishedPulling="2025-10-08 21:02:57.697211339 +0000 UTC m=+1097.390022012" observedRunningTime="2025-10-08 21:02:59.248119867 +0000 UTC m=+1098.940930540" watchObservedRunningTime="2025-10-08 21:02:59.254593497 +0000 UTC m=+1098.947404190" Oct 08 21:02:59 crc kubenswrapper[4669]: I1008 21:02:59.819372 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 21:02:59 crc kubenswrapper[4669]: I1008 21:02:59.856901 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbpvw\" (UniqueName: \"kubernetes.io/projected/39ed1e2f-a51f-45ee-b28f-a80242b7dbc7-kube-api-access-bbpvw\") pod \"39ed1e2f-a51f-45ee-b28f-a80242b7dbc7\" (UID: \"39ed1e2f-a51f-45ee-b28f-a80242b7dbc7\") " Oct 08 21:02:59 crc kubenswrapper[4669]: I1008 21:02:59.856994 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39ed1e2f-a51f-45ee-b28f-a80242b7dbc7-config-data\") pod \"39ed1e2f-a51f-45ee-b28f-a80242b7dbc7\" (UID: \"39ed1e2f-a51f-45ee-b28f-a80242b7dbc7\") " Oct 08 21:02:59 crc kubenswrapper[4669]: I1008 21:02:59.857024 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39ed1e2f-a51f-45ee-b28f-a80242b7dbc7-logs\") pod \"39ed1e2f-a51f-45ee-b28f-a80242b7dbc7\" (UID: \"39ed1e2f-a51f-45ee-b28f-a80242b7dbc7\") " Oct 08 21:02:59 crc kubenswrapper[4669]: I1008 21:02:59.857116 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39ed1e2f-a51f-45ee-b28f-a80242b7dbc7-combined-ca-bundle\") pod \"39ed1e2f-a51f-45ee-b28f-a80242b7dbc7\" (UID: \"39ed1e2f-a51f-45ee-b28f-a80242b7dbc7\") " Oct 
08 21:02:59 crc kubenswrapper[4669]: I1008 21:02:59.857942 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39ed1e2f-a51f-45ee-b28f-a80242b7dbc7-logs" (OuterVolumeSpecName: "logs") pod "39ed1e2f-a51f-45ee-b28f-a80242b7dbc7" (UID: "39ed1e2f-a51f-45ee-b28f-a80242b7dbc7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:02:59 crc kubenswrapper[4669]: I1008 21:02:59.870955 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39ed1e2f-a51f-45ee-b28f-a80242b7dbc7-kube-api-access-bbpvw" (OuterVolumeSpecName: "kube-api-access-bbpvw") pod "39ed1e2f-a51f-45ee-b28f-a80242b7dbc7" (UID: "39ed1e2f-a51f-45ee-b28f-a80242b7dbc7"). InnerVolumeSpecName "kube-api-access-bbpvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:02:59 crc kubenswrapper[4669]: I1008 21:02:59.891709 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39ed1e2f-a51f-45ee-b28f-a80242b7dbc7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "39ed1e2f-a51f-45ee-b28f-a80242b7dbc7" (UID: "39ed1e2f-a51f-45ee-b28f-a80242b7dbc7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:02:59 crc kubenswrapper[4669]: I1008 21:02:59.907855 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39ed1e2f-a51f-45ee-b28f-a80242b7dbc7-config-data" (OuterVolumeSpecName: "config-data") pod "39ed1e2f-a51f-45ee-b28f-a80242b7dbc7" (UID: "39ed1e2f-a51f-45ee-b28f-a80242b7dbc7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:02:59 crc kubenswrapper[4669]: I1008 21:02:59.958781 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbpvw\" (UniqueName: \"kubernetes.io/projected/39ed1e2f-a51f-45ee-b28f-a80242b7dbc7-kube-api-access-bbpvw\") on node \"crc\" DevicePath \"\"" Oct 08 21:02:59 crc kubenswrapper[4669]: I1008 21:02:59.958811 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39ed1e2f-a51f-45ee-b28f-a80242b7dbc7-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 21:02:59 crc kubenswrapper[4669]: I1008 21:02:59.958823 4669 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39ed1e2f-a51f-45ee-b28f-a80242b7dbc7-logs\") on node \"crc\" DevicePath \"\"" Oct 08 21:02:59 crc kubenswrapper[4669]: I1008 21:02:59.958834 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39ed1e2f-a51f-45ee-b28f-a80242b7dbc7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:00 crc kubenswrapper[4669]: I1008 21:03:00.226016 4669 generic.go:334] "Generic (PLEG): container finished" podID="39ed1e2f-a51f-45ee-b28f-a80242b7dbc7" containerID="cd7d9be57cbf6aefa1dd5987cfda2032e3e6453a0b1de085ce8f810118aacc8d" exitCode=0 Oct 08 21:03:00 crc kubenswrapper[4669]: I1008 21:03:00.226056 4669 generic.go:334] "Generic (PLEG): container finished" podID="39ed1e2f-a51f-45ee-b28f-a80242b7dbc7" containerID="37a976d1b998c34d333fe67f27188e30204e0b7a534c6ec98b61f3a955f0db38" exitCode=143 Oct 08 21:03:00 crc kubenswrapper[4669]: I1008 21:03:00.226078 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 21:03:00 crc kubenswrapper[4669]: I1008 21:03:00.226220 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"39ed1e2f-a51f-45ee-b28f-a80242b7dbc7","Type":"ContainerDied","Data":"cd7d9be57cbf6aefa1dd5987cfda2032e3e6453a0b1de085ce8f810118aacc8d"} Oct 08 21:03:00 crc kubenswrapper[4669]: I1008 21:03:00.226298 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"39ed1e2f-a51f-45ee-b28f-a80242b7dbc7","Type":"ContainerDied","Data":"37a976d1b998c34d333fe67f27188e30204e0b7a534c6ec98b61f3a955f0db38"} Oct 08 21:03:00 crc kubenswrapper[4669]: I1008 21:03:00.226312 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"39ed1e2f-a51f-45ee-b28f-a80242b7dbc7","Type":"ContainerDied","Data":"b38d7e9984ac41f25a680973333aab55220e5205ec5be26dd3b64fe06e998618"} Oct 08 21:03:00 crc kubenswrapper[4669]: I1008 21:03:00.226695 4669 scope.go:117] "RemoveContainer" containerID="cd7d9be57cbf6aefa1dd5987cfda2032e3e6453a0b1de085ce8f810118aacc8d" Oct 08 21:03:00 crc kubenswrapper[4669]: I1008 21:03:00.273991 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 21:03:00 crc kubenswrapper[4669]: I1008 21:03:00.276814 4669 scope.go:117] "RemoveContainer" containerID="37a976d1b998c34d333fe67f27188e30204e0b7a534c6ec98b61f3a955f0db38" Oct 08 21:03:00 crc kubenswrapper[4669]: I1008 21:03:00.284813 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 21:03:00 crc kubenswrapper[4669]: I1008 21:03:00.295588 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 08 21:03:00 crc kubenswrapper[4669]: E1008 21:03:00.296053 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39ed1e2f-a51f-45ee-b28f-a80242b7dbc7" containerName="nova-metadata-metadata" Oct 08 21:03:00 crc 
kubenswrapper[4669]: I1008 21:03:00.296070 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="39ed1e2f-a51f-45ee-b28f-a80242b7dbc7" containerName="nova-metadata-metadata" Oct 08 21:03:00 crc kubenswrapper[4669]: E1008 21:03:00.296079 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39ed1e2f-a51f-45ee-b28f-a80242b7dbc7" containerName="nova-metadata-log" Oct 08 21:03:00 crc kubenswrapper[4669]: I1008 21:03:00.296085 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="39ed1e2f-a51f-45ee-b28f-a80242b7dbc7" containerName="nova-metadata-log" Oct 08 21:03:00 crc kubenswrapper[4669]: I1008 21:03:00.296248 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="39ed1e2f-a51f-45ee-b28f-a80242b7dbc7" containerName="nova-metadata-metadata" Oct 08 21:03:00 crc kubenswrapper[4669]: I1008 21:03:00.296267 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="39ed1e2f-a51f-45ee-b28f-a80242b7dbc7" containerName="nova-metadata-log" Oct 08 21:03:00 crc kubenswrapper[4669]: I1008 21:03:00.297946 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 21:03:00 crc kubenswrapper[4669]: I1008 21:03:00.304333 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 08 21:03:00 crc kubenswrapper[4669]: I1008 21:03:00.304492 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 08 21:03:00 crc kubenswrapper[4669]: I1008 21:03:00.306373 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 21:03:00 crc kubenswrapper[4669]: I1008 21:03:00.328711 4669 scope.go:117] "RemoveContainer" containerID="cd7d9be57cbf6aefa1dd5987cfda2032e3e6453a0b1de085ce8f810118aacc8d" Oct 08 21:03:00 crc kubenswrapper[4669]: E1008 21:03:00.329241 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd7d9be57cbf6aefa1dd5987cfda2032e3e6453a0b1de085ce8f810118aacc8d\": container with ID starting with cd7d9be57cbf6aefa1dd5987cfda2032e3e6453a0b1de085ce8f810118aacc8d not found: ID does not exist" containerID="cd7d9be57cbf6aefa1dd5987cfda2032e3e6453a0b1de085ce8f810118aacc8d" Oct 08 21:03:00 crc kubenswrapper[4669]: I1008 21:03:00.329275 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd7d9be57cbf6aefa1dd5987cfda2032e3e6453a0b1de085ce8f810118aacc8d"} err="failed to get container status \"cd7d9be57cbf6aefa1dd5987cfda2032e3e6453a0b1de085ce8f810118aacc8d\": rpc error: code = NotFound desc = could not find container \"cd7d9be57cbf6aefa1dd5987cfda2032e3e6453a0b1de085ce8f810118aacc8d\": container with ID starting with cd7d9be57cbf6aefa1dd5987cfda2032e3e6453a0b1de085ce8f810118aacc8d not found: ID does not exist" Oct 08 21:03:00 crc kubenswrapper[4669]: I1008 21:03:00.329301 4669 scope.go:117] "RemoveContainer" containerID="37a976d1b998c34d333fe67f27188e30204e0b7a534c6ec98b61f3a955f0db38" Oct 08 21:03:00 crc 
kubenswrapper[4669]: E1008 21:03:00.336777 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37a976d1b998c34d333fe67f27188e30204e0b7a534c6ec98b61f3a955f0db38\": container with ID starting with 37a976d1b998c34d333fe67f27188e30204e0b7a534c6ec98b61f3a955f0db38 not found: ID does not exist" containerID="37a976d1b998c34d333fe67f27188e30204e0b7a534c6ec98b61f3a955f0db38" Oct 08 21:03:00 crc kubenswrapper[4669]: I1008 21:03:00.336822 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37a976d1b998c34d333fe67f27188e30204e0b7a534c6ec98b61f3a955f0db38"} err="failed to get container status \"37a976d1b998c34d333fe67f27188e30204e0b7a534c6ec98b61f3a955f0db38\": rpc error: code = NotFound desc = could not find container \"37a976d1b998c34d333fe67f27188e30204e0b7a534c6ec98b61f3a955f0db38\": container with ID starting with 37a976d1b998c34d333fe67f27188e30204e0b7a534c6ec98b61f3a955f0db38 not found: ID does not exist" Oct 08 21:03:00 crc kubenswrapper[4669]: I1008 21:03:00.336855 4669 scope.go:117] "RemoveContainer" containerID="cd7d9be57cbf6aefa1dd5987cfda2032e3e6453a0b1de085ce8f810118aacc8d" Oct 08 21:03:00 crc kubenswrapper[4669]: I1008 21:03:00.337198 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd7d9be57cbf6aefa1dd5987cfda2032e3e6453a0b1de085ce8f810118aacc8d"} err="failed to get container status \"cd7d9be57cbf6aefa1dd5987cfda2032e3e6453a0b1de085ce8f810118aacc8d\": rpc error: code = NotFound desc = could not find container \"cd7d9be57cbf6aefa1dd5987cfda2032e3e6453a0b1de085ce8f810118aacc8d\": container with ID starting with cd7d9be57cbf6aefa1dd5987cfda2032e3e6453a0b1de085ce8f810118aacc8d not found: ID does not exist" Oct 08 21:03:00 crc kubenswrapper[4669]: I1008 21:03:00.337219 4669 scope.go:117] "RemoveContainer" containerID="37a976d1b998c34d333fe67f27188e30204e0b7a534c6ec98b61f3a955f0db38" Oct 08 
21:03:00 crc kubenswrapper[4669]: I1008 21:03:00.337618 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37a976d1b998c34d333fe67f27188e30204e0b7a534c6ec98b61f3a955f0db38"} err="failed to get container status \"37a976d1b998c34d333fe67f27188e30204e0b7a534c6ec98b61f3a955f0db38\": rpc error: code = NotFound desc = could not find container \"37a976d1b998c34d333fe67f27188e30204e0b7a534c6ec98b61f3a955f0db38\": container with ID starting with 37a976d1b998c34d333fe67f27188e30204e0b7a534c6ec98b61f3a955f0db38 not found: ID does not exist" Oct 08 21:03:00 crc kubenswrapper[4669]: I1008 21:03:00.366994 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwq96\" (UniqueName: \"kubernetes.io/projected/2cba1dea-134c-4945-bb7c-e2e7131410e1-kube-api-access-mwq96\") pod \"nova-metadata-0\" (UID: \"2cba1dea-134c-4945-bb7c-e2e7131410e1\") " pod="openstack/nova-metadata-0" Oct 08 21:03:00 crc kubenswrapper[4669]: I1008 21:03:00.367210 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cba1dea-134c-4945-bb7c-e2e7131410e1-config-data\") pod \"nova-metadata-0\" (UID: \"2cba1dea-134c-4945-bb7c-e2e7131410e1\") " pod="openstack/nova-metadata-0" Oct 08 21:03:00 crc kubenswrapper[4669]: I1008 21:03:00.367258 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cba1dea-134c-4945-bb7c-e2e7131410e1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2cba1dea-134c-4945-bb7c-e2e7131410e1\") " pod="openstack/nova-metadata-0" Oct 08 21:03:00 crc kubenswrapper[4669]: I1008 21:03:00.367675 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2cba1dea-134c-4945-bb7c-e2e7131410e1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2cba1dea-134c-4945-bb7c-e2e7131410e1\") " pod="openstack/nova-metadata-0" Oct 08 21:03:00 crc kubenswrapper[4669]: I1008 21:03:00.367799 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cba1dea-134c-4945-bb7c-e2e7131410e1-logs\") pod \"nova-metadata-0\" (UID: \"2cba1dea-134c-4945-bb7c-e2e7131410e1\") " pod="openstack/nova-metadata-0" Oct 08 21:03:00 crc kubenswrapper[4669]: I1008 21:03:00.469136 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cba1dea-134c-4945-bb7c-e2e7131410e1-config-data\") pod \"nova-metadata-0\" (UID: \"2cba1dea-134c-4945-bb7c-e2e7131410e1\") " pod="openstack/nova-metadata-0" Oct 08 21:03:00 crc kubenswrapper[4669]: I1008 21:03:00.469593 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cba1dea-134c-4945-bb7c-e2e7131410e1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2cba1dea-134c-4945-bb7c-e2e7131410e1\") " pod="openstack/nova-metadata-0" Oct 08 21:03:00 crc kubenswrapper[4669]: I1008 21:03:00.469736 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cba1dea-134c-4945-bb7c-e2e7131410e1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2cba1dea-134c-4945-bb7c-e2e7131410e1\") " pod="openstack/nova-metadata-0" Oct 08 21:03:00 crc kubenswrapper[4669]: I1008 21:03:00.469816 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cba1dea-134c-4945-bb7c-e2e7131410e1-logs\") pod \"nova-metadata-0\" (UID: \"2cba1dea-134c-4945-bb7c-e2e7131410e1\") " pod="openstack/nova-metadata-0" 
Oct 08 21:03:00 crc kubenswrapper[4669]: I1008 21:03:00.469872 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwq96\" (UniqueName: \"kubernetes.io/projected/2cba1dea-134c-4945-bb7c-e2e7131410e1-kube-api-access-mwq96\") pod \"nova-metadata-0\" (UID: \"2cba1dea-134c-4945-bb7c-e2e7131410e1\") " pod="openstack/nova-metadata-0" Oct 08 21:03:00 crc kubenswrapper[4669]: I1008 21:03:00.470286 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cba1dea-134c-4945-bb7c-e2e7131410e1-logs\") pod \"nova-metadata-0\" (UID: \"2cba1dea-134c-4945-bb7c-e2e7131410e1\") " pod="openstack/nova-metadata-0" Oct 08 21:03:00 crc kubenswrapper[4669]: I1008 21:03:00.476337 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cba1dea-134c-4945-bb7c-e2e7131410e1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2cba1dea-134c-4945-bb7c-e2e7131410e1\") " pod="openstack/nova-metadata-0" Oct 08 21:03:00 crc kubenswrapper[4669]: I1008 21:03:00.476469 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cba1dea-134c-4945-bb7c-e2e7131410e1-config-data\") pod \"nova-metadata-0\" (UID: \"2cba1dea-134c-4945-bb7c-e2e7131410e1\") " pod="openstack/nova-metadata-0" Oct 08 21:03:00 crc kubenswrapper[4669]: I1008 21:03:00.489500 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cba1dea-134c-4945-bb7c-e2e7131410e1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2cba1dea-134c-4945-bb7c-e2e7131410e1\") " pod="openstack/nova-metadata-0" Oct 08 21:03:00 crc kubenswrapper[4669]: I1008 21:03:00.491152 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwq96\" (UniqueName: 
\"kubernetes.io/projected/2cba1dea-134c-4945-bb7c-e2e7131410e1-kube-api-access-mwq96\") pod \"nova-metadata-0\" (UID: \"2cba1dea-134c-4945-bb7c-e2e7131410e1\") " pod="openstack/nova-metadata-0" Oct 08 21:03:00 crc kubenswrapper[4669]: I1008 21:03:00.612173 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 21:03:01 crc kubenswrapper[4669]: I1008 21:03:01.082982 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 21:03:01 crc kubenswrapper[4669]: I1008 21:03:01.256822 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2cba1dea-134c-4945-bb7c-e2e7131410e1","Type":"ContainerStarted","Data":"a58e6ec54d727674cbc5c16224b60fb45ee1d20b473493a8307c46cffbe37073"} Oct 08 21:03:01 crc kubenswrapper[4669]: I1008 21:03:01.346725 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39ed1e2f-a51f-45ee-b28f-a80242b7dbc7" path="/var/lib/kubelet/pods/39ed1e2f-a51f-45ee-b28f-a80242b7dbc7/volumes" Oct 08 21:03:02 crc kubenswrapper[4669]: I1008 21:03:02.273330 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2cba1dea-134c-4945-bb7c-e2e7131410e1","Type":"ContainerStarted","Data":"042e0cf7399639fccc898d11f91552b71721d1285248714a991cb50d778df497"} Oct 08 21:03:02 crc kubenswrapper[4669]: I1008 21:03:02.273705 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2cba1dea-134c-4945-bb7c-e2e7131410e1","Type":"ContainerStarted","Data":"b833780e0886bc55f2ace70a4f0e80090e512a364b520986847aff805649df99"} Oct 08 21:03:02 crc kubenswrapper[4669]: I1008 21:03:02.275571 4669 generic.go:334] "Generic (PLEG): container finished" podID="f51ff62e-2d14-4205-acb5-1ae440525941" containerID="83b49b92310d005ab3c938c263f2aa57b56a0ef10348756cd3d16e14bc67e1cc" exitCode=0 Oct 08 21:03:02 crc kubenswrapper[4669]: I1008 21:03:02.275611 4669 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-x6p9j" event={"ID":"f51ff62e-2d14-4205-acb5-1ae440525941","Type":"ContainerDied","Data":"83b49b92310d005ab3c938c263f2aa57b56a0ef10348756cd3d16e14bc67e1cc"} Oct 08 21:03:02 crc kubenswrapper[4669]: I1008 21:03:02.278814 4669 generic.go:334] "Generic (PLEG): container finished" podID="38420235-f71d-4b0a-95ed-d4c86a23e44b" containerID="85829c24cdb7bf9fb518f37474e69d1fd3d00723a5bbeb1c7426c0e30629c33d" exitCode=0 Oct 08 21:03:02 crc kubenswrapper[4669]: I1008 21:03:02.278915 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-295kq" event={"ID":"38420235-f71d-4b0a-95ed-d4c86a23e44b","Type":"ContainerDied","Data":"85829c24cdb7bf9fb518f37474e69d1fd3d00723a5bbeb1c7426c0e30629c33d"} Oct 08 21:03:02 crc kubenswrapper[4669]: I1008 21:03:02.306312 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.306280502 podStartE2EDuration="2.306280502s" podCreationTimestamp="2025-10-08 21:03:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:03:02.300137541 +0000 UTC m=+1101.992948284" watchObservedRunningTime="2025-10-08 21:03:02.306280502 +0000 UTC m=+1101.999091205" Oct 08 21:03:03 crc kubenswrapper[4669]: I1008 21:03:03.248651 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 08 21:03:03 crc kubenswrapper[4669]: I1008 21:03:03.393202 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 08 21:03:03 crc kubenswrapper[4669]: I1008 21:03:03.393252 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 08 21:03:03 crc kubenswrapper[4669]: I1008 21:03:03.683985 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/nova-scheduler-0" Oct 08 21:03:03 crc kubenswrapper[4669]: I1008 21:03:03.710990 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 08 21:03:03 crc kubenswrapper[4669]: I1008 21:03:03.783010 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-295kq" Oct 08 21:03:03 crc kubenswrapper[4669]: I1008 21:03:03.789069 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-x6p9j" Oct 08 21:03:03 crc kubenswrapper[4669]: I1008 21:03:03.802746 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-865f5d856f-mvqwc" Oct 08 21:03:03 crc kubenswrapper[4669]: I1008 21:03:03.837521 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lf7wp\" (UniqueName: \"kubernetes.io/projected/38420235-f71d-4b0a-95ed-d4c86a23e44b-kube-api-access-lf7wp\") pod \"38420235-f71d-4b0a-95ed-d4c86a23e44b\" (UID: \"38420235-f71d-4b0a-95ed-d4c86a23e44b\") " Oct 08 21:03:03 crc kubenswrapper[4669]: I1008 21:03:03.837816 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w87qf\" (UniqueName: \"kubernetes.io/projected/f51ff62e-2d14-4205-acb5-1ae440525941-kube-api-access-w87qf\") pod \"f51ff62e-2d14-4205-acb5-1ae440525941\" (UID: \"f51ff62e-2d14-4205-acb5-1ae440525941\") " Oct 08 21:03:03 crc kubenswrapper[4669]: I1008 21:03:03.837929 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38420235-f71d-4b0a-95ed-d4c86a23e44b-combined-ca-bundle\") pod \"38420235-f71d-4b0a-95ed-d4c86a23e44b\" (UID: \"38420235-f71d-4b0a-95ed-d4c86a23e44b\") " Oct 08 21:03:03 crc kubenswrapper[4669]: I1008 21:03:03.838057 4669 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f51ff62e-2d14-4205-acb5-1ae440525941-combined-ca-bundle\") pod \"f51ff62e-2d14-4205-acb5-1ae440525941\" (UID: \"f51ff62e-2d14-4205-acb5-1ae440525941\") " Oct 08 21:03:03 crc kubenswrapper[4669]: I1008 21:03:03.838169 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f51ff62e-2d14-4205-acb5-1ae440525941-scripts\") pod \"f51ff62e-2d14-4205-acb5-1ae440525941\" (UID: \"f51ff62e-2d14-4205-acb5-1ae440525941\") " Oct 08 21:03:03 crc kubenswrapper[4669]: I1008 21:03:03.838309 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f51ff62e-2d14-4205-acb5-1ae440525941-config-data\") pod \"f51ff62e-2d14-4205-acb5-1ae440525941\" (UID: \"f51ff62e-2d14-4205-acb5-1ae440525941\") " Oct 08 21:03:03 crc kubenswrapper[4669]: I1008 21:03:03.838628 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38420235-f71d-4b0a-95ed-d4c86a23e44b-config-data\") pod \"38420235-f71d-4b0a-95ed-d4c86a23e44b\" (UID: \"38420235-f71d-4b0a-95ed-d4c86a23e44b\") " Oct 08 21:03:03 crc kubenswrapper[4669]: I1008 21:03:03.838758 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38420235-f71d-4b0a-95ed-d4c86a23e44b-scripts\") pod \"38420235-f71d-4b0a-95ed-d4c86a23e44b\" (UID: \"38420235-f71d-4b0a-95ed-d4c86a23e44b\") " Oct 08 21:03:03 crc kubenswrapper[4669]: I1008 21:03:03.851073 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f51ff62e-2d14-4205-acb5-1ae440525941-kube-api-access-w87qf" (OuterVolumeSpecName: "kube-api-access-w87qf") pod "f51ff62e-2d14-4205-acb5-1ae440525941" (UID: "f51ff62e-2d14-4205-acb5-1ae440525941"). 
InnerVolumeSpecName "kube-api-access-w87qf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:03:03 crc kubenswrapper[4669]: I1008 21:03:03.857405 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f51ff62e-2d14-4205-acb5-1ae440525941-scripts" (OuterVolumeSpecName: "scripts") pod "f51ff62e-2d14-4205-acb5-1ae440525941" (UID: "f51ff62e-2d14-4205-acb5-1ae440525941"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:03:03 crc kubenswrapper[4669]: I1008 21:03:03.858161 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38420235-f71d-4b0a-95ed-d4c86a23e44b-scripts" (OuterVolumeSpecName: "scripts") pod "38420235-f71d-4b0a-95ed-d4c86a23e44b" (UID: "38420235-f71d-4b0a-95ed-d4c86a23e44b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:03:03 crc kubenswrapper[4669]: I1008 21:03:03.859756 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38420235-f71d-4b0a-95ed-d4c86a23e44b-kube-api-access-lf7wp" (OuterVolumeSpecName: "kube-api-access-lf7wp") pod "38420235-f71d-4b0a-95ed-d4c86a23e44b" (UID: "38420235-f71d-4b0a-95ed-d4c86a23e44b"). InnerVolumeSpecName "kube-api-access-lf7wp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:03:03 crc kubenswrapper[4669]: I1008 21:03:03.892522 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f51ff62e-2d14-4205-acb5-1ae440525941-config-data" (OuterVolumeSpecName: "config-data") pod "f51ff62e-2d14-4205-acb5-1ae440525941" (UID: "f51ff62e-2d14-4205-acb5-1ae440525941"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:03:03 crc kubenswrapper[4669]: I1008 21:03:03.927791 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f51ff62e-2d14-4205-acb5-1ae440525941-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f51ff62e-2d14-4205-acb5-1ae440525941" (UID: "f51ff62e-2d14-4205-acb5-1ae440525941"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:03:03 crc kubenswrapper[4669]: I1008 21:03:03.938125 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-7hsvp"] Oct 08 21:03:03 crc kubenswrapper[4669]: I1008 21:03:03.938415 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bb4fc677f-7hsvp" podUID="f1293b5c-61ce-47f1-a06c-d794994c81f7" containerName="dnsmasq-dns" containerID="cri-o://5e460d76360b08e655955b0b3b171efbb586d7b0976d20a1d19f7d1da176bd68" gracePeriod=10 Oct 08 21:03:03 crc kubenswrapper[4669]: I1008 21:03:03.941873 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lf7wp\" (UniqueName: \"kubernetes.io/projected/38420235-f71d-4b0a-95ed-d4c86a23e44b-kube-api-access-lf7wp\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:03 crc kubenswrapper[4669]: I1008 21:03:03.941890 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w87qf\" (UniqueName: \"kubernetes.io/projected/f51ff62e-2d14-4205-acb5-1ae440525941-kube-api-access-w87qf\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:03 crc kubenswrapper[4669]: I1008 21:03:03.941905 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f51ff62e-2d14-4205-acb5-1ae440525941-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:03 crc kubenswrapper[4669]: I1008 21:03:03.941914 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/f51ff62e-2d14-4205-acb5-1ae440525941-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:03 crc kubenswrapper[4669]: I1008 21:03:03.941922 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f51ff62e-2d14-4205-acb5-1ae440525941-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:03 crc kubenswrapper[4669]: I1008 21:03:03.941931 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38420235-f71d-4b0a-95ed-d4c86a23e44b-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:03 crc kubenswrapper[4669]: I1008 21:03:03.948066 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38420235-f71d-4b0a-95ed-d4c86a23e44b-config-data" (OuterVolumeSpecName: "config-data") pod "38420235-f71d-4b0a-95ed-d4c86a23e44b" (UID: "38420235-f71d-4b0a-95ed-d4c86a23e44b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:03:03 crc kubenswrapper[4669]: I1008 21:03:03.950612 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38420235-f71d-4b0a-95ed-d4c86a23e44b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "38420235-f71d-4b0a-95ed-d4c86a23e44b" (UID: "38420235-f71d-4b0a-95ed-d4c86a23e44b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.055055 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38420235-f71d-4b0a-95ed-d4c86a23e44b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.055304 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38420235-f71d-4b0a-95ed-d4c86a23e44b-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.312568 4669 generic.go:334] "Generic (PLEG): container finished" podID="f1293b5c-61ce-47f1-a06c-d794994c81f7" containerID="5e460d76360b08e655955b0b3b171efbb586d7b0976d20a1d19f7d1da176bd68" exitCode=0 Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.312624 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-7hsvp" event={"ID":"f1293b5c-61ce-47f1-a06c-d794994c81f7","Type":"ContainerDied","Data":"5e460d76360b08e655955b0b3b171efbb586d7b0976d20a1d19f7d1da176bd68"} Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.317398 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-x6p9j" Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.317661 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-x6p9j" event={"ID":"f51ff62e-2d14-4205-acb5-1ae440525941","Type":"ContainerDied","Data":"f7996610840aa0b628613c03f78817170619a965aee7f09f630a1fee0b638a90"} Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.317765 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7996610840aa0b628613c03f78817170619a965aee7f09f630a1fee0b638a90" Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.321547 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-295kq" Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.322251 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-295kq" event={"ID":"38420235-f71d-4b0a-95ed-d4c86a23e44b","Type":"ContainerDied","Data":"c3190c963142a0afded6c8110f08a1361005fbcf0ad140e267aafd31c10168b2"} Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.322291 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3190c963142a0afded6c8110f08a1361005fbcf0ad140e267aafd31c10168b2" Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.366879 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.431844 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 08 21:03:04 crc kubenswrapper[4669]: E1008 21:03:04.432612 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38420235-f71d-4b0a-95ed-d4c86a23e44b" containerName="nova-cell1-conductor-db-sync" Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.432631 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="38420235-f71d-4b0a-95ed-d4c86a23e44b" containerName="nova-cell1-conductor-db-sync" Oct 08 21:03:04 crc kubenswrapper[4669]: E1008 21:03:04.432652 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f51ff62e-2d14-4205-acb5-1ae440525941" containerName="nova-manage" Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.432660 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="f51ff62e-2d14-4205-acb5-1ae440525941" containerName="nova-manage" Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.432868 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="38420235-f71d-4b0a-95ed-d4c86a23e44b" containerName="nova-cell1-conductor-db-sync" Oct 08 21:03:04 crc 
kubenswrapper[4669]: I1008 21:03:04.432890 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="f51ff62e-2d14-4205-acb5-1ae440525941" containerName="nova-manage" Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.433557 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.436170 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.442114 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.463792 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-7hsvp" Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.476076 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a74fd913-c6c2-48c2-b283-e0fd296a24e8" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.188:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.476388 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a74fd913-c6c2-48c2-b283-e0fd296a24e8" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.188:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.494976 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c4f1ade-7c5a-45f6-8b72-33a192186209-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"7c4f1ade-7c5a-45f6-8b72-33a192186209\") " 
pod="openstack/nova-cell1-conductor-0" Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.495108 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c4f1ade-7c5a-45f6-8b72-33a192186209-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7c4f1ade-7c5a-45f6-8b72-33a192186209\") " pod="openstack/nova-cell1-conductor-0" Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.495156 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vghz\" (UniqueName: \"kubernetes.io/projected/7c4f1ade-7c5a-45f6-8b72-33a192186209-kube-api-access-4vghz\") pod \"nova-cell1-conductor-0\" (UID: \"7c4f1ade-7c5a-45f6-8b72-33a192186209\") " pod="openstack/nova-cell1-conductor-0" Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.596800 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1293b5c-61ce-47f1-a06c-d794994c81f7-ovsdbserver-sb\") pod \"f1293b5c-61ce-47f1-a06c-d794994c81f7\" (UID: \"f1293b5c-61ce-47f1-a06c-d794994c81f7\") " Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.596978 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1293b5c-61ce-47f1-a06c-d794994c81f7-dns-swift-storage-0\") pod \"f1293b5c-61ce-47f1-a06c-d794994c81f7\" (UID: \"f1293b5c-61ce-47f1-a06c-d794994c81f7\") " Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.597020 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1293b5c-61ce-47f1-a06c-d794994c81f7-ovsdbserver-nb\") pod \"f1293b5c-61ce-47f1-a06c-d794994c81f7\" (UID: \"f1293b5c-61ce-47f1-a06c-d794994c81f7\") " Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.597052 4669 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1293b5c-61ce-47f1-a06c-d794994c81f7-dns-svc\") pod \"f1293b5c-61ce-47f1-a06c-d794994c81f7\" (UID: \"f1293b5c-61ce-47f1-a06c-d794994c81f7\") " Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.597091 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1293b5c-61ce-47f1-a06c-d794994c81f7-config\") pod \"f1293b5c-61ce-47f1-a06c-d794994c81f7\" (UID: \"f1293b5c-61ce-47f1-a06c-d794994c81f7\") " Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.597173 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prxhj\" (UniqueName: \"kubernetes.io/projected/f1293b5c-61ce-47f1-a06c-d794994c81f7-kube-api-access-prxhj\") pod \"f1293b5c-61ce-47f1-a06c-d794994c81f7\" (UID: \"f1293b5c-61ce-47f1-a06c-d794994c81f7\") " Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.598047 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c4f1ade-7c5a-45f6-8b72-33a192186209-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7c4f1ade-7c5a-45f6-8b72-33a192186209\") " pod="openstack/nova-cell1-conductor-0" Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.598160 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vghz\" (UniqueName: \"kubernetes.io/projected/7c4f1ade-7c5a-45f6-8b72-33a192186209-kube-api-access-4vghz\") pod \"nova-cell1-conductor-0\" (UID: \"7c4f1ade-7c5a-45f6-8b72-33a192186209\") " pod="openstack/nova-cell1-conductor-0" Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.598373 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c4f1ade-7c5a-45f6-8b72-33a192186209-combined-ca-bundle\") 
pod \"nova-cell1-conductor-0\" (UID: \"7c4f1ade-7c5a-45f6-8b72-33a192186209\") " pod="openstack/nova-cell1-conductor-0" Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.607232 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c4f1ade-7c5a-45f6-8b72-33a192186209-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"7c4f1ade-7c5a-45f6-8b72-33a192186209\") " pod="openstack/nova-cell1-conductor-0" Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.608669 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1293b5c-61ce-47f1-a06c-d794994c81f7-kube-api-access-prxhj" (OuterVolumeSpecName: "kube-api-access-prxhj") pod "f1293b5c-61ce-47f1-a06c-d794994c81f7" (UID: "f1293b5c-61ce-47f1-a06c-d794994c81f7"). InnerVolumeSpecName "kube-api-access-prxhj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.611361 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c4f1ade-7c5a-45f6-8b72-33a192186209-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7c4f1ade-7c5a-45f6-8b72-33a192186209\") " pod="openstack/nova-cell1-conductor-0" Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.616784 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.617195 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a74fd913-c6c2-48c2-b283-e0fd296a24e8" containerName="nova-api-log" containerID="cri-o://2dbb75cd344cb4aa3fe867031ae6babdf228f3410009c526ca0aeb408565eb0d" gracePeriod=30 Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.617463 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" 
podUID="a74fd913-c6c2-48c2-b283-e0fd296a24e8" containerName="nova-api-api" containerID="cri-o://23dd940b20e6b6e16b5eaf9273b656bc4dd6d52c8229780a2e9fd3118726bbca" gracePeriod=30 Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.640503 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.641162 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2cba1dea-134c-4945-bb7c-e2e7131410e1" containerName="nova-metadata-log" containerID="cri-o://b833780e0886bc55f2ace70a4f0e80090e512a364b520986847aff805649df99" gracePeriod=30 Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.641242 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2cba1dea-134c-4945-bb7c-e2e7131410e1" containerName="nova-metadata-metadata" containerID="cri-o://042e0cf7399639fccc898d11f91552b71721d1285248714a991cb50d778df497" gracePeriod=30 Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.647136 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vghz\" (UniqueName: \"kubernetes.io/projected/7c4f1ade-7c5a-45f6-8b72-33a192186209-kube-api-access-4vghz\") pod \"nova-cell1-conductor-0\" (UID: \"7c4f1ade-7c5a-45f6-8b72-33a192186209\") " pod="openstack/nova-cell1-conductor-0" Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.676085 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1293b5c-61ce-47f1-a06c-d794994c81f7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f1293b5c-61ce-47f1-a06c-d794994c81f7" (UID: "f1293b5c-61ce-47f1-a06c-d794994c81f7"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.686454 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1293b5c-61ce-47f1-a06c-d794994c81f7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f1293b5c-61ce-47f1-a06c-d794994c81f7" (UID: "f1293b5c-61ce-47f1-a06c-d794994c81f7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.700837 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prxhj\" (UniqueName: \"kubernetes.io/projected/f1293b5c-61ce-47f1-a06c-d794994c81f7-kube-api-access-prxhj\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.700874 4669 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1293b5c-61ce-47f1-a06c-d794994c81f7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.700886 4669 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1293b5c-61ce-47f1-a06c-d794994c81f7-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.717291 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1293b5c-61ce-47f1-a06c-d794994c81f7-config" (OuterVolumeSpecName: "config") pod "f1293b5c-61ce-47f1-a06c-d794994c81f7" (UID: "f1293b5c-61ce-47f1-a06c-d794994c81f7"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.723227 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1293b5c-61ce-47f1-a06c-d794994c81f7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f1293b5c-61ce-47f1-a06c-d794994c81f7" (UID: "f1293b5c-61ce-47f1-a06c-d794994c81f7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.725164 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1293b5c-61ce-47f1-a06c-d794994c81f7-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f1293b5c-61ce-47f1-a06c-d794994c81f7" (UID: "f1293b5c-61ce-47f1-a06c-d794994c81f7"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.760964 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.802488 4669 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1293b5c-61ce-47f1-a06c-d794994c81f7-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.802659 4669 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1293b5c-61ce-47f1-a06c-d794994c81f7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.802732 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1293b5c-61ce-47f1-a06c-d794994c81f7-config\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:04 crc kubenswrapper[4669]: I1008 21:03:04.938121 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.252925 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.268416 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.333449 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-7hsvp" Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.345921 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"7c4f1ade-7c5a-45f6-8b72-33a192186209","Type":"ContainerStarted","Data":"0417197182a398f2b1b22c543ff97d6f33a496510df885718fc6df2be4ec4111"} Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.345963 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-7hsvp" event={"ID":"f1293b5c-61ce-47f1-a06c-d794994c81f7","Type":"ContainerDied","Data":"de8a8907bb615f51390d53286a104a9a25d66bce0f1928c94ba3efffd7eab244"} Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.345986 4669 scope.go:117] "RemoveContainer" containerID="5e460d76360b08e655955b0b3b171efbb586d7b0976d20a1d19f7d1da176bd68" Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.347768 4669 generic.go:334] "Generic (PLEG): container finished" podID="a74fd913-c6c2-48c2-b283-e0fd296a24e8" containerID="2dbb75cd344cb4aa3fe867031ae6babdf228f3410009c526ca0aeb408565eb0d" exitCode=143 Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.347815 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a74fd913-c6c2-48c2-b283-e0fd296a24e8","Type":"ContainerDied","Data":"2dbb75cd344cb4aa3fe867031ae6babdf228f3410009c526ca0aeb408565eb0d"} Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.358865 4669 generic.go:334] "Generic (PLEG): container finished" podID="2cba1dea-134c-4945-bb7c-e2e7131410e1" containerID="042e0cf7399639fccc898d11f91552b71721d1285248714a991cb50d778df497" exitCode=0 Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.359025 4669 generic.go:334] "Generic (PLEG): container finished" podID="2cba1dea-134c-4945-bb7c-e2e7131410e1" containerID="b833780e0886bc55f2ace70a4f0e80090e512a364b520986847aff805649df99" exitCode=143 Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.359741 
4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.359896 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2cba1dea-134c-4945-bb7c-e2e7131410e1","Type":"ContainerDied","Data":"042e0cf7399639fccc898d11f91552b71721d1285248714a991cb50d778df497"} Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.359921 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2cba1dea-134c-4945-bb7c-e2e7131410e1","Type":"ContainerDied","Data":"b833780e0886bc55f2ace70a4f0e80090e512a364b520986847aff805649df99"} Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.359931 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2cba1dea-134c-4945-bb7c-e2e7131410e1","Type":"ContainerDied","Data":"a58e6ec54d727674cbc5c16224b60fb45ee1d20b473493a8307c46cffbe37073"} Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.378645 4669 scope.go:117] "RemoveContainer" containerID="0b46ad63619da8550faed69c810af7109139e2aa004b65527888e2d323cdfa71" Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.379362 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-7hsvp"] Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.388011 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-7hsvp"] Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.416924 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cba1dea-134c-4945-bb7c-e2e7131410e1-combined-ca-bundle\") pod \"2cba1dea-134c-4945-bb7c-e2e7131410e1\" (UID: \"2cba1dea-134c-4945-bb7c-e2e7131410e1\") " Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.417023 4669 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-mwq96\" (UniqueName: \"kubernetes.io/projected/2cba1dea-134c-4945-bb7c-e2e7131410e1-kube-api-access-mwq96\") pod \"2cba1dea-134c-4945-bb7c-e2e7131410e1\" (UID: \"2cba1dea-134c-4945-bb7c-e2e7131410e1\") " Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.417062 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cba1dea-134c-4945-bb7c-e2e7131410e1-config-data\") pod \"2cba1dea-134c-4945-bb7c-e2e7131410e1\" (UID: \"2cba1dea-134c-4945-bb7c-e2e7131410e1\") " Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.417106 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cba1dea-134c-4945-bb7c-e2e7131410e1-logs\") pod \"2cba1dea-134c-4945-bb7c-e2e7131410e1\" (UID: \"2cba1dea-134c-4945-bb7c-e2e7131410e1\") " Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.417156 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cba1dea-134c-4945-bb7c-e2e7131410e1-nova-metadata-tls-certs\") pod \"2cba1dea-134c-4945-bb7c-e2e7131410e1\" (UID: \"2cba1dea-134c-4945-bb7c-e2e7131410e1\") " Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.418473 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cba1dea-134c-4945-bb7c-e2e7131410e1-logs" (OuterVolumeSpecName: "logs") pod "2cba1dea-134c-4945-bb7c-e2e7131410e1" (UID: "2cba1dea-134c-4945-bb7c-e2e7131410e1"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.429023 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cba1dea-134c-4945-bb7c-e2e7131410e1-kube-api-access-mwq96" (OuterVolumeSpecName: "kube-api-access-mwq96") pod "2cba1dea-134c-4945-bb7c-e2e7131410e1" (UID: "2cba1dea-134c-4945-bb7c-e2e7131410e1"). InnerVolumeSpecName "kube-api-access-mwq96". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.436845 4669 scope.go:117] "RemoveContainer" containerID="042e0cf7399639fccc898d11f91552b71721d1285248714a991cb50d778df497" Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.454163 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cba1dea-134c-4945-bb7c-e2e7131410e1-config-data" (OuterVolumeSpecName: "config-data") pod "2cba1dea-134c-4945-bb7c-e2e7131410e1" (UID: "2cba1dea-134c-4945-bb7c-e2e7131410e1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.466248 4669 scope.go:117] "RemoveContainer" containerID="b833780e0886bc55f2ace70a4f0e80090e512a364b520986847aff805649df99" Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.479721 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cba1dea-134c-4945-bb7c-e2e7131410e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2cba1dea-134c-4945-bb7c-e2e7131410e1" (UID: "2cba1dea-134c-4945-bb7c-e2e7131410e1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.494759 4669 scope.go:117] "RemoveContainer" containerID="042e0cf7399639fccc898d11f91552b71721d1285248714a991cb50d778df497" Oct 08 21:03:05 crc kubenswrapper[4669]: E1008 21:03:05.496100 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"042e0cf7399639fccc898d11f91552b71721d1285248714a991cb50d778df497\": container with ID starting with 042e0cf7399639fccc898d11f91552b71721d1285248714a991cb50d778df497 not found: ID does not exist" containerID="042e0cf7399639fccc898d11f91552b71721d1285248714a991cb50d778df497" Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.496132 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"042e0cf7399639fccc898d11f91552b71721d1285248714a991cb50d778df497"} err="failed to get container status \"042e0cf7399639fccc898d11f91552b71721d1285248714a991cb50d778df497\": rpc error: code = NotFound desc = could not find container \"042e0cf7399639fccc898d11f91552b71721d1285248714a991cb50d778df497\": container with ID starting with 042e0cf7399639fccc898d11f91552b71721d1285248714a991cb50d778df497 not found: ID does not exist" Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.496346 4669 scope.go:117] "RemoveContainer" containerID="b833780e0886bc55f2ace70a4f0e80090e512a364b520986847aff805649df99" Oct 08 21:03:05 crc kubenswrapper[4669]: E1008 21:03:05.499017 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b833780e0886bc55f2ace70a4f0e80090e512a364b520986847aff805649df99\": container with ID starting with b833780e0886bc55f2ace70a4f0e80090e512a364b520986847aff805649df99 not found: ID does not exist" containerID="b833780e0886bc55f2ace70a4f0e80090e512a364b520986847aff805649df99" Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.499042 
4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b833780e0886bc55f2ace70a4f0e80090e512a364b520986847aff805649df99"} err="failed to get container status \"b833780e0886bc55f2ace70a4f0e80090e512a364b520986847aff805649df99\": rpc error: code = NotFound desc = could not find container \"b833780e0886bc55f2ace70a4f0e80090e512a364b520986847aff805649df99\": container with ID starting with b833780e0886bc55f2ace70a4f0e80090e512a364b520986847aff805649df99 not found: ID does not exist" Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.499057 4669 scope.go:117] "RemoveContainer" containerID="042e0cf7399639fccc898d11f91552b71721d1285248714a991cb50d778df497" Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.499756 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"042e0cf7399639fccc898d11f91552b71721d1285248714a991cb50d778df497"} err="failed to get container status \"042e0cf7399639fccc898d11f91552b71721d1285248714a991cb50d778df497\": rpc error: code = NotFound desc = could not find container \"042e0cf7399639fccc898d11f91552b71721d1285248714a991cb50d778df497\": container with ID starting with 042e0cf7399639fccc898d11f91552b71721d1285248714a991cb50d778df497 not found: ID does not exist" Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.499805 4669 scope.go:117] "RemoveContainer" containerID="b833780e0886bc55f2ace70a4f0e80090e512a364b520986847aff805649df99" Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.500096 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b833780e0886bc55f2ace70a4f0e80090e512a364b520986847aff805649df99"} err="failed to get container status \"b833780e0886bc55f2ace70a4f0e80090e512a364b520986847aff805649df99\": rpc error: code = NotFound desc = could not find container \"b833780e0886bc55f2ace70a4f0e80090e512a364b520986847aff805649df99\": container with ID starting with 
b833780e0886bc55f2ace70a4f0e80090e512a364b520986847aff805649df99 not found: ID does not exist" Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.501714 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cba1dea-134c-4945-bb7c-e2e7131410e1-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "2cba1dea-134c-4945-bb7c-e2e7131410e1" (UID: "2cba1dea-134c-4945-bb7c-e2e7131410e1"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.520421 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cba1dea-134c-4945-bb7c-e2e7131410e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.520461 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwq96\" (UniqueName: \"kubernetes.io/projected/2cba1dea-134c-4945-bb7c-e2e7131410e1-kube-api-access-mwq96\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.520477 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cba1dea-134c-4945-bb7c-e2e7131410e1-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.520489 4669 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cba1dea-134c-4945-bb7c-e2e7131410e1-logs\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.520501 4669 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cba1dea-134c-4945-bb7c-e2e7131410e1-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.718166 4669 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.736339 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.748107 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 08 21:03:05 crc kubenswrapper[4669]: E1008 21:03:05.748676 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cba1dea-134c-4945-bb7c-e2e7131410e1" containerName="nova-metadata-log" Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.748702 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cba1dea-134c-4945-bb7c-e2e7131410e1" containerName="nova-metadata-log" Oct 08 21:03:05 crc kubenswrapper[4669]: E1008 21:03:05.748724 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cba1dea-134c-4945-bb7c-e2e7131410e1" containerName="nova-metadata-metadata" Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.748735 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cba1dea-134c-4945-bb7c-e2e7131410e1" containerName="nova-metadata-metadata" Oct 08 21:03:05 crc kubenswrapper[4669]: E1008 21:03:05.748748 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1293b5c-61ce-47f1-a06c-d794994c81f7" containerName="init" Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.748759 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1293b5c-61ce-47f1-a06c-d794994c81f7" containerName="init" Oct 08 21:03:05 crc kubenswrapper[4669]: E1008 21:03:05.748784 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1293b5c-61ce-47f1-a06c-d794994c81f7" containerName="dnsmasq-dns" Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.748793 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1293b5c-61ce-47f1-a06c-d794994c81f7" containerName="dnsmasq-dns" Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.749020 4669 
memory_manager.go:354] "RemoveStaleState removing state" podUID="2cba1dea-134c-4945-bb7c-e2e7131410e1" containerName="nova-metadata-metadata" Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.749058 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1293b5c-61ce-47f1-a06c-d794994c81f7" containerName="dnsmasq-dns" Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.749083 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cba1dea-134c-4945-bb7c-e2e7131410e1" containerName="nova-metadata-log" Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.750356 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.752189 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.760413 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.761169 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.825080 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4838748-5f6b-4ca4-a828-150c8213ce8e-logs\") pod \"nova-metadata-0\" (UID: \"b4838748-5f6b-4ca4-a828-150c8213ce8e\") " pod="openstack/nova-metadata-0" Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.825398 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lccd5\" (UniqueName: \"kubernetes.io/projected/b4838748-5f6b-4ca4-a828-150c8213ce8e-kube-api-access-lccd5\") pod \"nova-metadata-0\" (UID: \"b4838748-5f6b-4ca4-a828-150c8213ce8e\") " pod="openstack/nova-metadata-0" Oct 08 21:03:05 crc 
kubenswrapper[4669]: I1008 21:03:05.825541 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4838748-5f6b-4ca4-a828-150c8213ce8e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b4838748-5f6b-4ca4-a828-150c8213ce8e\") " pod="openstack/nova-metadata-0" Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.825680 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4838748-5f6b-4ca4-a828-150c8213ce8e-config-data\") pod \"nova-metadata-0\" (UID: \"b4838748-5f6b-4ca4-a828-150c8213ce8e\") " pod="openstack/nova-metadata-0" Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.825780 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4838748-5f6b-4ca4-a828-150c8213ce8e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b4838748-5f6b-4ca4-a828-150c8213ce8e\") " pod="openstack/nova-metadata-0" Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.927321 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4838748-5f6b-4ca4-a828-150c8213ce8e-logs\") pod \"nova-metadata-0\" (UID: \"b4838748-5f6b-4ca4-a828-150c8213ce8e\") " pod="openstack/nova-metadata-0" Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.927388 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lccd5\" (UniqueName: \"kubernetes.io/projected/b4838748-5f6b-4ca4-a828-150c8213ce8e-kube-api-access-lccd5\") pod \"nova-metadata-0\" (UID: \"b4838748-5f6b-4ca4-a828-150c8213ce8e\") " pod="openstack/nova-metadata-0" Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.927420 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4838748-5f6b-4ca4-a828-150c8213ce8e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b4838748-5f6b-4ca4-a828-150c8213ce8e\") " pod="openstack/nova-metadata-0" Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.927454 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4838748-5f6b-4ca4-a828-150c8213ce8e-config-data\") pod \"nova-metadata-0\" (UID: \"b4838748-5f6b-4ca4-a828-150c8213ce8e\") " pod="openstack/nova-metadata-0" Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.927475 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4838748-5f6b-4ca4-a828-150c8213ce8e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b4838748-5f6b-4ca4-a828-150c8213ce8e\") " pod="openstack/nova-metadata-0" Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.927845 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4838748-5f6b-4ca4-a828-150c8213ce8e-logs\") pod \"nova-metadata-0\" (UID: \"b4838748-5f6b-4ca4-a828-150c8213ce8e\") " pod="openstack/nova-metadata-0" Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.931182 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4838748-5f6b-4ca4-a828-150c8213ce8e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b4838748-5f6b-4ca4-a828-150c8213ce8e\") " pod="openstack/nova-metadata-0" Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.933118 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4838748-5f6b-4ca4-a828-150c8213ce8e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b4838748-5f6b-4ca4-a828-150c8213ce8e\") " 
pod="openstack/nova-metadata-0" Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.933433 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4838748-5f6b-4ca4-a828-150c8213ce8e-config-data\") pod \"nova-metadata-0\" (UID: \"b4838748-5f6b-4ca4-a828-150c8213ce8e\") " pod="openstack/nova-metadata-0" Oct 08 21:03:05 crc kubenswrapper[4669]: I1008 21:03:05.952655 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lccd5\" (UniqueName: \"kubernetes.io/projected/b4838748-5f6b-4ca4-a828-150c8213ce8e-kube-api-access-lccd5\") pod \"nova-metadata-0\" (UID: \"b4838748-5f6b-4ca4-a828-150c8213ce8e\") " pod="openstack/nova-metadata-0" Oct 08 21:03:06 crc kubenswrapper[4669]: I1008 21:03:06.082146 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 21:03:06 crc kubenswrapper[4669]: I1008 21:03:06.367203 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"7c4f1ade-7c5a-45f6-8b72-33a192186209","Type":"ContainerStarted","Data":"6bcadb0301d78778ed3ca841ecd12af0851b812315d3e0034c791f07112c7d52"} Oct 08 21:03:06 crc kubenswrapper[4669]: I1008 21:03:06.367623 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 08 21:03:06 crc kubenswrapper[4669]: I1008 21:03:06.380871 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="4bb5d744-260a-49cc-89bd-bc34cceb4f2e" containerName="nova-scheduler-scheduler" containerID="cri-o://ae8922e342b678e419de4842d7a38114a2b1572ace3b4887df4746409fae5744" gracePeriod=30 Oct 08 21:03:06 crc kubenswrapper[4669]: I1008 21:03:06.400155 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.400136678 podStartE2EDuration="2.400136678s" 
podCreationTimestamp="2025-10-08 21:03:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:03:06.384930322 +0000 UTC m=+1106.077740995" watchObservedRunningTime="2025-10-08 21:03:06.400136678 +0000 UTC m=+1106.092947351" Oct 08 21:03:06 crc kubenswrapper[4669]: I1008 21:03:06.530542 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 21:03:07 crc kubenswrapper[4669]: I1008 21:03:07.353195 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cba1dea-134c-4945-bb7c-e2e7131410e1" path="/var/lib/kubelet/pods/2cba1dea-134c-4945-bb7c-e2e7131410e1/volumes" Oct 08 21:03:07 crc kubenswrapper[4669]: I1008 21:03:07.354659 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1293b5c-61ce-47f1-a06c-d794994c81f7" path="/var/lib/kubelet/pods/f1293b5c-61ce-47f1-a06c-d794994c81f7/volumes" Oct 08 21:03:07 crc kubenswrapper[4669]: I1008 21:03:07.391382 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b4838748-5f6b-4ca4-a828-150c8213ce8e","Type":"ContainerStarted","Data":"8be69866a732ffea8191978e1828ed1cf02fae30b571748593020e5c32b89dce"} Oct 08 21:03:07 crc kubenswrapper[4669]: I1008 21:03:07.391657 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b4838748-5f6b-4ca4-a828-150c8213ce8e","Type":"ContainerStarted","Data":"307c4a31d0307311a14a41924910329c0a92f9d633afdfb4b3e46de6be1f5a0a"} Oct 08 21:03:07 crc kubenswrapper[4669]: I1008 21:03:07.391748 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b4838748-5f6b-4ca4-a828-150c8213ce8e","Type":"ContainerStarted","Data":"0c5f35b0706309ac6ac71f67cc6e94012eedef7b880accb9a0a36d1d2d4210b1"} Oct 08 21:03:07 crc kubenswrapper[4669]: I1008 21:03:07.414815 4669 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.414799226 podStartE2EDuration="2.414799226s" podCreationTimestamp="2025-10-08 21:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:03:07.409059729 +0000 UTC m=+1107.101870412" watchObservedRunningTime="2025-10-08 21:03:07.414799226 +0000 UTC m=+1107.107609899" Oct 08 21:03:07 crc kubenswrapper[4669]: I1008 21:03:07.549555 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 21:03:07 crc kubenswrapper[4669]: I1008 21:03:07.549763 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="1e47bce4-587b-4864-86ea-a1e2a7987779" containerName="kube-state-metrics" containerID="cri-o://ec32e3d6107b78f665d8249a8578cac9540f7164bd8fde5403055ff27163ecc0" gracePeriod=30 Oct 08 21:03:07 crc kubenswrapper[4669]: I1008 21:03:07.991550 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 08 21:03:08 crc kubenswrapper[4669]: I1008 21:03:08.068272 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krzg5\" (UniqueName: \"kubernetes.io/projected/1e47bce4-587b-4864-86ea-a1e2a7987779-kube-api-access-krzg5\") pod \"1e47bce4-587b-4864-86ea-a1e2a7987779\" (UID: \"1e47bce4-587b-4864-86ea-a1e2a7987779\") " Oct 08 21:03:08 crc kubenswrapper[4669]: I1008 21:03:08.074320 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e47bce4-587b-4864-86ea-a1e2a7987779-kube-api-access-krzg5" (OuterVolumeSpecName: "kube-api-access-krzg5") pod "1e47bce4-587b-4864-86ea-a1e2a7987779" (UID: "1e47bce4-587b-4864-86ea-a1e2a7987779"). InnerVolumeSpecName "kube-api-access-krzg5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:03:08 crc kubenswrapper[4669]: I1008 21:03:08.170439 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krzg5\" (UniqueName: \"kubernetes.io/projected/1e47bce4-587b-4864-86ea-a1e2a7987779-kube-api-access-krzg5\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:08 crc kubenswrapper[4669]: I1008 21:03:08.413938 4669 generic.go:334] "Generic (PLEG): container finished" podID="1e47bce4-587b-4864-86ea-a1e2a7987779" containerID="ec32e3d6107b78f665d8249a8578cac9540f7164bd8fde5403055ff27163ecc0" exitCode=2 Oct 08 21:03:08 crc kubenswrapper[4669]: I1008 21:03:08.414923 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 08 21:03:08 crc kubenswrapper[4669]: I1008 21:03:08.418089 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1e47bce4-587b-4864-86ea-a1e2a7987779","Type":"ContainerDied","Data":"ec32e3d6107b78f665d8249a8578cac9540f7164bd8fde5403055ff27163ecc0"} Oct 08 21:03:08 crc kubenswrapper[4669]: I1008 21:03:08.418280 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1e47bce4-587b-4864-86ea-a1e2a7987779","Type":"ContainerDied","Data":"48ddd5abea97bcb5f4d2c16fdb1a4d9b716a85e6a003098401c763ffd93c92d7"} Oct 08 21:03:08 crc kubenswrapper[4669]: I1008 21:03:08.418372 4669 scope.go:117] "RemoveContainer" containerID="ec32e3d6107b78f665d8249a8578cac9540f7164bd8fde5403055ff27163ecc0" Oct 08 21:03:08 crc kubenswrapper[4669]: I1008 21:03:08.453839 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 21:03:08 crc kubenswrapper[4669]: I1008 21:03:08.464581 4669 scope.go:117] "RemoveContainer" containerID="ec32e3d6107b78f665d8249a8578cac9540f7164bd8fde5403055ff27163ecc0" Oct 08 21:03:08 crc kubenswrapper[4669]: E1008 21:03:08.465220 4669 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec32e3d6107b78f665d8249a8578cac9540f7164bd8fde5403055ff27163ecc0\": container with ID starting with ec32e3d6107b78f665d8249a8578cac9540f7164bd8fde5403055ff27163ecc0 not found: ID does not exist" containerID="ec32e3d6107b78f665d8249a8578cac9540f7164bd8fde5403055ff27163ecc0" Oct 08 21:03:08 crc kubenswrapper[4669]: I1008 21:03:08.465271 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec32e3d6107b78f665d8249a8578cac9540f7164bd8fde5403055ff27163ecc0"} err="failed to get container status \"ec32e3d6107b78f665d8249a8578cac9540f7164bd8fde5403055ff27163ecc0\": rpc error: code = NotFound desc = could not find container \"ec32e3d6107b78f665d8249a8578cac9540f7164bd8fde5403055ff27163ecc0\": container with ID starting with ec32e3d6107b78f665d8249a8578cac9540f7164bd8fde5403055ff27163ecc0 not found: ID does not exist" Oct 08 21:03:08 crc kubenswrapper[4669]: I1008 21:03:08.469520 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 21:03:08 crc kubenswrapper[4669]: I1008 21:03:08.479382 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 21:03:08 crc kubenswrapper[4669]: E1008 21:03:08.480683 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e47bce4-587b-4864-86ea-a1e2a7987779" containerName="kube-state-metrics" Oct 08 21:03:08 crc kubenswrapper[4669]: I1008 21:03:08.480706 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e47bce4-587b-4864-86ea-a1e2a7987779" containerName="kube-state-metrics" Oct 08 21:03:08 crc kubenswrapper[4669]: I1008 21:03:08.480908 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e47bce4-587b-4864-86ea-a1e2a7987779" containerName="kube-state-metrics" Oct 08 21:03:08 crc kubenswrapper[4669]: I1008 21:03:08.481554 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 08 21:03:08 crc kubenswrapper[4669]: I1008 21:03:08.486132 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 08 21:03:08 crc kubenswrapper[4669]: I1008 21:03:08.487090 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 08 21:03:08 crc kubenswrapper[4669]: I1008 21:03:08.497472 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 21:03:08 crc kubenswrapper[4669]: I1008 21:03:08.578988 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/8af63aa0-e5df-488c-b5e4-4677c9d0f2de-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"8af63aa0-e5df-488c-b5e4-4677c9d0f2de\") " pod="openstack/kube-state-metrics-0" Oct 08 21:03:08 crc kubenswrapper[4669]: I1008 21:03:08.579045 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8af63aa0-e5df-488c-b5e4-4677c9d0f2de-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"8af63aa0-e5df-488c-b5e4-4677c9d0f2de\") " pod="openstack/kube-state-metrics-0" Oct 08 21:03:08 crc kubenswrapper[4669]: I1008 21:03:08.579121 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8ddb\" (UniqueName: \"kubernetes.io/projected/8af63aa0-e5df-488c-b5e4-4677c9d0f2de-kube-api-access-d8ddb\") pod \"kube-state-metrics-0\" (UID: \"8af63aa0-e5df-488c-b5e4-4677c9d0f2de\") " pod="openstack/kube-state-metrics-0" Oct 08 21:03:08 crc kubenswrapper[4669]: I1008 21:03:08.579166 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8af63aa0-e5df-488c-b5e4-4677c9d0f2de-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"8af63aa0-e5df-488c-b5e4-4677c9d0f2de\") " pod="openstack/kube-state-metrics-0" Oct 08 21:03:08 crc kubenswrapper[4669]: I1008 21:03:08.680934 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/8af63aa0-e5df-488c-b5e4-4677c9d0f2de-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"8af63aa0-e5df-488c-b5e4-4677c9d0f2de\") " pod="openstack/kube-state-metrics-0" Oct 08 21:03:08 crc kubenswrapper[4669]: I1008 21:03:08.680989 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8af63aa0-e5df-488c-b5e4-4677c9d0f2de-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"8af63aa0-e5df-488c-b5e4-4677c9d0f2de\") " pod="openstack/kube-state-metrics-0" Oct 08 21:03:08 crc kubenswrapper[4669]: I1008 21:03:08.681031 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8ddb\" (UniqueName: \"kubernetes.io/projected/8af63aa0-e5df-488c-b5e4-4677c9d0f2de-kube-api-access-d8ddb\") pod \"kube-state-metrics-0\" (UID: \"8af63aa0-e5df-488c-b5e4-4677c9d0f2de\") " pod="openstack/kube-state-metrics-0" Oct 08 21:03:08 crc kubenswrapper[4669]: I1008 21:03:08.681079 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/8af63aa0-e5df-488c-b5e4-4677c9d0f2de-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"8af63aa0-e5df-488c-b5e4-4677c9d0f2de\") " pod="openstack/kube-state-metrics-0" Oct 08 21:03:08 crc kubenswrapper[4669]: I1008 21:03:08.685458 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/8af63aa0-e5df-488c-b5e4-4677c9d0f2de-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"8af63aa0-e5df-488c-b5e4-4677c9d0f2de\") " pod="openstack/kube-state-metrics-0" Oct 08 21:03:08 crc kubenswrapper[4669]: I1008 21:03:08.685559 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8af63aa0-e5df-488c-b5e4-4677c9d0f2de-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"8af63aa0-e5df-488c-b5e4-4677c9d0f2de\") " pod="openstack/kube-state-metrics-0" Oct 08 21:03:08 crc kubenswrapper[4669]: I1008 21:03:08.686355 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/8af63aa0-e5df-488c-b5e4-4677c9d0f2de-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"8af63aa0-e5df-488c-b5e4-4677c9d0f2de\") " pod="openstack/kube-state-metrics-0" Oct 08 21:03:08 crc kubenswrapper[4669]: E1008 21:03:08.686601 4669 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ae8922e342b678e419de4842d7a38114a2b1572ace3b4887df4746409fae5744" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 08 21:03:08 crc kubenswrapper[4669]: E1008 21:03:08.690901 4669 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ae8922e342b678e419de4842d7a38114a2b1572ace3b4887df4746409fae5744" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 08 21:03:08 crc kubenswrapper[4669]: E1008 21:03:08.695016 4669 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , 
exit code -1" containerID="ae8922e342b678e419de4842d7a38114a2b1572ace3b4887df4746409fae5744" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 08 21:03:08 crc kubenswrapper[4669]: E1008 21:03:08.695048 4669 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="4bb5d744-260a-49cc-89bd-bc34cceb4f2e" containerName="nova-scheduler-scheduler" Oct 08 21:03:08 crc kubenswrapper[4669]: I1008 21:03:08.701603 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8ddb\" (UniqueName: \"kubernetes.io/projected/8af63aa0-e5df-488c-b5e4-4677c9d0f2de-kube-api-access-d8ddb\") pod \"kube-state-metrics-0\" (UID: \"8af63aa0-e5df-488c-b5e4-4677c9d0f2de\") " pod="openstack/kube-state-metrics-0" Oct 08 21:03:08 crc kubenswrapper[4669]: I1008 21:03:08.806600 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 08 21:03:09 crc kubenswrapper[4669]: I1008 21:03:09.307773 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 21:03:09 crc kubenswrapper[4669]: I1008 21:03:09.322703 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 21:03:09 crc kubenswrapper[4669]: I1008 21:03:09.323013 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9d0475f8-aca1-4fd8-9453-f5272ff2e52c" containerName="ceilometer-central-agent" containerID="cri-o://601b68b4ce37a6e44d05e2e033973f9f08aa0da54764e1c1aa0c4b063999cc25" gracePeriod=30 Oct 08 21:03:09 crc kubenswrapper[4669]: I1008 21:03:09.323117 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9d0475f8-aca1-4fd8-9453-f5272ff2e52c" containerName="sg-core" containerID="cri-o://69ae486352cddefd10f53088ff25854a19a012ea5dcd638cc24799f7da454c0a" gracePeriod=30 Oct 08 21:03:09 crc kubenswrapper[4669]: I1008 21:03:09.323199 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9d0475f8-aca1-4fd8-9453-f5272ff2e52c" containerName="ceilometer-notification-agent" containerID="cri-o://baee525e89992e6f05af170df9569e53a91d6bc34ea705c3811d26a11fdada21" gracePeriod=30 Oct 08 21:03:09 crc kubenswrapper[4669]: I1008 21:03:09.323265 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9d0475f8-aca1-4fd8-9453-f5272ff2e52c" containerName="proxy-httpd" containerID="cri-o://03768bffd87f223de2649b541be671cdd0b4f329f831c9ba1825487f1dcbcabd" gracePeriod=30 Oct 08 21:03:09 crc kubenswrapper[4669]: I1008 21:03:09.349168 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e47bce4-587b-4864-86ea-a1e2a7987779" 
path="/var/lib/kubelet/pods/1e47bce4-587b-4864-86ea-a1e2a7987779/volumes" Oct 08 21:03:09 crc kubenswrapper[4669]: I1008 21:03:09.423044 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8af63aa0-e5df-488c-b5e4-4677c9d0f2de","Type":"ContainerStarted","Data":"e65c585ca852eb9d0911ebf1dacbd96676e2be9fc0cffb57534d1589f828646e"} Oct 08 21:03:09 crc kubenswrapper[4669]: I1008 21:03:09.904831 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.005492 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bb5d744-260a-49cc-89bd-bc34cceb4f2e-config-data\") pod \"4bb5d744-260a-49cc-89bd-bc34cceb4f2e\" (UID: \"4bb5d744-260a-49cc-89bd-bc34cceb4f2e\") " Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.005674 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvc5x\" (UniqueName: \"kubernetes.io/projected/4bb5d744-260a-49cc-89bd-bc34cceb4f2e-kube-api-access-zvc5x\") pod \"4bb5d744-260a-49cc-89bd-bc34cceb4f2e\" (UID: \"4bb5d744-260a-49cc-89bd-bc34cceb4f2e\") " Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.005704 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bb5d744-260a-49cc-89bd-bc34cceb4f2e-combined-ca-bundle\") pod \"4bb5d744-260a-49cc-89bd-bc34cceb4f2e\" (UID: \"4bb5d744-260a-49cc-89bd-bc34cceb4f2e\") " Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.010017 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb5d744-260a-49cc-89bd-bc34cceb4f2e-kube-api-access-zvc5x" (OuterVolumeSpecName: "kube-api-access-zvc5x") pod "4bb5d744-260a-49cc-89bd-bc34cceb4f2e" (UID: "4bb5d744-260a-49cc-89bd-bc34cceb4f2e"). 
InnerVolumeSpecName "kube-api-access-zvc5x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.035900 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bb5d744-260a-49cc-89bd-bc34cceb4f2e-config-data" (OuterVolumeSpecName: "config-data") pod "4bb5d744-260a-49cc-89bd-bc34cceb4f2e" (UID: "4bb5d744-260a-49cc-89bd-bc34cceb4f2e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.041623 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bb5d744-260a-49cc-89bd-bc34cceb4f2e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4bb5d744-260a-49cc-89bd-bc34cceb4f2e" (UID: "4bb5d744-260a-49cc-89bd-bc34cceb4f2e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.108394 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bb5d744-260a-49cc-89bd-bc34cceb4f2e-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.108430 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvc5x\" (UniqueName: \"kubernetes.io/projected/4bb5d744-260a-49cc-89bd-bc34cceb4f2e-kube-api-access-zvc5x\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.108448 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bb5d744-260a-49cc-89bd-bc34cceb4f2e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.396461 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.449090 4669 generic.go:334] "Generic (PLEG): container finished" podID="a74fd913-c6c2-48c2-b283-e0fd296a24e8" containerID="23dd940b20e6b6e16b5eaf9273b656bc4dd6d52c8229780a2e9fd3118726bbca" exitCode=0 Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.449133 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a74fd913-c6c2-48c2-b283-e0fd296a24e8","Type":"ContainerDied","Data":"23dd940b20e6b6e16b5eaf9273b656bc4dd6d52c8229780a2e9fd3118726bbca"} Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.449175 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.449200 4669 scope.go:117] "RemoveContainer" containerID="23dd940b20e6b6e16b5eaf9273b656bc4dd6d52c8229780a2e9fd3118726bbca" Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.449182 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a74fd913-c6c2-48c2-b283-e0fd296a24e8","Type":"ContainerDied","Data":"c1c4f4750edabb9d53a12eaa647ccf11d97ce11860219680475e9a96d055bd38"} Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.454209 4669 generic.go:334] "Generic (PLEG): container finished" podID="4bb5d744-260a-49cc-89bd-bc34cceb4f2e" containerID="ae8922e342b678e419de4842d7a38114a2b1572ace3b4887df4746409fae5744" exitCode=0 Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.454333 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4bb5d744-260a-49cc-89bd-bc34cceb4f2e","Type":"ContainerDied","Data":"ae8922e342b678e419de4842d7a38114a2b1572ace3b4887df4746409fae5744"} Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.454426 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"4bb5d744-260a-49cc-89bd-bc34cceb4f2e","Type":"ContainerDied","Data":"2380a04673b8dcdf5ace8c226785c46388868bf6e177a1d68c2d5cece3b12350"} Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.454626 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.458654 4669 generic.go:334] "Generic (PLEG): container finished" podID="9d0475f8-aca1-4fd8-9453-f5272ff2e52c" containerID="03768bffd87f223de2649b541be671cdd0b4f329f831c9ba1825487f1dcbcabd" exitCode=0 Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.458676 4669 generic.go:334] "Generic (PLEG): container finished" podID="9d0475f8-aca1-4fd8-9453-f5272ff2e52c" containerID="69ae486352cddefd10f53088ff25854a19a012ea5dcd638cc24799f7da454c0a" exitCode=2 Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.458683 4669 generic.go:334] "Generic (PLEG): container finished" podID="9d0475f8-aca1-4fd8-9453-f5272ff2e52c" containerID="601b68b4ce37a6e44d05e2e033973f9f08aa0da54764e1c1aa0c4b063999cc25" exitCode=0 Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.458683 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9d0475f8-aca1-4fd8-9453-f5272ff2e52c","Type":"ContainerDied","Data":"03768bffd87f223de2649b541be671cdd0b4f329f831c9ba1825487f1dcbcabd"} Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.458718 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9d0475f8-aca1-4fd8-9453-f5272ff2e52c","Type":"ContainerDied","Data":"69ae486352cddefd10f53088ff25854a19a012ea5dcd638cc24799f7da454c0a"} Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.458735 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9d0475f8-aca1-4fd8-9453-f5272ff2e52c","Type":"ContainerDied","Data":"601b68b4ce37a6e44d05e2e033973f9f08aa0da54764e1c1aa0c4b063999cc25"} Oct 08 21:03:10 
crc kubenswrapper[4669]: I1008 21:03:10.460457 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8af63aa0-e5df-488c-b5e4-4677c9d0f2de","Type":"ContainerStarted","Data":"15ae3b4380c17b0bd39e83e150fcc0e6631e8d051b747e92c73c29678e2f4d9f"} Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.460605 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.485187 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.105099646 podStartE2EDuration="2.485165362s" podCreationTimestamp="2025-10-08 21:03:08 +0000 UTC" firstStartedPulling="2025-10-08 21:03:09.293438385 +0000 UTC m=+1108.986249068" lastFinishedPulling="2025-10-08 21:03:09.673504111 +0000 UTC m=+1109.366314784" observedRunningTime="2025-10-08 21:03:10.479008754 +0000 UTC m=+1110.171819427" watchObservedRunningTime="2025-10-08 21:03:10.485165362 +0000 UTC m=+1110.177976035" Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.488240 4669 scope.go:117] "RemoveContainer" containerID="2dbb75cd344cb4aa3fe867031ae6babdf228f3410009c526ca0aeb408565eb0d" Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.515734 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a74fd913-c6c2-48c2-b283-e0fd296a24e8-combined-ca-bundle\") pod \"a74fd913-c6c2-48c2-b283-e0fd296a24e8\" (UID: \"a74fd913-c6c2-48c2-b283-e0fd296a24e8\") " Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.515828 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a74fd913-c6c2-48c2-b283-e0fd296a24e8-logs\") pod \"a74fd913-c6c2-48c2-b283-e0fd296a24e8\" (UID: \"a74fd913-c6c2-48c2-b283-e0fd296a24e8\") " Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 
21:03:10.515943 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a74fd913-c6c2-48c2-b283-e0fd296a24e8-config-data\") pod \"a74fd913-c6c2-48c2-b283-e0fd296a24e8\" (UID: \"a74fd913-c6c2-48c2-b283-e0fd296a24e8\") " Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.516063 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sntwv\" (UniqueName: \"kubernetes.io/projected/a74fd913-c6c2-48c2-b283-e0fd296a24e8-kube-api-access-sntwv\") pod \"a74fd913-c6c2-48c2-b283-e0fd296a24e8\" (UID: \"a74fd913-c6c2-48c2-b283-e0fd296a24e8\") " Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.518075 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a74fd913-c6c2-48c2-b283-e0fd296a24e8-logs" (OuterVolumeSpecName: "logs") pod "a74fd913-c6c2-48c2-b283-e0fd296a24e8" (UID: "a74fd913-c6c2-48c2-b283-e0fd296a24e8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.524893 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a74fd913-c6c2-48c2-b283-e0fd296a24e8-kube-api-access-sntwv" (OuterVolumeSpecName: "kube-api-access-sntwv") pod "a74fd913-c6c2-48c2-b283-e0fd296a24e8" (UID: "a74fd913-c6c2-48c2-b283-e0fd296a24e8"). InnerVolumeSpecName "kube-api-access-sntwv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.525683 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.531133 4669 scope.go:117] "RemoveContainer" containerID="23dd940b20e6b6e16b5eaf9273b656bc4dd6d52c8229780a2e9fd3118726bbca" Oct 08 21:03:10 crc kubenswrapper[4669]: E1008 21:03:10.531671 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23dd940b20e6b6e16b5eaf9273b656bc4dd6d52c8229780a2e9fd3118726bbca\": container with ID starting with 23dd940b20e6b6e16b5eaf9273b656bc4dd6d52c8229780a2e9fd3118726bbca not found: ID does not exist" containerID="23dd940b20e6b6e16b5eaf9273b656bc4dd6d52c8229780a2e9fd3118726bbca" Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.531740 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23dd940b20e6b6e16b5eaf9273b656bc4dd6d52c8229780a2e9fd3118726bbca"} err="failed to get container status \"23dd940b20e6b6e16b5eaf9273b656bc4dd6d52c8229780a2e9fd3118726bbca\": rpc error: code = NotFound desc = could not find container \"23dd940b20e6b6e16b5eaf9273b656bc4dd6d52c8229780a2e9fd3118726bbca\": container with ID starting with 23dd940b20e6b6e16b5eaf9273b656bc4dd6d52c8229780a2e9fd3118726bbca not found: ID does not exist" Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.531785 4669 scope.go:117] "RemoveContainer" containerID="2dbb75cd344cb4aa3fe867031ae6babdf228f3410009c526ca0aeb408565eb0d" Oct 08 21:03:10 crc kubenswrapper[4669]: E1008 21:03:10.532319 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2dbb75cd344cb4aa3fe867031ae6babdf228f3410009c526ca0aeb408565eb0d\": container with ID starting with 2dbb75cd344cb4aa3fe867031ae6babdf228f3410009c526ca0aeb408565eb0d not found: ID does 
not exist" containerID="2dbb75cd344cb4aa3fe867031ae6babdf228f3410009c526ca0aeb408565eb0d" Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.532372 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dbb75cd344cb4aa3fe867031ae6babdf228f3410009c526ca0aeb408565eb0d"} err="failed to get container status \"2dbb75cd344cb4aa3fe867031ae6babdf228f3410009c526ca0aeb408565eb0d\": rpc error: code = NotFound desc = could not find container \"2dbb75cd344cb4aa3fe867031ae6babdf228f3410009c526ca0aeb408565eb0d\": container with ID starting with 2dbb75cd344cb4aa3fe867031ae6babdf228f3410009c526ca0aeb408565eb0d not found: ID does not exist" Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.532409 4669 scope.go:117] "RemoveContainer" containerID="ae8922e342b678e419de4842d7a38114a2b1572ace3b4887df4746409fae5744" Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.534324 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.543658 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 21:03:10 crc kubenswrapper[4669]: E1008 21:03:10.544211 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a74fd913-c6c2-48c2-b283-e0fd296a24e8" containerName="nova-api-api" Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.544284 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="a74fd913-c6c2-48c2-b283-e0fd296a24e8" containerName="nova-api-api" Oct 08 21:03:10 crc kubenswrapper[4669]: E1008 21:03:10.544360 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bb5d744-260a-49cc-89bd-bc34cceb4f2e" containerName="nova-scheduler-scheduler" Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.544411 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bb5d744-260a-49cc-89bd-bc34cceb4f2e" containerName="nova-scheduler-scheduler" Oct 08 21:03:10 crc 
kubenswrapper[4669]: E1008 21:03:10.544466 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a74fd913-c6c2-48c2-b283-e0fd296a24e8" containerName="nova-api-log" Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.544513 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="a74fd913-c6c2-48c2-b283-e0fd296a24e8" containerName="nova-api-log" Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.544846 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="a74fd913-c6c2-48c2-b283-e0fd296a24e8" containerName="nova-api-api" Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.544919 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bb5d744-260a-49cc-89bd-bc34cceb4f2e" containerName="nova-scheduler-scheduler" Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.544983 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="a74fd913-c6c2-48c2-b283-e0fd296a24e8" containerName="nova-api-log" Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.545675 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.548108 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.553234 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.559516 4669 scope.go:117] "RemoveContainer" containerID="ae8922e342b678e419de4842d7a38114a2b1572ace3b4887df4746409fae5744" Oct 08 21:03:10 crc kubenswrapper[4669]: E1008 21:03:10.560046 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae8922e342b678e419de4842d7a38114a2b1572ace3b4887df4746409fae5744\": container with ID starting with ae8922e342b678e419de4842d7a38114a2b1572ace3b4887df4746409fae5744 not found: ID does not exist" containerID="ae8922e342b678e419de4842d7a38114a2b1572ace3b4887df4746409fae5744" Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.560271 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae8922e342b678e419de4842d7a38114a2b1572ace3b4887df4746409fae5744"} err="failed to get container status \"ae8922e342b678e419de4842d7a38114a2b1572ace3b4887df4746409fae5744\": rpc error: code = NotFound desc = could not find container \"ae8922e342b678e419de4842d7a38114a2b1572ace3b4887df4746409fae5744\": container with ID starting with ae8922e342b678e419de4842d7a38114a2b1572ace3b4887df4746409fae5744 not found: ID does not exist" Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.565227 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a74fd913-c6c2-48c2-b283-e0fd296a24e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a74fd913-c6c2-48c2-b283-e0fd296a24e8" (UID: "a74fd913-c6c2-48c2-b283-e0fd296a24e8"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.567381 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a74fd913-c6c2-48c2-b283-e0fd296a24e8-config-data" (OuterVolumeSpecName: "config-data") pod "a74fd913-c6c2-48c2-b283-e0fd296a24e8" (UID: "a74fd913-c6c2-48c2-b283-e0fd296a24e8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.617848 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8283326-dc1d-4a16-b02e-c8d471367f25-config-data\") pod \"nova-scheduler-0\" (UID: \"b8283326-dc1d-4a16-b02e-c8d471367f25\") " pod="openstack/nova-scheduler-0" Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.617885 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8283326-dc1d-4a16-b02e-c8d471367f25-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b8283326-dc1d-4a16-b02e-c8d471367f25\") " pod="openstack/nova-scheduler-0" Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.617926 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tt9q8\" (UniqueName: \"kubernetes.io/projected/b8283326-dc1d-4a16-b02e-c8d471367f25-kube-api-access-tt9q8\") pod \"nova-scheduler-0\" (UID: \"b8283326-dc1d-4a16-b02e-c8d471367f25\") " pod="openstack/nova-scheduler-0" Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.618243 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a74fd913-c6c2-48c2-b283-e0fd296a24e8-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 
21:03:10.618273 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sntwv\" (UniqueName: \"kubernetes.io/projected/a74fd913-c6c2-48c2-b283-e0fd296a24e8-kube-api-access-sntwv\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.618283 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a74fd913-c6c2-48c2-b283-e0fd296a24e8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.618292 4669 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a74fd913-c6c2-48c2-b283-e0fd296a24e8-logs\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:10 crc kubenswrapper[4669]: E1008 21:03:10.647028 4669 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4bb5d744_260a_49cc_89bd_bc34cceb4f2e.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4bb5d744_260a_49cc_89bd_bc34cceb4f2e.slice/crio-2380a04673b8dcdf5ace8c226785c46388868bf6e177a1d68c2d5cece3b12350\": RecentStats: unable to find data in memory cache]" Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.720327 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8283326-dc1d-4a16-b02e-c8d471367f25-config-data\") pod \"nova-scheduler-0\" (UID: \"b8283326-dc1d-4a16-b02e-c8d471367f25\") " pod="openstack/nova-scheduler-0" Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.720384 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8283326-dc1d-4a16-b02e-c8d471367f25-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"b8283326-dc1d-4a16-b02e-c8d471367f25\") " pod="openstack/nova-scheduler-0" Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.720438 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tt9q8\" (UniqueName: \"kubernetes.io/projected/b8283326-dc1d-4a16-b02e-c8d471367f25-kube-api-access-tt9q8\") pod \"nova-scheduler-0\" (UID: \"b8283326-dc1d-4a16-b02e-c8d471367f25\") " pod="openstack/nova-scheduler-0" Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.724518 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8283326-dc1d-4a16-b02e-c8d471367f25-config-data\") pod \"nova-scheduler-0\" (UID: \"b8283326-dc1d-4a16-b02e-c8d471367f25\") " pod="openstack/nova-scheduler-0" Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.725338 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8283326-dc1d-4a16-b02e-c8d471367f25-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b8283326-dc1d-4a16-b02e-c8d471367f25\") " pod="openstack/nova-scheduler-0" Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.782257 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tt9q8\" (UniqueName: \"kubernetes.io/projected/b8283326-dc1d-4a16-b02e-c8d471367f25-kube-api-access-tt9q8\") pod \"nova-scheduler-0\" (UID: \"b8283326-dc1d-4a16-b02e-c8d471367f25\") " pod="openstack/nova-scheduler-0" Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.872045 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.899409 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.935959 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.944362 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.946138 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.949167 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 08 21:03:10 crc kubenswrapper[4669]: I1008 21:03:10.960426 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.025523 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62-logs\") pod \"nova-api-0\" (UID: \"8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62\") " pod="openstack/nova-api-0" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.025828 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62\") " pod="openstack/nova-api-0" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.025860 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftn9n\" (UniqueName: \"kubernetes.io/projected/8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62-kube-api-access-ftn9n\") pod 
\"nova-api-0\" (UID: \"8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62\") " pod="openstack/nova-api-0" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.025879 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62-config-data\") pod \"nova-api-0\" (UID: \"8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62\") " pod="openstack/nova-api-0" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.083013 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.083079 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.127861 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62-logs\") pod \"nova-api-0\" (UID: \"8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62\") " pod="openstack/nova-api-0" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.127907 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62\") " pod="openstack/nova-api-0" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.127968 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftn9n\" (UniqueName: \"kubernetes.io/projected/8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62-kube-api-access-ftn9n\") pod \"nova-api-0\" (UID: \"8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62\") " pod="openstack/nova-api-0" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.127998 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62-config-data\") pod \"nova-api-0\" (UID: \"8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62\") " pod="openstack/nova-api-0" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.128298 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62-logs\") pod \"nova-api-0\" (UID: \"8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62\") " pod="openstack/nova-api-0" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.135436 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62\") " pod="openstack/nova-api-0" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.139181 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62-config-data\") pod \"nova-api-0\" (UID: \"8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62\") " pod="openstack/nova-api-0" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.148018 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftn9n\" (UniqueName: \"kubernetes.io/projected/8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62-kube-api-access-ftn9n\") pod \"nova-api-0\" (UID: \"8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62\") " pod="openstack/nova-api-0" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.309411 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.372688 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb5d744-260a-49cc-89bd-bc34cceb4f2e" path="/var/lib/kubelet/pods/4bb5d744-260a-49cc-89bd-bc34cceb4f2e/volumes" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.373281 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a74fd913-c6c2-48c2-b283-e0fd296a24e8" path="/var/lib/kubelet/pods/a74fd913-c6c2-48c2-b283-e0fd296a24e8/volumes" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.374851 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.381230 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.436078 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d0475f8-aca1-4fd8-9453-f5272ff2e52c-config-data\") pod \"9d0475f8-aca1-4fd8-9453-f5272ff2e52c\" (UID: \"9d0475f8-aca1-4fd8-9453-f5272ff2e52c\") " Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.436387 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d0475f8-aca1-4fd8-9453-f5272ff2e52c-log-httpd\") pod \"9d0475f8-aca1-4fd8-9453-f5272ff2e52c\" (UID: \"9d0475f8-aca1-4fd8-9453-f5272ff2e52c\") " Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.436444 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d0475f8-aca1-4fd8-9453-f5272ff2e52c-combined-ca-bundle\") pod \"9d0475f8-aca1-4fd8-9453-f5272ff2e52c\" (UID: \"9d0475f8-aca1-4fd8-9453-f5272ff2e52c\") " Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.436496 4669 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d0475f8-aca1-4fd8-9453-f5272ff2e52c-scripts\") pod \"9d0475f8-aca1-4fd8-9453-f5272ff2e52c\" (UID: \"9d0475f8-aca1-4fd8-9453-f5272ff2e52c\") " Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.436518 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2t9k\" (UniqueName: \"kubernetes.io/projected/9d0475f8-aca1-4fd8-9453-f5272ff2e52c-kube-api-access-f2t9k\") pod \"9d0475f8-aca1-4fd8-9453-f5272ff2e52c\" (UID: \"9d0475f8-aca1-4fd8-9453-f5272ff2e52c\") " Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.436602 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9d0475f8-aca1-4fd8-9453-f5272ff2e52c-sg-core-conf-yaml\") pod \"9d0475f8-aca1-4fd8-9453-f5272ff2e52c\" (UID: \"9d0475f8-aca1-4fd8-9453-f5272ff2e52c\") " Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.436625 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d0475f8-aca1-4fd8-9453-f5272ff2e52c-run-httpd\") pod \"9d0475f8-aca1-4fd8-9453-f5272ff2e52c\" (UID: \"9d0475f8-aca1-4fd8-9453-f5272ff2e52c\") " Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.437084 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d0475f8-aca1-4fd8-9453-f5272ff2e52c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9d0475f8-aca1-4fd8-9453-f5272ff2e52c" (UID: "9d0475f8-aca1-4fd8-9453-f5272ff2e52c"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.438349 4669 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d0475f8-aca1-4fd8-9453-f5272ff2e52c-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.438834 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d0475f8-aca1-4fd8-9453-f5272ff2e52c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9d0475f8-aca1-4fd8-9453-f5272ff2e52c" (UID: "9d0475f8-aca1-4fd8-9453-f5272ff2e52c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.445456 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d0475f8-aca1-4fd8-9453-f5272ff2e52c-scripts" (OuterVolumeSpecName: "scripts") pod "9d0475f8-aca1-4fd8-9453-f5272ff2e52c" (UID: "9d0475f8-aca1-4fd8-9453-f5272ff2e52c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.445507 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d0475f8-aca1-4fd8-9453-f5272ff2e52c-kube-api-access-f2t9k" (OuterVolumeSpecName: "kube-api-access-f2t9k") pod "9d0475f8-aca1-4fd8-9453-f5272ff2e52c" (UID: "9d0475f8-aca1-4fd8-9453-f5272ff2e52c"). InnerVolumeSpecName "kube-api-access-f2t9k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.486502 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b8283326-dc1d-4a16-b02e-c8d471367f25","Type":"ContainerStarted","Data":"0032bc649463905bf12b14fd888a7ea6b632d173e503e6e9816d6ff90920fb18"} Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.493903 4669 generic.go:334] "Generic (PLEG): container finished" podID="9d0475f8-aca1-4fd8-9453-f5272ff2e52c" containerID="baee525e89992e6f05af170df9569e53a91d6bc34ea705c3811d26a11fdada21" exitCode=0 Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.493968 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.493982 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9d0475f8-aca1-4fd8-9453-f5272ff2e52c","Type":"ContainerDied","Data":"baee525e89992e6f05af170df9569e53a91d6bc34ea705c3811d26a11fdada21"} Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.494017 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9d0475f8-aca1-4fd8-9453-f5272ff2e52c","Type":"ContainerDied","Data":"13d6f2b03f4a6ac3ef64b66ca6de644a341eb62ad71a3da89d2df2ccf1588ad6"} Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.494060 4669 scope.go:117] "RemoveContainer" containerID="03768bffd87f223de2649b541be671cdd0b4f329f831c9ba1825487f1dcbcabd" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.500664 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d0475f8-aca1-4fd8-9453-f5272ff2e52c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9d0475f8-aca1-4fd8-9453-f5272ff2e52c" (UID: "9d0475f8-aca1-4fd8-9453-f5272ff2e52c"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.528791 4669 scope.go:117] "RemoveContainer" containerID="69ae486352cddefd10f53088ff25854a19a012ea5dcd638cc24799f7da454c0a" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.540265 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d0475f8-aca1-4fd8-9453-f5272ff2e52c-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.540283 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2t9k\" (UniqueName: \"kubernetes.io/projected/9d0475f8-aca1-4fd8-9453-f5272ff2e52c-kube-api-access-f2t9k\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.540293 4669 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9d0475f8-aca1-4fd8-9453-f5272ff2e52c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.540302 4669 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d0475f8-aca1-4fd8-9453-f5272ff2e52c-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.551827 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d0475f8-aca1-4fd8-9453-f5272ff2e52c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9d0475f8-aca1-4fd8-9453-f5272ff2e52c" (UID: "9d0475f8-aca1-4fd8-9453-f5272ff2e52c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.566659 4669 scope.go:117] "RemoveContainer" containerID="baee525e89992e6f05af170df9569e53a91d6bc34ea705c3811d26a11fdada21" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.583495 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d0475f8-aca1-4fd8-9453-f5272ff2e52c-config-data" (OuterVolumeSpecName: "config-data") pod "9d0475f8-aca1-4fd8-9453-f5272ff2e52c" (UID: "9d0475f8-aca1-4fd8-9453-f5272ff2e52c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.593330 4669 scope.go:117] "RemoveContainer" containerID="601b68b4ce37a6e44d05e2e033973f9f08aa0da54764e1c1aa0c4b063999cc25" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.614525 4669 scope.go:117] "RemoveContainer" containerID="03768bffd87f223de2649b541be671cdd0b4f329f831c9ba1825487f1dcbcabd" Oct 08 21:03:11 crc kubenswrapper[4669]: E1008 21:03:11.615268 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03768bffd87f223de2649b541be671cdd0b4f329f831c9ba1825487f1dcbcabd\": container with ID starting with 03768bffd87f223de2649b541be671cdd0b4f329f831c9ba1825487f1dcbcabd not found: ID does not exist" containerID="03768bffd87f223de2649b541be671cdd0b4f329f831c9ba1825487f1dcbcabd" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.615305 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03768bffd87f223de2649b541be671cdd0b4f329f831c9ba1825487f1dcbcabd"} err="failed to get container status \"03768bffd87f223de2649b541be671cdd0b4f329f831c9ba1825487f1dcbcabd\": rpc error: code = NotFound desc = could not find container \"03768bffd87f223de2649b541be671cdd0b4f329f831c9ba1825487f1dcbcabd\": container with ID starting with 
03768bffd87f223de2649b541be671cdd0b4f329f831c9ba1825487f1dcbcabd not found: ID does not exist" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.615324 4669 scope.go:117] "RemoveContainer" containerID="69ae486352cddefd10f53088ff25854a19a012ea5dcd638cc24799f7da454c0a" Oct 08 21:03:11 crc kubenswrapper[4669]: E1008 21:03:11.615658 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69ae486352cddefd10f53088ff25854a19a012ea5dcd638cc24799f7da454c0a\": container with ID starting with 69ae486352cddefd10f53088ff25854a19a012ea5dcd638cc24799f7da454c0a not found: ID does not exist" containerID="69ae486352cddefd10f53088ff25854a19a012ea5dcd638cc24799f7da454c0a" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.615681 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69ae486352cddefd10f53088ff25854a19a012ea5dcd638cc24799f7da454c0a"} err="failed to get container status \"69ae486352cddefd10f53088ff25854a19a012ea5dcd638cc24799f7da454c0a\": rpc error: code = NotFound desc = could not find container \"69ae486352cddefd10f53088ff25854a19a012ea5dcd638cc24799f7da454c0a\": container with ID starting with 69ae486352cddefd10f53088ff25854a19a012ea5dcd638cc24799f7da454c0a not found: ID does not exist" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.615693 4669 scope.go:117] "RemoveContainer" containerID="baee525e89992e6f05af170df9569e53a91d6bc34ea705c3811d26a11fdada21" Oct 08 21:03:11 crc kubenswrapper[4669]: E1008 21:03:11.616064 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"baee525e89992e6f05af170df9569e53a91d6bc34ea705c3811d26a11fdada21\": container with ID starting with baee525e89992e6f05af170df9569e53a91d6bc34ea705c3811d26a11fdada21 not found: ID does not exist" containerID="baee525e89992e6f05af170df9569e53a91d6bc34ea705c3811d26a11fdada21" Oct 08 21:03:11 crc 
kubenswrapper[4669]: I1008 21:03:11.616086 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"baee525e89992e6f05af170df9569e53a91d6bc34ea705c3811d26a11fdada21"} err="failed to get container status \"baee525e89992e6f05af170df9569e53a91d6bc34ea705c3811d26a11fdada21\": rpc error: code = NotFound desc = could not find container \"baee525e89992e6f05af170df9569e53a91d6bc34ea705c3811d26a11fdada21\": container with ID starting with baee525e89992e6f05af170df9569e53a91d6bc34ea705c3811d26a11fdada21 not found: ID does not exist" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.616099 4669 scope.go:117] "RemoveContainer" containerID="601b68b4ce37a6e44d05e2e033973f9f08aa0da54764e1c1aa0c4b063999cc25" Oct 08 21:03:11 crc kubenswrapper[4669]: E1008 21:03:11.616303 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"601b68b4ce37a6e44d05e2e033973f9f08aa0da54764e1c1aa0c4b063999cc25\": container with ID starting with 601b68b4ce37a6e44d05e2e033973f9f08aa0da54764e1c1aa0c4b063999cc25 not found: ID does not exist" containerID="601b68b4ce37a6e44d05e2e033973f9f08aa0da54764e1c1aa0c4b063999cc25" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.616321 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"601b68b4ce37a6e44d05e2e033973f9f08aa0da54764e1c1aa0c4b063999cc25"} err="failed to get container status \"601b68b4ce37a6e44d05e2e033973f9f08aa0da54764e1c1aa0c4b063999cc25\": rpc error: code = NotFound desc = could not find container \"601b68b4ce37a6e44d05e2e033973f9f08aa0da54764e1c1aa0c4b063999cc25\": container with ID starting with 601b68b4ce37a6e44d05e2e033973f9f08aa0da54764e1c1aa0c4b063999cc25 not found: ID does not exist" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.641969 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9d0475f8-aca1-4fd8-9453-f5272ff2e52c-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.642010 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d0475f8-aca1-4fd8-9453-f5272ff2e52c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.770345 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 08 21:03:11 crc kubenswrapper[4669]: W1008 21:03:11.771663 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8becc6b6_ff40_4c3e_8f2e_23b11bd9ac62.slice/crio-91cf0492747d51d5574270b9aaef86696f620735955abf8ae7ff61b930cf9582 WatchSource:0}: Error finding container 91cf0492747d51d5574270b9aaef86696f620735955abf8ae7ff61b930cf9582: Status 404 returned error can't find the container with id 91cf0492747d51d5574270b9aaef86696f620735955abf8ae7ff61b930cf9582 Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.947862 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.956170 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.974391 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 08 21:03:11 crc kubenswrapper[4669]: E1008 21:03:11.975042 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d0475f8-aca1-4fd8-9453-f5272ff2e52c" containerName="ceilometer-notification-agent" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.975062 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d0475f8-aca1-4fd8-9453-f5272ff2e52c" containerName="ceilometer-notification-agent" Oct 08 21:03:11 crc kubenswrapper[4669]: E1008 21:03:11.975075 
4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d0475f8-aca1-4fd8-9453-f5272ff2e52c" containerName="sg-core" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.975081 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d0475f8-aca1-4fd8-9453-f5272ff2e52c" containerName="sg-core" Oct 08 21:03:11 crc kubenswrapper[4669]: E1008 21:03:11.975114 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d0475f8-aca1-4fd8-9453-f5272ff2e52c" containerName="proxy-httpd" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.975122 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d0475f8-aca1-4fd8-9453-f5272ff2e52c" containerName="proxy-httpd" Oct 08 21:03:11 crc kubenswrapper[4669]: E1008 21:03:11.975138 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d0475f8-aca1-4fd8-9453-f5272ff2e52c" containerName="ceilometer-central-agent" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.975145 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d0475f8-aca1-4fd8-9453-f5272ff2e52c" containerName="ceilometer-central-agent" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.975383 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d0475f8-aca1-4fd8-9453-f5272ff2e52c" containerName="ceilometer-notification-agent" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.975419 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d0475f8-aca1-4fd8-9453-f5272ff2e52c" containerName="proxy-httpd" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.975442 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d0475f8-aca1-4fd8-9453-f5272ff2e52c" containerName="ceilometer-central-agent" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.975458 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d0475f8-aca1-4fd8-9453-f5272ff2e52c" containerName="sg-core" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 
21:03:11.977617 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.981457 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.982006 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.982202 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 08 21:03:11 crc kubenswrapper[4669]: I1008 21:03:11.984416 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 21:03:12 crc kubenswrapper[4669]: I1008 21:03:12.048510 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b83b92c-e256-4a66-9815-27d5a9bb778b-config-data\") pod \"ceilometer-0\" (UID: \"8b83b92c-e256-4a66-9815-27d5a9bb778b\") " pod="openstack/ceilometer-0" Oct 08 21:03:12 crc kubenswrapper[4669]: I1008 21:03:12.048560 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b83b92c-e256-4a66-9815-27d5a9bb778b-run-httpd\") pod \"ceilometer-0\" (UID: \"8b83b92c-e256-4a66-9815-27d5a9bb778b\") " pod="openstack/ceilometer-0" Oct 08 21:03:12 crc kubenswrapper[4669]: I1008 21:03:12.048588 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8b83b92c-e256-4a66-9815-27d5a9bb778b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8b83b92c-e256-4a66-9815-27d5a9bb778b\") " pod="openstack/ceilometer-0" Oct 08 21:03:12 crc kubenswrapper[4669]: I1008 21:03:12.048614 4669 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tt8cc\" (UniqueName: \"kubernetes.io/projected/8b83b92c-e256-4a66-9815-27d5a9bb778b-kube-api-access-tt8cc\") pod \"ceilometer-0\" (UID: \"8b83b92c-e256-4a66-9815-27d5a9bb778b\") " pod="openstack/ceilometer-0" Oct 08 21:03:12 crc kubenswrapper[4669]: I1008 21:03:12.048838 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b83b92c-e256-4a66-9815-27d5a9bb778b-scripts\") pod \"ceilometer-0\" (UID: \"8b83b92c-e256-4a66-9815-27d5a9bb778b\") " pod="openstack/ceilometer-0" Oct 08 21:03:12 crc kubenswrapper[4669]: I1008 21:03:12.048969 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b83b92c-e256-4a66-9815-27d5a9bb778b-log-httpd\") pod \"ceilometer-0\" (UID: \"8b83b92c-e256-4a66-9815-27d5a9bb778b\") " pod="openstack/ceilometer-0" Oct 08 21:03:12 crc kubenswrapper[4669]: I1008 21:03:12.049025 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b83b92c-e256-4a66-9815-27d5a9bb778b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8b83b92c-e256-4a66-9815-27d5a9bb778b\") " pod="openstack/ceilometer-0" Oct 08 21:03:12 crc kubenswrapper[4669]: I1008 21:03:12.049105 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b83b92c-e256-4a66-9815-27d5a9bb778b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8b83b92c-e256-4a66-9815-27d5a9bb778b\") " pod="openstack/ceilometer-0" Oct 08 21:03:12 crc kubenswrapper[4669]: I1008 21:03:12.151131 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8b83b92c-e256-4a66-9815-27d5a9bb778b-config-data\") pod \"ceilometer-0\" (UID: \"8b83b92c-e256-4a66-9815-27d5a9bb778b\") " pod="openstack/ceilometer-0" Oct 08 21:03:12 crc kubenswrapper[4669]: I1008 21:03:12.151207 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b83b92c-e256-4a66-9815-27d5a9bb778b-run-httpd\") pod \"ceilometer-0\" (UID: \"8b83b92c-e256-4a66-9815-27d5a9bb778b\") " pod="openstack/ceilometer-0" Oct 08 21:03:12 crc kubenswrapper[4669]: I1008 21:03:12.151267 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8b83b92c-e256-4a66-9815-27d5a9bb778b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8b83b92c-e256-4a66-9815-27d5a9bb778b\") " pod="openstack/ceilometer-0" Oct 08 21:03:12 crc kubenswrapper[4669]: I1008 21:03:12.151303 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tt8cc\" (UniqueName: \"kubernetes.io/projected/8b83b92c-e256-4a66-9815-27d5a9bb778b-kube-api-access-tt8cc\") pod \"ceilometer-0\" (UID: \"8b83b92c-e256-4a66-9815-27d5a9bb778b\") " pod="openstack/ceilometer-0" Oct 08 21:03:12 crc kubenswrapper[4669]: I1008 21:03:12.151787 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b83b92c-e256-4a66-9815-27d5a9bb778b-run-httpd\") pod \"ceilometer-0\" (UID: \"8b83b92c-e256-4a66-9815-27d5a9bb778b\") " pod="openstack/ceilometer-0" Oct 08 21:03:12 crc kubenswrapper[4669]: I1008 21:03:12.153093 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b83b92c-e256-4a66-9815-27d5a9bb778b-scripts\") pod \"ceilometer-0\" (UID: \"8b83b92c-e256-4a66-9815-27d5a9bb778b\") " pod="openstack/ceilometer-0" Oct 08 21:03:12 crc kubenswrapper[4669]: I1008 21:03:12.153168 4669 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b83b92c-e256-4a66-9815-27d5a9bb778b-log-httpd\") pod \"ceilometer-0\" (UID: \"8b83b92c-e256-4a66-9815-27d5a9bb778b\") " pod="openstack/ceilometer-0" Oct 08 21:03:12 crc kubenswrapper[4669]: I1008 21:03:12.153203 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b83b92c-e256-4a66-9815-27d5a9bb778b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8b83b92c-e256-4a66-9815-27d5a9bb778b\") " pod="openstack/ceilometer-0" Oct 08 21:03:12 crc kubenswrapper[4669]: I1008 21:03:12.153251 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b83b92c-e256-4a66-9815-27d5a9bb778b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8b83b92c-e256-4a66-9815-27d5a9bb778b\") " pod="openstack/ceilometer-0" Oct 08 21:03:12 crc kubenswrapper[4669]: I1008 21:03:12.154247 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b83b92c-e256-4a66-9815-27d5a9bb778b-log-httpd\") pod \"ceilometer-0\" (UID: \"8b83b92c-e256-4a66-9815-27d5a9bb778b\") " pod="openstack/ceilometer-0" Oct 08 21:03:12 crc kubenswrapper[4669]: I1008 21:03:12.156783 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8b83b92c-e256-4a66-9815-27d5a9bb778b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8b83b92c-e256-4a66-9815-27d5a9bb778b\") " pod="openstack/ceilometer-0" Oct 08 21:03:12 crc kubenswrapper[4669]: I1008 21:03:12.157253 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b83b92c-e256-4a66-9815-27d5a9bb778b-config-data\") pod \"ceilometer-0\" (UID: 
\"8b83b92c-e256-4a66-9815-27d5a9bb778b\") " pod="openstack/ceilometer-0" Oct 08 21:03:12 crc kubenswrapper[4669]: I1008 21:03:12.158199 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b83b92c-e256-4a66-9815-27d5a9bb778b-scripts\") pod \"ceilometer-0\" (UID: \"8b83b92c-e256-4a66-9815-27d5a9bb778b\") " pod="openstack/ceilometer-0" Oct 08 21:03:12 crc kubenswrapper[4669]: I1008 21:03:12.158753 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b83b92c-e256-4a66-9815-27d5a9bb778b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8b83b92c-e256-4a66-9815-27d5a9bb778b\") " pod="openstack/ceilometer-0" Oct 08 21:03:12 crc kubenswrapper[4669]: I1008 21:03:12.160747 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b83b92c-e256-4a66-9815-27d5a9bb778b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8b83b92c-e256-4a66-9815-27d5a9bb778b\") " pod="openstack/ceilometer-0" Oct 08 21:03:12 crc kubenswrapper[4669]: I1008 21:03:12.170277 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tt8cc\" (UniqueName: \"kubernetes.io/projected/8b83b92c-e256-4a66-9815-27d5a9bb778b-kube-api-access-tt8cc\") pod \"ceilometer-0\" (UID: \"8b83b92c-e256-4a66-9815-27d5a9bb778b\") " pod="openstack/ceilometer-0" Oct 08 21:03:12 crc kubenswrapper[4669]: I1008 21:03:12.308204 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 21:03:12 crc kubenswrapper[4669]: I1008 21:03:12.520572 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b8283326-dc1d-4a16-b02e-c8d471367f25","Type":"ContainerStarted","Data":"c047c5d65e2e46f4521a2df058d6273cfa8a8570080c58dc0093afed8e9f50a7"} Oct 08 21:03:12 crc kubenswrapper[4669]: I1008 21:03:12.548099 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62","Type":"ContainerStarted","Data":"231f446273b138df5ea86737982fec718cab6214c48ac421537eb1eeaa31e2fe"} Oct 08 21:03:12 crc kubenswrapper[4669]: I1008 21:03:12.548144 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62","Type":"ContainerStarted","Data":"91c1a30f06d58911609faedbb706e8c17c9f53c1a3b744ccad2f348c217ac5e7"} Oct 08 21:03:12 crc kubenswrapper[4669]: I1008 21:03:12.548158 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62","Type":"ContainerStarted","Data":"91cf0492747d51d5574270b9aaef86696f620735955abf8ae7ff61b930cf9582"} Oct 08 21:03:12 crc kubenswrapper[4669]: I1008 21:03:12.550239 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.550220216 podStartE2EDuration="2.550220216s" podCreationTimestamp="2025-10-08 21:03:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:03:12.54375259 +0000 UTC m=+1112.236563263" watchObservedRunningTime="2025-10-08 21:03:12.550220216 +0000 UTC m=+1112.243030889" Oct 08 21:03:12 crc kubenswrapper[4669]: I1008 21:03:12.585622 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.585604134 
podStartE2EDuration="2.585604134s" podCreationTimestamp="2025-10-08 21:03:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:03:12.570932833 +0000 UTC m=+1112.263743506" watchObservedRunningTime="2025-10-08 21:03:12.585604134 +0000 UTC m=+1112.278414807" Oct 08 21:03:12 crc kubenswrapper[4669]: I1008 21:03:12.780490 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 21:03:12 crc kubenswrapper[4669]: W1008 21:03:12.786897 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b83b92c_e256_4a66_9815_27d5a9bb778b.slice/crio-f6f065743f4b840ca13238b7d639f681f0656df4f6984618bc6141ff6f337635 WatchSource:0}: Error finding container f6f065743f4b840ca13238b7d639f681f0656df4f6984618bc6141ff6f337635: Status 404 returned error can't find the container with id f6f065743f4b840ca13238b7d639f681f0656df4f6984618bc6141ff6f337635 Oct 08 21:03:13 crc kubenswrapper[4669]: I1008 21:03:13.185778 4669 patch_prober.go:28] interesting pod/machine-config-daemon-hw2kf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 21:03:13 crc kubenswrapper[4669]: I1008 21:03:13.185838 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 21:03:13 crc kubenswrapper[4669]: I1008 21:03:13.185882 4669 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" Oct 08 21:03:13 crc kubenswrapper[4669]: I1008 21:03:13.186738 4669 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fc8abec09504bb79a99269d94867e82d2072a920f3270921c1c4a731ac29aaaf"} pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 21:03:13 crc kubenswrapper[4669]: I1008 21:03:13.186797 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" containerID="cri-o://fc8abec09504bb79a99269d94867e82d2072a920f3270921c1c4a731ac29aaaf" gracePeriod=600 Oct 08 21:03:13 crc kubenswrapper[4669]: I1008 21:03:13.344653 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d0475f8-aca1-4fd8-9453-f5272ff2e52c" path="/var/lib/kubelet/pods/9d0475f8-aca1-4fd8-9453-f5272ff2e52c/volumes" Oct 08 21:03:13 crc kubenswrapper[4669]: I1008 21:03:13.565839 4669 generic.go:334] "Generic (PLEG): container finished" podID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerID="fc8abec09504bb79a99269d94867e82d2072a920f3270921c1c4a731ac29aaaf" exitCode=0 Oct 08 21:03:13 crc kubenswrapper[4669]: I1008 21:03:13.565915 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" event={"ID":"39c9bcf2-9580-4534-8c7e-886bd4aff469","Type":"ContainerDied","Data":"fc8abec09504bb79a99269d94867e82d2072a920f3270921c1c4a731ac29aaaf"} Oct 08 21:03:13 crc kubenswrapper[4669]: I1008 21:03:13.566234 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" 
event={"ID":"39c9bcf2-9580-4534-8c7e-886bd4aff469","Type":"ContainerStarted","Data":"7d9c1843fe5022c993d21347e4a9434b43afdaabe05d722d7ab0e85541c9821e"} Oct 08 21:03:13 crc kubenswrapper[4669]: I1008 21:03:13.566253 4669 scope.go:117] "RemoveContainer" containerID="e212469b959f799f6dd101756cbc798d4bd5c61d90207df29dcf3db6ccbd05d1" Oct 08 21:03:13 crc kubenswrapper[4669]: I1008 21:03:13.569408 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8b83b92c-e256-4a66-9815-27d5a9bb778b","Type":"ContainerStarted","Data":"f6f065743f4b840ca13238b7d639f681f0656df4f6984618bc6141ff6f337635"} Oct 08 21:03:14 crc kubenswrapper[4669]: I1008 21:03:14.582147 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8b83b92c-e256-4a66-9815-27d5a9bb778b","Type":"ContainerStarted","Data":"bb7eca025017ec3ad12278b3ca043127df09eb9836b9bbbe6af5f81ca1355a75"} Oct 08 21:03:14 crc kubenswrapper[4669]: I1008 21:03:14.582696 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8b83b92c-e256-4a66-9815-27d5a9bb778b","Type":"ContainerStarted","Data":"50563944703db5a36974ec131b1bd036b06618cdd3b9d70ccc1709b83d8f91d9"} Oct 08 21:03:14 crc kubenswrapper[4669]: I1008 21:03:14.788600 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 08 21:03:15 crc kubenswrapper[4669]: I1008 21:03:15.593089 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8b83b92c-e256-4a66-9815-27d5a9bb778b","Type":"ContainerStarted","Data":"621545cb423a2fc362fa9881e131d83b7f2237b5f91253d64cd03bc95125254f"} Oct 08 21:03:15 crc kubenswrapper[4669]: I1008 21:03:15.872863 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 08 21:03:16 crc kubenswrapper[4669]: I1008 21:03:16.084087 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/nova-metadata-0" Oct 08 21:03:16 crc kubenswrapper[4669]: I1008 21:03:16.084360 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 08 21:03:17 crc kubenswrapper[4669]: I1008 21:03:17.098717 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b4838748-5f6b-4ca4-a828-150c8213ce8e" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.196:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 08 21:03:17 crc kubenswrapper[4669]: I1008 21:03:17.098717 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b4838748-5f6b-4ca4-a828-150c8213ce8e" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.196:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 08 21:03:17 crc kubenswrapper[4669]: I1008 21:03:17.614950 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8b83b92c-e256-4a66-9815-27d5a9bb778b","Type":"ContainerStarted","Data":"b5ef79f906cf5a7eb8db0043e94cc1c5f63d70e15861c1389a838ed7be0d33ab"} Oct 08 21:03:17 crc kubenswrapper[4669]: I1008 21:03:17.615219 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 08 21:03:18 crc kubenswrapper[4669]: I1008 21:03:18.824939 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 08 21:03:18 crc kubenswrapper[4669]: I1008 21:03:18.871167 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.197504 podStartE2EDuration="7.871135112s" podCreationTimestamp="2025-10-08 21:03:11 +0000 UTC" firstStartedPulling="2025-10-08 21:03:12.789355762 +0000 UTC m=+1112.482166435" 
lastFinishedPulling="2025-10-08 21:03:16.462986864 +0000 UTC m=+1116.155797547" observedRunningTime="2025-10-08 21:03:17.64226067 +0000 UTC m=+1117.335071363" watchObservedRunningTime="2025-10-08 21:03:18.871135112 +0000 UTC m=+1118.563945835" Oct 08 21:03:20 crc kubenswrapper[4669]: I1008 21:03:20.872568 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 08 21:03:20 crc kubenswrapper[4669]: I1008 21:03:20.906515 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 08 21:03:21 crc kubenswrapper[4669]: I1008 21:03:21.310643 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 08 21:03:21 crc kubenswrapper[4669]: I1008 21:03:21.311011 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 08 21:03:21 crc kubenswrapper[4669]: I1008 21:03:21.708699 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 08 21:03:22 crc kubenswrapper[4669]: I1008 21:03:22.394718 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.199:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 08 21:03:22 crc kubenswrapper[4669]: I1008 21:03:22.394754 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.199:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 08 21:03:26 crc kubenswrapper[4669]: I1008 21:03:26.089428 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 08 21:03:26 crc 
kubenswrapper[4669]: I1008 21:03:26.095353 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 08 21:03:26 crc kubenswrapper[4669]: I1008 21:03:26.095438 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 08 21:03:26 crc kubenswrapper[4669]: I1008 21:03:26.728286 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 08 21:03:28 crc kubenswrapper[4669]: I1008 21:03:28.631349 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 08 21:03:28 crc kubenswrapper[4669]: I1008 21:03:28.749390 4669 generic.go:334] "Generic (PLEG): container finished" podID="35a11691-df3b-4aeb-ad42-a8d4347eeb3f" containerID="a41bb82cbaebdb51c1b4cf75ebbb094acb0c283db000b45fc3ed68446deaf71f" exitCode=137 Oct 08 21:03:28 crc kubenswrapper[4669]: I1008 21:03:28.749460 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"35a11691-df3b-4aeb-ad42-a8d4347eeb3f","Type":"ContainerDied","Data":"a41bb82cbaebdb51c1b4cf75ebbb094acb0c283db000b45fc3ed68446deaf71f"} Oct 08 21:03:28 crc kubenswrapper[4669]: I1008 21:03:28.749495 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"35a11691-df3b-4aeb-ad42-a8d4347eeb3f","Type":"ContainerDied","Data":"1475137c06069be97af4207086e0e07eb65b0c869c256014427d9b48e1bf0806"} Oct 08 21:03:28 crc kubenswrapper[4669]: I1008 21:03:28.749514 4669 scope.go:117] "RemoveContainer" containerID="a41bb82cbaebdb51c1b4cf75ebbb094acb0c283db000b45fc3ed68446deaf71f" Oct 08 21:03:28 crc kubenswrapper[4669]: I1008 21:03:28.749465 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 08 21:03:28 crc kubenswrapper[4669]: I1008 21:03:28.763910 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35a11691-df3b-4aeb-ad42-a8d4347eeb3f-combined-ca-bundle\") pod \"35a11691-df3b-4aeb-ad42-a8d4347eeb3f\" (UID: \"35a11691-df3b-4aeb-ad42-a8d4347eeb3f\") " Oct 08 21:03:28 crc kubenswrapper[4669]: I1008 21:03:28.764295 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35a11691-df3b-4aeb-ad42-a8d4347eeb3f-config-data\") pod \"35a11691-df3b-4aeb-ad42-a8d4347eeb3f\" (UID: \"35a11691-df3b-4aeb-ad42-a8d4347eeb3f\") " Oct 08 21:03:28 crc kubenswrapper[4669]: I1008 21:03:28.764438 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqjgx\" (UniqueName: \"kubernetes.io/projected/35a11691-df3b-4aeb-ad42-a8d4347eeb3f-kube-api-access-dqjgx\") pod \"35a11691-df3b-4aeb-ad42-a8d4347eeb3f\" (UID: \"35a11691-df3b-4aeb-ad42-a8d4347eeb3f\") " Oct 08 21:03:28 crc kubenswrapper[4669]: I1008 21:03:28.769405 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35a11691-df3b-4aeb-ad42-a8d4347eeb3f-kube-api-access-dqjgx" (OuterVolumeSpecName: "kube-api-access-dqjgx") pod "35a11691-df3b-4aeb-ad42-a8d4347eeb3f" (UID: "35a11691-df3b-4aeb-ad42-a8d4347eeb3f"). InnerVolumeSpecName "kube-api-access-dqjgx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:03:28 crc kubenswrapper[4669]: I1008 21:03:28.781136 4669 scope.go:117] "RemoveContainer" containerID="a41bb82cbaebdb51c1b4cf75ebbb094acb0c283db000b45fc3ed68446deaf71f" Oct 08 21:03:28 crc kubenswrapper[4669]: E1008 21:03:28.781759 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a41bb82cbaebdb51c1b4cf75ebbb094acb0c283db000b45fc3ed68446deaf71f\": container with ID starting with a41bb82cbaebdb51c1b4cf75ebbb094acb0c283db000b45fc3ed68446deaf71f not found: ID does not exist" containerID="a41bb82cbaebdb51c1b4cf75ebbb094acb0c283db000b45fc3ed68446deaf71f" Oct 08 21:03:28 crc kubenswrapper[4669]: I1008 21:03:28.781836 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a41bb82cbaebdb51c1b4cf75ebbb094acb0c283db000b45fc3ed68446deaf71f"} err="failed to get container status \"a41bb82cbaebdb51c1b4cf75ebbb094acb0c283db000b45fc3ed68446deaf71f\": rpc error: code = NotFound desc = could not find container \"a41bb82cbaebdb51c1b4cf75ebbb094acb0c283db000b45fc3ed68446deaf71f\": container with ID starting with a41bb82cbaebdb51c1b4cf75ebbb094acb0c283db000b45fc3ed68446deaf71f not found: ID does not exist" Oct 08 21:03:28 crc kubenswrapper[4669]: I1008 21:03:28.794821 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35a11691-df3b-4aeb-ad42-a8d4347eeb3f-config-data" (OuterVolumeSpecName: "config-data") pod "35a11691-df3b-4aeb-ad42-a8d4347eeb3f" (UID: "35a11691-df3b-4aeb-ad42-a8d4347eeb3f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:03:28 crc kubenswrapper[4669]: I1008 21:03:28.805797 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35a11691-df3b-4aeb-ad42-a8d4347eeb3f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "35a11691-df3b-4aeb-ad42-a8d4347eeb3f" (UID: "35a11691-df3b-4aeb-ad42-a8d4347eeb3f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:03:28 crc kubenswrapper[4669]: I1008 21:03:28.867623 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqjgx\" (UniqueName: \"kubernetes.io/projected/35a11691-df3b-4aeb-ad42-a8d4347eeb3f-kube-api-access-dqjgx\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:28 crc kubenswrapper[4669]: I1008 21:03:28.867656 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35a11691-df3b-4aeb-ad42-a8d4347eeb3f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:28 crc kubenswrapper[4669]: I1008 21:03:28.867670 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35a11691-df3b-4aeb-ad42-a8d4347eeb3f-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:29 crc kubenswrapper[4669]: I1008 21:03:29.092777 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 21:03:29 crc kubenswrapper[4669]: I1008 21:03:29.112745 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 21:03:29 crc kubenswrapper[4669]: I1008 21:03:29.121154 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 21:03:29 crc kubenswrapper[4669]: E1008 21:03:29.121732 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35a11691-df3b-4aeb-ad42-a8d4347eeb3f" 
containerName="nova-cell1-novncproxy-novncproxy" Oct 08 21:03:29 crc kubenswrapper[4669]: I1008 21:03:29.121761 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="35a11691-df3b-4aeb-ad42-a8d4347eeb3f" containerName="nova-cell1-novncproxy-novncproxy" Oct 08 21:03:29 crc kubenswrapper[4669]: I1008 21:03:29.122086 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="35a11691-df3b-4aeb-ad42-a8d4347eeb3f" containerName="nova-cell1-novncproxy-novncproxy" Oct 08 21:03:29 crc kubenswrapper[4669]: I1008 21:03:29.123150 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 08 21:03:29 crc kubenswrapper[4669]: I1008 21:03:29.127738 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 08 21:03:29 crc kubenswrapper[4669]: I1008 21:03:29.127911 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Oct 08 21:03:29 crc kubenswrapper[4669]: I1008 21:03:29.128034 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Oct 08 21:03:29 crc kubenswrapper[4669]: I1008 21:03:29.134793 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 21:03:29 crc kubenswrapper[4669]: I1008 21:03:29.172022 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/915f256e-a280-4726-8987-df1df9f8e4b5-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"915f256e-a280-4726-8987-df1df9f8e4b5\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 21:03:29 crc kubenswrapper[4669]: I1008 21:03:29.172122 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcbj8\" (UniqueName: 
\"kubernetes.io/projected/915f256e-a280-4726-8987-df1df9f8e4b5-kube-api-access-kcbj8\") pod \"nova-cell1-novncproxy-0\" (UID: \"915f256e-a280-4726-8987-df1df9f8e4b5\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 21:03:29 crc kubenswrapper[4669]: I1008 21:03:29.172187 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/915f256e-a280-4726-8987-df1df9f8e4b5-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"915f256e-a280-4726-8987-df1df9f8e4b5\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 21:03:29 crc kubenswrapper[4669]: I1008 21:03:29.172208 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/915f256e-a280-4726-8987-df1df9f8e4b5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"915f256e-a280-4726-8987-df1df9f8e4b5\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 21:03:29 crc kubenswrapper[4669]: I1008 21:03:29.172228 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/915f256e-a280-4726-8987-df1df9f8e4b5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"915f256e-a280-4726-8987-df1df9f8e4b5\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 21:03:29 crc kubenswrapper[4669]: I1008 21:03:29.273108 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/915f256e-a280-4726-8987-df1df9f8e4b5-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"915f256e-a280-4726-8987-df1df9f8e4b5\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 21:03:29 crc kubenswrapper[4669]: I1008 21:03:29.273470 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcbj8\" (UniqueName: 
\"kubernetes.io/projected/915f256e-a280-4726-8987-df1df9f8e4b5-kube-api-access-kcbj8\") pod \"nova-cell1-novncproxy-0\" (UID: \"915f256e-a280-4726-8987-df1df9f8e4b5\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 21:03:29 crc kubenswrapper[4669]: I1008 21:03:29.273624 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/915f256e-a280-4726-8987-df1df9f8e4b5-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"915f256e-a280-4726-8987-df1df9f8e4b5\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 21:03:29 crc kubenswrapper[4669]: I1008 21:03:29.273705 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/915f256e-a280-4726-8987-df1df9f8e4b5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"915f256e-a280-4726-8987-df1df9f8e4b5\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 21:03:29 crc kubenswrapper[4669]: I1008 21:03:29.273872 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/915f256e-a280-4726-8987-df1df9f8e4b5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"915f256e-a280-4726-8987-df1df9f8e4b5\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 21:03:29 crc kubenswrapper[4669]: I1008 21:03:29.277814 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/915f256e-a280-4726-8987-df1df9f8e4b5-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"915f256e-a280-4726-8987-df1df9f8e4b5\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 21:03:29 crc kubenswrapper[4669]: I1008 21:03:29.277823 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/915f256e-a280-4726-8987-df1df9f8e4b5-config-data\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"915f256e-a280-4726-8987-df1df9f8e4b5\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 21:03:29 crc kubenswrapper[4669]: I1008 21:03:29.279757 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/915f256e-a280-4726-8987-df1df9f8e4b5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"915f256e-a280-4726-8987-df1df9f8e4b5\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 21:03:29 crc kubenswrapper[4669]: I1008 21:03:29.280199 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/915f256e-a280-4726-8987-df1df9f8e4b5-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"915f256e-a280-4726-8987-df1df9f8e4b5\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 21:03:29 crc kubenswrapper[4669]: I1008 21:03:29.290232 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcbj8\" (UniqueName: \"kubernetes.io/projected/915f256e-a280-4726-8987-df1df9f8e4b5-kube-api-access-kcbj8\") pod \"nova-cell1-novncproxy-0\" (UID: \"915f256e-a280-4726-8987-df1df9f8e4b5\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 21:03:29 crc kubenswrapper[4669]: I1008 21:03:29.352157 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35a11691-df3b-4aeb-ad42-a8d4347eeb3f" path="/var/lib/kubelet/pods/35a11691-df3b-4aeb-ad42-a8d4347eeb3f/volumes" Oct 08 21:03:29 crc kubenswrapper[4669]: I1008 21:03:29.450122 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 08 21:03:29 crc kubenswrapper[4669]: I1008 21:03:29.927813 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 21:03:30 crc kubenswrapper[4669]: I1008 21:03:30.775756 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"915f256e-a280-4726-8987-df1df9f8e4b5","Type":"ContainerStarted","Data":"4f2707e43a3b05b32c39ad8bd5594c0cd6c10aa6fd6578e55954a9ff30409ab9"} Oct 08 21:03:30 crc kubenswrapper[4669]: I1008 21:03:30.776230 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"915f256e-a280-4726-8987-df1df9f8e4b5","Type":"ContainerStarted","Data":"6d783588e927e81969923fc43e06dbfa69560f3eb009a9708aae9079a8305bc8"} Oct 08 21:03:30 crc kubenswrapper[4669]: I1008 21:03:30.824393 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.824370397 podStartE2EDuration="1.824370397s" podCreationTimestamp="2025-10-08 21:03:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:03:30.812929875 +0000 UTC m=+1130.505740588" watchObservedRunningTime="2025-10-08 21:03:30.824370397 +0000 UTC m=+1130.517181080" Oct 08 21:03:31 crc kubenswrapper[4669]: I1008 21:03:31.313229 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 08 21:03:31 crc kubenswrapper[4669]: I1008 21:03:31.313599 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 08 21:03:31 crc kubenswrapper[4669]: I1008 21:03:31.314120 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 08 21:03:31 crc kubenswrapper[4669]: I1008 21:03:31.314384 4669 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/nova-api-0" Oct 08 21:03:31 crc kubenswrapper[4669]: I1008 21:03:31.316958 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 08 21:03:31 crc kubenswrapper[4669]: I1008 21:03:31.318812 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 08 21:03:31 crc kubenswrapper[4669]: I1008 21:03:31.561277 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-vtsv5"] Oct 08 21:03:31 crc kubenswrapper[4669]: I1008 21:03:31.563387 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-vtsv5" Oct 08 21:03:31 crc kubenswrapper[4669]: I1008 21:03:31.576325 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-vtsv5"] Oct 08 21:03:31 crc kubenswrapper[4669]: I1008 21:03:31.618222 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/052a11d2-f410-40af-a1e8-a61ba3203811-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-vtsv5\" (UID: \"052a11d2-f410-40af-a1e8-a61ba3203811\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-vtsv5" Oct 08 21:03:31 crc kubenswrapper[4669]: I1008 21:03:31.618664 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/052a11d2-f410-40af-a1e8-a61ba3203811-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-vtsv5\" (UID: \"052a11d2-f410-40af-a1e8-a61ba3203811\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-vtsv5" Oct 08 21:03:31 crc kubenswrapper[4669]: I1008 21:03:31.618766 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9gwk\" (UniqueName: \"kubernetes.io/projected/052a11d2-f410-40af-a1e8-a61ba3203811-kube-api-access-b9gwk\") pod 
\"dnsmasq-dns-5c7b6c5df9-vtsv5\" (UID: \"052a11d2-f410-40af-a1e8-a61ba3203811\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-vtsv5" Oct 08 21:03:31 crc kubenswrapper[4669]: I1008 21:03:31.618800 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/052a11d2-f410-40af-a1e8-a61ba3203811-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-vtsv5\" (UID: \"052a11d2-f410-40af-a1e8-a61ba3203811\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-vtsv5" Oct 08 21:03:31 crc kubenswrapper[4669]: I1008 21:03:31.618826 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/052a11d2-f410-40af-a1e8-a61ba3203811-config\") pod \"dnsmasq-dns-5c7b6c5df9-vtsv5\" (UID: \"052a11d2-f410-40af-a1e8-a61ba3203811\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-vtsv5" Oct 08 21:03:31 crc kubenswrapper[4669]: I1008 21:03:31.618849 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/052a11d2-f410-40af-a1e8-a61ba3203811-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-vtsv5\" (UID: \"052a11d2-f410-40af-a1e8-a61ba3203811\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-vtsv5" Oct 08 21:03:31 crc kubenswrapper[4669]: I1008 21:03:31.719876 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/052a11d2-f410-40af-a1e8-a61ba3203811-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-vtsv5\" (UID: \"052a11d2-f410-40af-a1e8-a61ba3203811\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-vtsv5" Oct 08 21:03:31 crc kubenswrapper[4669]: I1008 21:03:31.720339 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9gwk\" (UniqueName: 
\"kubernetes.io/projected/052a11d2-f410-40af-a1e8-a61ba3203811-kube-api-access-b9gwk\") pod \"dnsmasq-dns-5c7b6c5df9-vtsv5\" (UID: \"052a11d2-f410-40af-a1e8-a61ba3203811\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-vtsv5" Oct 08 21:03:31 crc kubenswrapper[4669]: I1008 21:03:31.720510 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/052a11d2-f410-40af-a1e8-a61ba3203811-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-vtsv5\" (UID: \"052a11d2-f410-40af-a1e8-a61ba3203811\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-vtsv5" Oct 08 21:03:31 crc kubenswrapper[4669]: I1008 21:03:31.720664 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/052a11d2-f410-40af-a1e8-a61ba3203811-config\") pod \"dnsmasq-dns-5c7b6c5df9-vtsv5\" (UID: \"052a11d2-f410-40af-a1e8-a61ba3203811\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-vtsv5" Oct 08 21:03:31 crc kubenswrapper[4669]: I1008 21:03:31.720794 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/052a11d2-f410-40af-a1e8-a61ba3203811-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-vtsv5\" (UID: \"052a11d2-f410-40af-a1e8-a61ba3203811\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-vtsv5" Oct 08 21:03:31 crc kubenswrapper[4669]: I1008 21:03:31.720942 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/052a11d2-f410-40af-a1e8-a61ba3203811-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-vtsv5\" (UID: \"052a11d2-f410-40af-a1e8-a61ba3203811\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-vtsv5" Oct 08 21:03:31 crc kubenswrapper[4669]: I1008 21:03:31.720997 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/052a11d2-f410-40af-a1e8-a61ba3203811-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-vtsv5\" (UID: \"052a11d2-f410-40af-a1e8-a61ba3203811\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-vtsv5" Oct 08 21:03:31 crc kubenswrapper[4669]: I1008 21:03:31.721288 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/052a11d2-f410-40af-a1e8-a61ba3203811-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-vtsv5\" (UID: \"052a11d2-f410-40af-a1e8-a61ba3203811\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-vtsv5" Oct 08 21:03:31 crc kubenswrapper[4669]: I1008 21:03:31.721579 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/052a11d2-f410-40af-a1e8-a61ba3203811-config\") pod \"dnsmasq-dns-5c7b6c5df9-vtsv5\" (UID: \"052a11d2-f410-40af-a1e8-a61ba3203811\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-vtsv5" Oct 08 21:03:31 crc kubenswrapper[4669]: I1008 21:03:31.721824 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/052a11d2-f410-40af-a1e8-a61ba3203811-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-vtsv5\" (UID: \"052a11d2-f410-40af-a1e8-a61ba3203811\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-vtsv5" Oct 08 21:03:31 crc kubenswrapper[4669]: I1008 21:03:31.722025 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/052a11d2-f410-40af-a1e8-a61ba3203811-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-vtsv5\" (UID: \"052a11d2-f410-40af-a1e8-a61ba3203811\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-vtsv5" Oct 08 21:03:31 crc kubenswrapper[4669]: I1008 21:03:31.751751 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9gwk\" (UniqueName: \"kubernetes.io/projected/052a11d2-f410-40af-a1e8-a61ba3203811-kube-api-access-b9gwk\") pod 
\"dnsmasq-dns-5c7b6c5df9-vtsv5\" (UID: \"052a11d2-f410-40af-a1e8-a61ba3203811\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-vtsv5" Oct 08 21:03:31 crc kubenswrapper[4669]: I1008 21:03:31.888445 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-vtsv5" Oct 08 21:03:32 crc kubenswrapper[4669]: I1008 21:03:32.372427 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-vtsv5"] Oct 08 21:03:32 crc kubenswrapper[4669]: I1008 21:03:32.795454 4669 generic.go:334] "Generic (PLEG): container finished" podID="052a11d2-f410-40af-a1e8-a61ba3203811" containerID="a685143e6189efb0633803e8231b2dce66a98f35913917e460885a85175ddda4" exitCode=0 Oct 08 21:03:32 crc kubenswrapper[4669]: I1008 21:03:32.795503 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-vtsv5" event={"ID":"052a11d2-f410-40af-a1e8-a61ba3203811","Type":"ContainerDied","Data":"a685143e6189efb0633803e8231b2dce66a98f35913917e460885a85175ddda4"} Oct 08 21:03:32 crc kubenswrapper[4669]: I1008 21:03:32.795795 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-vtsv5" event={"ID":"052a11d2-f410-40af-a1e8-a61ba3203811","Type":"ContainerStarted","Data":"4a4036d6c2065510b1a323407764d7279a1180bc1970d7fde7a4b19627229a24"} Oct 08 21:03:33 crc kubenswrapper[4669]: I1008 21:03:33.266468 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 21:03:33 crc kubenswrapper[4669]: I1008 21:03:33.267060 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8b83b92c-e256-4a66-9815-27d5a9bb778b" containerName="ceilometer-central-agent" containerID="cri-o://50563944703db5a36974ec131b1bd036b06618cdd3b9d70ccc1709b83d8f91d9" gracePeriod=30 Oct 08 21:03:33 crc kubenswrapper[4669]: I1008 21:03:33.267139 4669 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/ceilometer-0" podUID="8b83b92c-e256-4a66-9815-27d5a9bb778b" containerName="proxy-httpd" containerID="cri-o://b5ef79f906cf5a7eb8db0043e94cc1c5f63d70e15861c1389a838ed7be0d33ab" gracePeriod=30 Oct 08 21:03:33 crc kubenswrapper[4669]: I1008 21:03:33.267159 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8b83b92c-e256-4a66-9815-27d5a9bb778b" containerName="ceilometer-notification-agent" containerID="cri-o://bb7eca025017ec3ad12278b3ca043127df09eb9836b9bbbe6af5f81ca1355a75" gracePeriod=30 Oct 08 21:03:33 crc kubenswrapper[4669]: I1008 21:03:33.267183 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8b83b92c-e256-4a66-9815-27d5a9bb778b" containerName="sg-core" containerID="cri-o://621545cb423a2fc362fa9881e131d83b7f2237b5f91253d64cd03bc95125254f" gracePeriod=30 Oct 08 21:03:33 crc kubenswrapper[4669]: I1008 21:03:33.282478 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="8b83b92c-e256-4a66-9815-27d5a9bb778b" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Oct 08 21:03:33 crc kubenswrapper[4669]: I1008 21:03:33.805157 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-vtsv5" event={"ID":"052a11d2-f410-40af-a1e8-a61ba3203811","Type":"ContainerStarted","Data":"7f488c53f2ccdd32c5afc8349ffe5b921ff99e2bd3e64e510d07a92b55b65afe"} Oct 08 21:03:33 crc kubenswrapper[4669]: I1008 21:03:33.805552 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c7b6c5df9-vtsv5" Oct 08 21:03:33 crc kubenswrapper[4669]: I1008 21:03:33.808058 4669 generic.go:334] "Generic (PLEG): container finished" podID="8b83b92c-e256-4a66-9815-27d5a9bb778b" containerID="b5ef79f906cf5a7eb8db0043e94cc1c5f63d70e15861c1389a838ed7be0d33ab" exitCode=0 Oct 08 21:03:33 crc kubenswrapper[4669]: 
I1008 21:03:33.808093 4669 generic.go:334] "Generic (PLEG): container finished" podID="8b83b92c-e256-4a66-9815-27d5a9bb778b" containerID="621545cb423a2fc362fa9881e131d83b7f2237b5f91253d64cd03bc95125254f" exitCode=2 Oct 08 21:03:33 crc kubenswrapper[4669]: I1008 21:03:33.808108 4669 generic.go:334] "Generic (PLEG): container finished" podID="8b83b92c-e256-4a66-9815-27d5a9bb778b" containerID="50563944703db5a36974ec131b1bd036b06618cdd3b9d70ccc1709b83d8f91d9" exitCode=0 Oct 08 21:03:33 crc kubenswrapper[4669]: I1008 21:03:33.808140 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8b83b92c-e256-4a66-9815-27d5a9bb778b","Type":"ContainerDied","Data":"b5ef79f906cf5a7eb8db0043e94cc1c5f63d70e15861c1389a838ed7be0d33ab"} Oct 08 21:03:33 crc kubenswrapper[4669]: I1008 21:03:33.808179 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8b83b92c-e256-4a66-9815-27d5a9bb778b","Type":"ContainerDied","Data":"621545cb423a2fc362fa9881e131d83b7f2237b5f91253d64cd03bc95125254f"} Oct 08 21:03:33 crc kubenswrapper[4669]: I1008 21:03:33.808198 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8b83b92c-e256-4a66-9815-27d5a9bb778b","Type":"ContainerDied","Data":"50563944703db5a36974ec131b1bd036b06618cdd3b9d70ccc1709b83d8f91d9"} Oct 08 21:03:33 crc kubenswrapper[4669]: I1008 21:03:33.830401 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c7b6c5df9-vtsv5" podStartSLOduration=2.830382345 podStartE2EDuration="2.830382345s" podCreationTimestamp="2025-10-08 21:03:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:03:33.826931371 +0000 UTC m=+1133.519742044" watchObservedRunningTime="2025-10-08 21:03:33.830382345 +0000 UTC m=+1133.523193018" Oct 08 21:03:34 crc kubenswrapper[4669]: I1008 21:03:34.023176 4669 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 08 21:03:34 crc kubenswrapper[4669]: I1008 21:03:34.023613 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62" containerName="nova-api-log" containerID="cri-o://91c1a30f06d58911609faedbb706e8c17c9f53c1a3b744ccad2f348c217ac5e7" gracePeriod=30 Oct 08 21:03:34 crc kubenswrapper[4669]: I1008 21:03:34.023693 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62" containerName="nova-api-api" containerID="cri-o://231f446273b138df5ea86737982fec718cab6214c48ac421537eb1eeaa31e2fe" gracePeriod=30 Oct 08 21:03:34 crc kubenswrapper[4669]: I1008 21:03:34.455266 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 08 21:03:34 crc kubenswrapper[4669]: I1008 21:03:34.820576 4669 generic.go:334] "Generic (PLEG): container finished" podID="8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62" containerID="91c1a30f06d58911609faedbb706e8c17c9f53c1a3b744ccad2f348c217ac5e7" exitCode=143 Oct 08 21:03:34 crc kubenswrapper[4669]: I1008 21:03:34.820661 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62","Type":"ContainerDied","Data":"91c1a30f06d58911609faedbb706e8c17c9f53c1a3b744ccad2f348c217ac5e7"} Oct 08 21:03:35 crc kubenswrapper[4669]: I1008 21:03:35.821341 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 21:03:35 crc kubenswrapper[4669]: I1008 21:03:35.831169 4669 generic.go:334] "Generic (PLEG): container finished" podID="8b83b92c-e256-4a66-9815-27d5a9bb778b" containerID="bb7eca025017ec3ad12278b3ca043127df09eb9836b9bbbe6af5f81ca1355a75" exitCode=0 Oct 08 21:03:35 crc kubenswrapper[4669]: I1008 21:03:35.831216 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8b83b92c-e256-4a66-9815-27d5a9bb778b","Type":"ContainerDied","Data":"bb7eca025017ec3ad12278b3ca043127df09eb9836b9bbbe6af5f81ca1355a75"} Oct 08 21:03:35 crc kubenswrapper[4669]: I1008 21:03:35.831250 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8b83b92c-e256-4a66-9815-27d5a9bb778b","Type":"ContainerDied","Data":"f6f065743f4b840ca13238b7d639f681f0656df4f6984618bc6141ff6f337635"} Oct 08 21:03:35 crc kubenswrapper[4669]: I1008 21:03:35.831270 4669 scope.go:117] "RemoveContainer" containerID="b5ef79f906cf5a7eb8db0043e94cc1c5f63d70e15861c1389a838ed7be0d33ab" Oct 08 21:03:35 crc kubenswrapper[4669]: I1008 21:03:35.831434 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 21:03:35 crc kubenswrapper[4669]: I1008 21:03:35.858440 4669 scope.go:117] "RemoveContainer" containerID="621545cb423a2fc362fa9881e131d83b7f2237b5f91253d64cd03bc95125254f" Oct 08 21:03:35 crc kubenswrapper[4669]: I1008 21:03:35.881629 4669 scope.go:117] "RemoveContainer" containerID="bb7eca025017ec3ad12278b3ca043127df09eb9836b9bbbe6af5f81ca1355a75" Oct 08 21:03:35 crc kubenswrapper[4669]: I1008 21:03:35.906379 4669 scope.go:117] "RemoveContainer" containerID="50563944703db5a36974ec131b1bd036b06618cdd3b9d70ccc1709b83d8f91d9" Oct 08 21:03:35 crc kubenswrapper[4669]: I1008 21:03:35.912295 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b83b92c-e256-4a66-9815-27d5a9bb778b-log-httpd\") pod \"8b83b92c-e256-4a66-9815-27d5a9bb778b\" (UID: \"8b83b92c-e256-4a66-9815-27d5a9bb778b\") " Oct 08 21:03:35 crc kubenswrapper[4669]: I1008 21:03:35.912379 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b83b92c-e256-4a66-9815-27d5a9bb778b-ceilometer-tls-certs\") pod \"8b83b92c-e256-4a66-9815-27d5a9bb778b\" (UID: \"8b83b92c-e256-4a66-9815-27d5a9bb778b\") " Oct 08 21:03:35 crc kubenswrapper[4669]: I1008 21:03:35.912579 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b83b92c-e256-4a66-9815-27d5a9bb778b-config-data\") pod \"8b83b92c-e256-4a66-9815-27d5a9bb778b\" (UID: \"8b83b92c-e256-4a66-9815-27d5a9bb778b\") " Oct 08 21:03:35 crc kubenswrapper[4669]: I1008 21:03:35.912645 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tt8cc\" (UniqueName: \"kubernetes.io/projected/8b83b92c-e256-4a66-9815-27d5a9bb778b-kube-api-access-tt8cc\") pod \"8b83b92c-e256-4a66-9815-27d5a9bb778b\" (UID: 
\"8b83b92c-e256-4a66-9815-27d5a9bb778b\") " Oct 08 21:03:35 crc kubenswrapper[4669]: I1008 21:03:35.912678 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b83b92c-e256-4a66-9815-27d5a9bb778b-combined-ca-bundle\") pod \"8b83b92c-e256-4a66-9815-27d5a9bb778b\" (UID: \"8b83b92c-e256-4a66-9815-27d5a9bb778b\") " Oct 08 21:03:35 crc kubenswrapper[4669]: I1008 21:03:35.912713 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b83b92c-e256-4a66-9815-27d5a9bb778b-run-httpd\") pod \"8b83b92c-e256-4a66-9815-27d5a9bb778b\" (UID: \"8b83b92c-e256-4a66-9815-27d5a9bb778b\") " Oct 08 21:03:35 crc kubenswrapper[4669]: I1008 21:03:35.912761 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b83b92c-e256-4a66-9815-27d5a9bb778b-scripts\") pod \"8b83b92c-e256-4a66-9815-27d5a9bb778b\" (UID: \"8b83b92c-e256-4a66-9815-27d5a9bb778b\") " Oct 08 21:03:35 crc kubenswrapper[4669]: I1008 21:03:35.912861 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8b83b92c-e256-4a66-9815-27d5a9bb778b-sg-core-conf-yaml\") pod \"8b83b92c-e256-4a66-9815-27d5a9bb778b\" (UID: \"8b83b92c-e256-4a66-9815-27d5a9bb778b\") " Oct 08 21:03:35 crc kubenswrapper[4669]: I1008 21:03:35.913919 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b83b92c-e256-4a66-9815-27d5a9bb778b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8b83b92c-e256-4a66-9815-27d5a9bb778b" (UID: "8b83b92c-e256-4a66-9815-27d5a9bb778b"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:03:35 crc kubenswrapper[4669]: I1008 21:03:35.914046 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b83b92c-e256-4a66-9815-27d5a9bb778b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8b83b92c-e256-4a66-9815-27d5a9bb778b" (UID: "8b83b92c-e256-4a66-9815-27d5a9bb778b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:03:35 crc kubenswrapper[4669]: I1008 21:03:35.918676 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b83b92c-e256-4a66-9815-27d5a9bb778b-scripts" (OuterVolumeSpecName: "scripts") pod "8b83b92c-e256-4a66-9815-27d5a9bb778b" (UID: "8b83b92c-e256-4a66-9815-27d5a9bb778b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:03:35 crc kubenswrapper[4669]: I1008 21:03:35.923253 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b83b92c-e256-4a66-9815-27d5a9bb778b-kube-api-access-tt8cc" (OuterVolumeSpecName: "kube-api-access-tt8cc") pod "8b83b92c-e256-4a66-9815-27d5a9bb778b" (UID: "8b83b92c-e256-4a66-9815-27d5a9bb778b"). InnerVolumeSpecName "kube-api-access-tt8cc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:03:35 crc kubenswrapper[4669]: I1008 21:03:35.929771 4669 scope.go:117] "RemoveContainer" containerID="b5ef79f906cf5a7eb8db0043e94cc1c5f63d70e15861c1389a838ed7be0d33ab" Oct 08 21:03:35 crc kubenswrapper[4669]: E1008 21:03:35.934648 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5ef79f906cf5a7eb8db0043e94cc1c5f63d70e15861c1389a838ed7be0d33ab\": container with ID starting with b5ef79f906cf5a7eb8db0043e94cc1c5f63d70e15861c1389a838ed7be0d33ab not found: ID does not exist" containerID="b5ef79f906cf5a7eb8db0043e94cc1c5f63d70e15861c1389a838ed7be0d33ab" Oct 08 21:03:35 crc kubenswrapper[4669]: I1008 21:03:35.934756 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5ef79f906cf5a7eb8db0043e94cc1c5f63d70e15861c1389a838ed7be0d33ab"} err="failed to get container status \"b5ef79f906cf5a7eb8db0043e94cc1c5f63d70e15861c1389a838ed7be0d33ab\": rpc error: code = NotFound desc = could not find container \"b5ef79f906cf5a7eb8db0043e94cc1c5f63d70e15861c1389a838ed7be0d33ab\": container with ID starting with b5ef79f906cf5a7eb8db0043e94cc1c5f63d70e15861c1389a838ed7be0d33ab not found: ID does not exist" Oct 08 21:03:35 crc kubenswrapper[4669]: I1008 21:03:35.934851 4669 scope.go:117] "RemoveContainer" containerID="621545cb423a2fc362fa9881e131d83b7f2237b5f91253d64cd03bc95125254f" Oct 08 21:03:35 crc kubenswrapper[4669]: E1008 21:03:35.935226 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"621545cb423a2fc362fa9881e131d83b7f2237b5f91253d64cd03bc95125254f\": container with ID starting with 621545cb423a2fc362fa9881e131d83b7f2237b5f91253d64cd03bc95125254f not found: ID does not exist" containerID="621545cb423a2fc362fa9881e131d83b7f2237b5f91253d64cd03bc95125254f" Oct 08 21:03:35 crc kubenswrapper[4669]: I1008 21:03:35.935501 
4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"621545cb423a2fc362fa9881e131d83b7f2237b5f91253d64cd03bc95125254f"} err="failed to get container status \"621545cb423a2fc362fa9881e131d83b7f2237b5f91253d64cd03bc95125254f\": rpc error: code = NotFound desc = could not find container \"621545cb423a2fc362fa9881e131d83b7f2237b5f91253d64cd03bc95125254f\": container with ID starting with 621545cb423a2fc362fa9881e131d83b7f2237b5f91253d64cd03bc95125254f not found: ID does not exist" Oct 08 21:03:35 crc kubenswrapper[4669]: I1008 21:03:35.935605 4669 scope.go:117] "RemoveContainer" containerID="bb7eca025017ec3ad12278b3ca043127df09eb9836b9bbbe6af5f81ca1355a75" Oct 08 21:03:35 crc kubenswrapper[4669]: E1008 21:03:35.937155 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb7eca025017ec3ad12278b3ca043127df09eb9836b9bbbe6af5f81ca1355a75\": container with ID starting with bb7eca025017ec3ad12278b3ca043127df09eb9836b9bbbe6af5f81ca1355a75 not found: ID does not exist" containerID="bb7eca025017ec3ad12278b3ca043127df09eb9836b9bbbe6af5f81ca1355a75" Oct 08 21:03:35 crc kubenswrapper[4669]: I1008 21:03:35.937261 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb7eca025017ec3ad12278b3ca043127df09eb9836b9bbbe6af5f81ca1355a75"} err="failed to get container status \"bb7eca025017ec3ad12278b3ca043127df09eb9836b9bbbe6af5f81ca1355a75\": rpc error: code = NotFound desc = could not find container \"bb7eca025017ec3ad12278b3ca043127df09eb9836b9bbbe6af5f81ca1355a75\": container with ID starting with bb7eca025017ec3ad12278b3ca043127df09eb9836b9bbbe6af5f81ca1355a75 not found: ID does not exist" Oct 08 21:03:35 crc kubenswrapper[4669]: I1008 21:03:35.937331 4669 scope.go:117] "RemoveContainer" containerID="50563944703db5a36974ec131b1bd036b06618cdd3b9d70ccc1709b83d8f91d9" Oct 08 21:03:35 crc kubenswrapper[4669]: E1008 
21:03:35.938232 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50563944703db5a36974ec131b1bd036b06618cdd3b9d70ccc1709b83d8f91d9\": container with ID starting with 50563944703db5a36974ec131b1bd036b06618cdd3b9d70ccc1709b83d8f91d9 not found: ID does not exist" containerID="50563944703db5a36974ec131b1bd036b06618cdd3b9d70ccc1709b83d8f91d9" Oct 08 21:03:35 crc kubenswrapper[4669]: I1008 21:03:35.938286 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50563944703db5a36974ec131b1bd036b06618cdd3b9d70ccc1709b83d8f91d9"} err="failed to get container status \"50563944703db5a36974ec131b1bd036b06618cdd3b9d70ccc1709b83d8f91d9\": rpc error: code = NotFound desc = could not find container \"50563944703db5a36974ec131b1bd036b06618cdd3b9d70ccc1709b83d8f91d9\": container with ID starting with 50563944703db5a36974ec131b1bd036b06618cdd3b9d70ccc1709b83d8f91d9 not found: ID does not exist" Oct 08 21:03:35 crc kubenswrapper[4669]: I1008 21:03:35.940988 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b83b92c-e256-4a66-9815-27d5a9bb778b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8b83b92c-e256-4a66-9815-27d5a9bb778b" (UID: "8b83b92c-e256-4a66-9815-27d5a9bb778b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:03:35 crc kubenswrapper[4669]: I1008 21:03:35.971577 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b83b92c-e256-4a66-9815-27d5a9bb778b-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "8b83b92c-e256-4a66-9815-27d5a9bb778b" (UID: "8b83b92c-e256-4a66-9815-27d5a9bb778b"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:03:35 crc kubenswrapper[4669]: I1008 21:03:35.984101 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b83b92c-e256-4a66-9815-27d5a9bb778b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b83b92c-e256-4a66-9815-27d5a9bb778b" (UID: "8b83b92c-e256-4a66-9815-27d5a9bb778b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:03:36 crc kubenswrapper[4669]: I1008 21:03:36.016257 4669 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8b83b92c-e256-4a66-9815-27d5a9bb778b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:36 crc kubenswrapper[4669]: I1008 21:03:36.016296 4669 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b83b92c-e256-4a66-9815-27d5a9bb778b-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:36 crc kubenswrapper[4669]: I1008 21:03:36.016310 4669 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b83b92c-e256-4a66-9815-27d5a9bb778b-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:36 crc kubenswrapper[4669]: I1008 21:03:36.016323 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tt8cc\" (UniqueName: \"kubernetes.io/projected/8b83b92c-e256-4a66-9815-27d5a9bb778b-kube-api-access-tt8cc\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:36 crc kubenswrapper[4669]: I1008 21:03:36.016337 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b83b92c-e256-4a66-9815-27d5a9bb778b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:36 crc kubenswrapper[4669]: I1008 21:03:36.016348 4669 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/8b83b92c-e256-4a66-9815-27d5a9bb778b-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:36 crc kubenswrapper[4669]: I1008 21:03:36.016359 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b83b92c-e256-4a66-9815-27d5a9bb778b-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:36 crc kubenswrapper[4669]: I1008 21:03:36.016751 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b83b92c-e256-4a66-9815-27d5a9bb778b-config-data" (OuterVolumeSpecName: "config-data") pod "8b83b92c-e256-4a66-9815-27d5a9bb778b" (UID: "8b83b92c-e256-4a66-9815-27d5a9bb778b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:03:36 crc kubenswrapper[4669]: I1008 21:03:36.117762 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b83b92c-e256-4a66-9815-27d5a9bb778b-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:36 crc kubenswrapper[4669]: I1008 21:03:36.160898 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 21:03:36 crc kubenswrapper[4669]: I1008 21:03:36.169222 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 08 21:03:36 crc kubenswrapper[4669]: I1008 21:03:36.192608 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 08 21:03:36 crc kubenswrapper[4669]: E1008 21:03:36.193048 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b83b92c-e256-4a66-9815-27d5a9bb778b" containerName="ceilometer-central-agent" Oct 08 21:03:36 crc kubenswrapper[4669]: I1008 21:03:36.193077 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b83b92c-e256-4a66-9815-27d5a9bb778b" containerName="ceilometer-central-agent" Oct 08 21:03:36 crc kubenswrapper[4669]: E1008 21:03:36.193101 4669 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b83b92c-e256-4a66-9815-27d5a9bb778b" containerName="proxy-httpd" Oct 08 21:03:36 crc kubenswrapper[4669]: I1008 21:03:36.193108 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b83b92c-e256-4a66-9815-27d5a9bb778b" containerName="proxy-httpd" Oct 08 21:03:36 crc kubenswrapper[4669]: E1008 21:03:36.193139 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b83b92c-e256-4a66-9815-27d5a9bb778b" containerName="ceilometer-notification-agent" Oct 08 21:03:36 crc kubenswrapper[4669]: I1008 21:03:36.193146 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b83b92c-e256-4a66-9815-27d5a9bb778b" containerName="ceilometer-notification-agent" Oct 08 21:03:36 crc kubenswrapper[4669]: E1008 21:03:36.193153 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b83b92c-e256-4a66-9815-27d5a9bb778b" containerName="sg-core" Oct 08 21:03:36 crc kubenswrapper[4669]: I1008 21:03:36.193159 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b83b92c-e256-4a66-9815-27d5a9bb778b" containerName="sg-core" Oct 08 21:03:36 crc kubenswrapper[4669]: I1008 21:03:36.193312 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b83b92c-e256-4a66-9815-27d5a9bb778b" containerName="ceilometer-notification-agent" Oct 08 21:03:36 crc kubenswrapper[4669]: I1008 21:03:36.193335 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b83b92c-e256-4a66-9815-27d5a9bb778b" containerName="ceilometer-central-agent" Oct 08 21:03:36 crc kubenswrapper[4669]: I1008 21:03:36.193348 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b83b92c-e256-4a66-9815-27d5a9bb778b" containerName="proxy-httpd" Oct 08 21:03:36 crc kubenswrapper[4669]: I1008 21:03:36.193358 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b83b92c-e256-4a66-9815-27d5a9bb778b" containerName="sg-core" Oct 08 21:03:36 crc kubenswrapper[4669]: I1008 
21:03:36.195293 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 21:03:36 crc kubenswrapper[4669]: I1008 21:03:36.197246 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 08 21:03:36 crc kubenswrapper[4669]: I1008 21:03:36.198376 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 08 21:03:36 crc kubenswrapper[4669]: I1008 21:03:36.203801 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 08 21:03:36 crc kubenswrapper[4669]: I1008 21:03:36.218195 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 21:03:36 crc kubenswrapper[4669]: I1008 21:03:36.219083 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e6cd0ba-8231-4bc6-bab5-83f4b8740c01-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5e6cd0ba-8231-4bc6-bab5-83f4b8740c01\") " pod="openstack/ceilometer-0" Oct 08 21:03:36 crc kubenswrapper[4669]: I1008 21:03:36.219121 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e6cd0ba-8231-4bc6-bab5-83f4b8740c01-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5e6cd0ba-8231-4bc6-bab5-83f4b8740c01\") " pod="openstack/ceilometer-0" Oct 08 21:03:36 crc kubenswrapper[4669]: I1008 21:03:36.219159 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e6cd0ba-8231-4bc6-bab5-83f4b8740c01-config-data\") pod \"ceilometer-0\" (UID: \"5e6cd0ba-8231-4bc6-bab5-83f4b8740c01\") " pod="openstack/ceilometer-0" Oct 08 21:03:36 crc kubenswrapper[4669]: I1008 21:03:36.219205 4669 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e6cd0ba-8231-4bc6-bab5-83f4b8740c01-scripts\") pod \"ceilometer-0\" (UID: \"5e6cd0ba-8231-4bc6-bab5-83f4b8740c01\") " pod="openstack/ceilometer-0" Oct 08 21:03:36 crc kubenswrapper[4669]: I1008 21:03:36.219245 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e6cd0ba-8231-4bc6-bab5-83f4b8740c01-run-httpd\") pod \"ceilometer-0\" (UID: \"5e6cd0ba-8231-4bc6-bab5-83f4b8740c01\") " pod="openstack/ceilometer-0" Oct 08 21:03:36 crc kubenswrapper[4669]: I1008 21:03:36.219273 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e6cd0ba-8231-4bc6-bab5-83f4b8740c01-log-httpd\") pod \"ceilometer-0\" (UID: \"5e6cd0ba-8231-4bc6-bab5-83f4b8740c01\") " pod="openstack/ceilometer-0" Oct 08 21:03:36 crc kubenswrapper[4669]: I1008 21:03:36.219303 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjzl4\" (UniqueName: \"kubernetes.io/projected/5e6cd0ba-8231-4bc6-bab5-83f4b8740c01-kube-api-access-zjzl4\") pod \"ceilometer-0\" (UID: \"5e6cd0ba-8231-4bc6-bab5-83f4b8740c01\") " pod="openstack/ceilometer-0" Oct 08 21:03:36 crc kubenswrapper[4669]: I1008 21:03:36.219360 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5e6cd0ba-8231-4bc6-bab5-83f4b8740c01-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5e6cd0ba-8231-4bc6-bab5-83f4b8740c01\") " pod="openstack/ceilometer-0" Oct 08 21:03:36 crc kubenswrapper[4669]: I1008 21:03:36.321373 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/5e6cd0ba-8231-4bc6-bab5-83f4b8740c01-log-httpd\") pod \"ceilometer-0\" (UID: \"5e6cd0ba-8231-4bc6-bab5-83f4b8740c01\") " pod="openstack/ceilometer-0" Oct 08 21:03:36 crc kubenswrapper[4669]: I1008 21:03:36.321434 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjzl4\" (UniqueName: \"kubernetes.io/projected/5e6cd0ba-8231-4bc6-bab5-83f4b8740c01-kube-api-access-zjzl4\") pod \"ceilometer-0\" (UID: \"5e6cd0ba-8231-4bc6-bab5-83f4b8740c01\") " pod="openstack/ceilometer-0" Oct 08 21:03:36 crc kubenswrapper[4669]: I1008 21:03:36.321504 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5e6cd0ba-8231-4bc6-bab5-83f4b8740c01-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5e6cd0ba-8231-4bc6-bab5-83f4b8740c01\") " pod="openstack/ceilometer-0" Oct 08 21:03:36 crc kubenswrapper[4669]: I1008 21:03:36.321600 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e6cd0ba-8231-4bc6-bab5-83f4b8740c01-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5e6cd0ba-8231-4bc6-bab5-83f4b8740c01\") " pod="openstack/ceilometer-0" Oct 08 21:03:36 crc kubenswrapper[4669]: I1008 21:03:36.321625 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e6cd0ba-8231-4bc6-bab5-83f4b8740c01-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5e6cd0ba-8231-4bc6-bab5-83f4b8740c01\") " pod="openstack/ceilometer-0" Oct 08 21:03:36 crc kubenswrapper[4669]: I1008 21:03:36.321660 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e6cd0ba-8231-4bc6-bab5-83f4b8740c01-config-data\") pod \"ceilometer-0\" (UID: \"5e6cd0ba-8231-4bc6-bab5-83f4b8740c01\") " pod="openstack/ceilometer-0" Oct 08 21:03:36 
crc kubenswrapper[4669]: I1008 21:03:36.321701 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e6cd0ba-8231-4bc6-bab5-83f4b8740c01-scripts\") pod \"ceilometer-0\" (UID: \"5e6cd0ba-8231-4bc6-bab5-83f4b8740c01\") " pod="openstack/ceilometer-0" Oct 08 21:03:36 crc kubenswrapper[4669]: I1008 21:03:36.321739 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e6cd0ba-8231-4bc6-bab5-83f4b8740c01-run-httpd\") pod \"ceilometer-0\" (UID: \"5e6cd0ba-8231-4bc6-bab5-83f4b8740c01\") " pod="openstack/ceilometer-0" Oct 08 21:03:36 crc kubenswrapper[4669]: I1008 21:03:36.321858 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e6cd0ba-8231-4bc6-bab5-83f4b8740c01-log-httpd\") pod \"ceilometer-0\" (UID: \"5e6cd0ba-8231-4bc6-bab5-83f4b8740c01\") " pod="openstack/ceilometer-0" Oct 08 21:03:36 crc kubenswrapper[4669]: I1008 21:03:36.322019 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e6cd0ba-8231-4bc6-bab5-83f4b8740c01-run-httpd\") pod \"ceilometer-0\" (UID: \"5e6cd0ba-8231-4bc6-bab5-83f4b8740c01\") " pod="openstack/ceilometer-0" Oct 08 21:03:36 crc kubenswrapper[4669]: I1008 21:03:36.326971 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e6cd0ba-8231-4bc6-bab5-83f4b8740c01-scripts\") pod \"ceilometer-0\" (UID: \"5e6cd0ba-8231-4bc6-bab5-83f4b8740c01\") " pod="openstack/ceilometer-0" Oct 08 21:03:36 crc kubenswrapper[4669]: I1008 21:03:36.327039 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5e6cd0ba-8231-4bc6-bab5-83f4b8740c01-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"5e6cd0ba-8231-4bc6-bab5-83f4b8740c01\") " pod="openstack/ceilometer-0" Oct 08 21:03:36 crc kubenswrapper[4669]: I1008 21:03:36.327271 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e6cd0ba-8231-4bc6-bab5-83f4b8740c01-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5e6cd0ba-8231-4bc6-bab5-83f4b8740c01\") " pod="openstack/ceilometer-0" Oct 08 21:03:36 crc kubenswrapper[4669]: I1008 21:03:36.327754 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e6cd0ba-8231-4bc6-bab5-83f4b8740c01-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5e6cd0ba-8231-4bc6-bab5-83f4b8740c01\") " pod="openstack/ceilometer-0" Oct 08 21:03:36 crc kubenswrapper[4669]: I1008 21:03:36.328300 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e6cd0ba-8231-4bc6-bab5-83f4b8740c01-config-data\") pod \"ceilometer-0\" (UID: \"5e6cd0ba-8231-4bc6-bab5-83f4b8740c01\") " pod="openstack/ceilometer-0" Oct 08 21:03:36 crc kubenswrapper[4669]: I1008 21:03:36.340620 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjzl4\" (UniqueName: \"kubernetes.io/projected/5e6cd0ba-8231-4bc6-bab5-83f4b8740c01-kube-api-access-zjzl4\") pod \"ceilometer-0\" (UID: \"5e6cd0ba-8231-4bc6-bab5-83f4b8740c01\") " pod="openstack/ceilometer-0" Oct 08 21:03:36 crc kubenswrapper[4669]: I1008 21:03:36.513356 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 21:03:37 crc kubenswrapper[4669]: I1008 21:03:37.019725 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 21:03:37 crc kubenswrapper[4669]: W1008 21:03:37.022443 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e6cd0ba_8231_4bc6_bab5_83f4b8740c01.slice/crio-a838594b1e0b9b452ac8d42964165482f0ca7c56f5362972038cfb5bfcafd3cd WatchSource:0}: Error finding container a838594b1e0b9b452ac8d42964165482f0ca7c56f5362972038cfb5bfcafd3cd: Status 404 returned error can't find the container with id a838594b1e0b9b452ac8d42964165482f0ca7c56f5362972038cfb5bfcafd3cd Oct 08 21:03:37 crc kubenswrapper[4669]: I1008 21:03:37.342484 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b83b92c-e256-4a66-9815-27d5a9bb778b" path="/var/lib/kubelet/pods/8b83b92c-e256-4a66-9815-27d5a9bb778b/volumes" Oct 08 21:03:37 crc kubenswrapper[4669]: I1008 21:03:37.721600 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 08 21:03:37 crc kubenswrapper[4669]: I1008 21:03:37.857826 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5e6cd0ba-8231-4bc6-bab5-83f4b8740c01","Type":"ContainerStarted","Data":"a838594b1e0b9b452ac8d42964165482f0ca7c56f5362972038cfb5bfcafd3cd"} Oct 08 21:03:37 crc kubenswrapper[4669]: I1008 21:03:37.860271 4669 generic.go:334] "Generic (PLEG): container finished" podID="8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62" containerID="231f446273b138df5ea86737982fec718cab6214c48ac421537eb1eeaa31e2fe" exitCode=0 Oct 08 21:03:37 crc kubenswrapper[4669]: I1008 21:03:37.860309 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62","Type":"ContainerDied","Data":"231f446273b138df5ea86737982fec718cab6214c48ac421537eb1eeaa31e2fe"} Oct 08 21:03:37 crc kubenswrapper[4669]: I1008 21:03:37.860330 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62","Type":"ContainerDied","Data":"91cf0492747d51d5574270b9aaef86696f620735955abf8ae7ff61b930cf9582"} Oct 08 21:03:37 crc kubenswrapper[4669]: I1008 21:03:37.860351 4669 scope.go:117] "RemoveContainer" containerID="231f446273b138df5ea86737982fec718cab6214c48ac421537eb1eeaa31e2fe" Oct 08 21:03:37 crc kubenswrapper[4669]: I1008 21:03:37.860479 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 08 21:03:37 crc kubenswrapper[4669]: I1008 21:03:37.862384 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftn9n\" (UniqueName: \"kubernetes.io/projected/8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62-kube-api-access-ftn9n\") pod \"8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62\" (UID: \"8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62\") " Oct 08 21:03:37 crc kubenswrapper[4669]: I1008 21:03:37.862479 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62-combined-ca-bundle\") pod \"8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62\" (UID: \"8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62\") " Oct 08 21:03:37 crc kubenswrapper[4669]: I1008 21:03:37.862633 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62-config-data\") pod \"8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62\" (UID: \"8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62\") " Oct 08 21:03:37 crc kubenswrapper[4669]: I1008 21:03:37.862689 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62-logs\") pod \"8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62\" (UID: \"8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62\") " Oct 08 21:03:37 crc kubenswrapper[4669]: I1008 21:03:37.864603 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62-logs" (OuterVolumeSpecName: "logs") pod "8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62" (UID: "8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:03:37 crc kubenswrapper[4669]: I1008 21:03:37.871766 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62-kube-api-access-ftn9n" (OuterVolumeSpecName: "kube-api-access-ftn9n") pod "8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62" (UID: "8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62"). InnerVolumeSpecName "kube-api-access-ftn9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:03:37 crc kubenswrapper[4669]: I1008 21:03:37.902978 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62" (UID: "8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:03:37 crc kubenswrapper[4669]: I1008 21:03:37.903628 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62-config-data" (OuterVolumeSpecName: "config-data") pod "8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62" (UID: "8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:03:37 crc kubenswrapper[4669]: I1008 21:03:37.924051 4669 scope.go:117] "RemoveContainer" containerID="91c1a30f06d58911609faedbb706e8c17c9f53c1a3b744ccad2f348c217ac5e7" Oct 08 21:03:37 crc kubenswrapper[4669]: I1008 21:03:37.948643 4669 scope.go:117] "RemoveContainer" containerID="231f446273b138df5ea86737982fec718cab6214c48ac421537eb1eeaa31e2fe" Oct 08 21:03:37 crc kubenswrapper[4669]: E1008 21:03:37.949443 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"231f446273b138df5ea86737982fec718cab6214c48ac421537eb1eeaa31e2fe\": container with ID starting with 231f446273b138df5ea86737982fec718cab6214c48ac421537eb1eeaa31e2fe not found: ID does not exist" containerID="231f446273b138df5ea86737982fec718cab6214c48ac421537eb1eeaa31e2fe" Oct 08 21:03:37 crc kubenswrapper[4669]: I1008 21:03:37.949523 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"231f446273b138df5ea86737982fec718cab6214c48ac421537eb1eeaa31e2fe"} err="failed to get container status \"231f446273b138df5ea86737982fec718cab6214c48ac421537eb1eeaa31e2fe\": rpc error: code = NotFound desc = could not find container \"231f446273b138df5ea86737982fec718cab6214c48ac421537eb1eeaa31e2fe\": container with ID starting with 231f446273b138df5ea86737982fec718cab6214c48ac421537eb1eeaa31e2fe not found: ID does not exist" Oct 08 21:03:37 crc kubenswrapper[4669]: I1008 21:03:37.949569 4669 scope.go:117] "RemoveContainer" containerID="91c1a30f06d58911609faedbb706e8c17c9f53c1a3b744ccad2f348c217ac5e7" Oct 08 21:03:37 crc kubenswrapper[4669]: E1008 21:03:37.949868 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91c1a30f06d58911609faedbb706e8c17c9f53c1a3b744ccad2f348c217ac5e7\": container with ID starting with 
91c1a30f06d58911609faedbb706e8c17c9f53c1a3b744ccad2f348c217ac5e7 not found: ID does not exist" containerID="91c1a30f06d58911609faedbb706e8c17c9f53c1a3b744ccad2f348c217ac5e7" Oct 08 21:03:37 crc kubenswrapper[4669]: I1008 21:03:37.949902 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91c1a30f06d58911609faedbb706e8c17c9f53c1a3b744ccad2f348c217ac5e7"} err="failed to get container status \"91c1a30f06d58911609faedbb706e8c17c9f53c1a3b744ccad2f348c217ac5e7\": rpc error: code = NotFound desc = could not find container \"91c1a30f06d58911609faedbb706e8c17c9f53c1a3b744ccad2f348c217ac5e7\": container with ID starting with 91c1a30f06d58911609faedbb706e8c17c9f53c1a3b744ccad2f348c217ac5e7 not found: ID does not exist" Oct 08 21:03:37 crc kubenswrapper[4669]: I1008 21:03:37.964715 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:37 crc kubenswrapper[4669]: I1008 21:03:37.964752 4669 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62-logs\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:37 crc kubenswrapper[4669]: I1008 21:03:37.964762 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftn9n\" (UniqueName: \"kubernetes.io/projected/8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62-kube-api-access-ftn9n\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:37 crc kubenswrapper[4669]: I1008 21:03:37.964774 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:38 crc kubenswrapper[4669]: I1008 21:03:38.192280 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 08 21:03:38 crc 
kubenswrapper[4669]: I1008 21:03:38.199088 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 08 21:03:38 crc kubenswrapper[4669]: I1008 21:03:38.222581 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 08 21:03:38 crc kubenswrapper[4669]: E1008 21:03:38.222955 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62" containerName="nova-api-api" Oct 08 21:03:38 crc kubenswrapper[4669]: I1008 21:03:38.222970 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62" containerName="nova-api-api" Oct 08 21:03:38 crc kubenswrapper[4669]: E1008 21:03:38.223008 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62" containerName="nova-api-log" Oct 08 21:03:38 crc kubenswrapper[4669]: I1008 21:03:38.223015 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62" containerName="nova-api-log" Oct 08 21:03:38 crc kubenswrapper[4669]: I1008 21:03:38.223206 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62" containerName="nova-api-api" Oct 08 21:03:38 crc kubenswrapper[4669]: I1008 21:03:38.223275 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62" containerName="nova-api-log" Oct 08 21:03:38 crc kubenswrapper[4669]: I1008 21:03:38.224228 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 08 21:03:38 crc kubenswrapper[4669]: I1008 21:03:38.227878 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 08 21:03:38 crc kubenswrapper[4669]: I1008 21:03:38.227930 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 08 21:03:38 crc kubenswrapper[4669]: I1008 21:03:38.228078 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 08 21:03:38 crc kubenswrapper[4669]: I1008 21:03:38.231227 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 08 21:03:38 crc kubenswrapper[4669]: I1008 21:03:38.372392 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7364db5-5324-45dc-bba7-13c44cfc9efb-public-tls-certs\") pod \"nova-api-0\" (UID: \"d7364db5-5324-45dc-bba7-13c44cfc9efb\") " pod="openstack/nova-api-0" Oct 08 21:03:38 crc kubenswrapper[4669]: I1008 21:03:38.372442 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7364db5-5324-45dc-bba7-13c44cfc9efb-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d7364db5-5324-45dc-bba7-13c44cfc9efb\") " pod="openstack/nova-api-0" Oct 08 21:03:38 crc kubenswrapper[4669]: I1008 21:03:38.372490 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7364db5-5324-45dc-bba7-13c44cfc9efb-config-data\") pod \"nova-api-0\" (UID: \"d7364db5-5324-45dc-bba7-13c44cfc9efb\") " pod="openstack/nova-api-0" Oct 08 21:03:38 crc kubenswrapper[4669]: I1008 21:03:38.372629 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/d7364db5-5324-45dc-bba7-13c44cfc9efb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d7364db5-5324-45dc-bba7-13c44cfc9efb\") " pod="openstack/nova-api-0" Oct 08 21:03:38 crc kubenswrapper[4669]: I1008 21:03:38.372749 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7364db5-5324-45dc-bba7-13c44cfc9efb-logs\") pod \"nova-api-0\" (UID: \"d7364db5-5324-45dc-bba7-13c44cfc9efb\") " pod="openstack/nova-api-0" Oct 08 21:03:38 crc kubenswrapper[4669]: I1008 21:03:38.372797 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n88v\" (UniqueName: \"kubernetes.io/projected/d7364db5-5324-45dc-bba7-13c44cfc9efb-kube-api-access-6n88v\") pod \"nova-api-0\" (UID: \"d7364db5-5324-45dc-bba7-13c44cfc9efb\") " pod="openstack/nova-api-0" Oct 08 21:03:38 crc kubenswrapper[4669]: I1008 21:03:38.475149 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7364db5-5324-45dc-bba7-13c44cfc9efb-public-tls-certs\") pod \"nova-api-0\" (UID: \"d7364db5-5324-45dc-bba7-13c44cfc9efb\") " pod="openstack/nova-api-0" Oct 08 21:03:38 crc kubenswrapper[4669]: I1008 21:03:38.475236 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7364db5-5324-45dc-bba7-13c44cfc9efb-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d7364db5-5324-45dc-bba7-13c44cfc9efb\") " pod="openstack/nova-api-0" Oct 08 21:03:38 crc kubenswrapper[4669]: I1008 21:03:38.475332 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7364db5-5324-45dc-bba7-13c44cfc9efb-config-data\") pod \"nova-api-0\" (UID: \"d7364db5-5324-45dc-bba7-13c44cfc9efb\") " pod="openstack/nova-api-0" Oct 08 
21:03:38 crc kubenswrapper[4669]: I1008 21:03:38.475373 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7364db5-5324-45dc-bba7-13c44cfc9efb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d7364db5-5324-45dc-bba7-13c44cfc9efb\") " pod="openstack/nova-api-0" Oct 08 21:03:38 crc kubenswrapper[4669]: I1008 21:03:38.475429 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7364db5-5324-45dc-bba7-13c44cfc9efb-logs\") pod \"nova-api-0\" (UID: \"d7364db5-5324-45dc-bba7-13c44cfc9efb\") " pod="openstack/nova-api-0" Oct 08 21:03:38 crc kubenswrapper[4669]: I1008 21:03:38.475474 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n88v\" (UniqueName: \"kubernetes.io/projected/d7364db5-5324-45dc-bba7-13c44cfc9efb-kube-api-access-6n88v\") pod \"nova-api-0\" (UID: \"d7364db5-5324-45dc-bba7-13c44cfc9efb\") " pod="openstack/nova-api-0" Oct 08 21:03:38 crc kubenswrapper[4669]: I1008 21:03:38.477147 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7364db5-5324-45dc-bba7-13c44cfc9efb-logs\") pod \"nova-api-0\" (UID: \"d7364db5-5324-45dc-bba7-13c44cfc9efb\") " pod="openstack/nova-api-0" Oct 08 21:03:38 crc kubenswrapper[4669]: I1008 21:03:38.486667 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7364db5-5324-45dc-bba7-13c44cfc9efb-config-data\") pod \"nova-api-0\" (UID: \"d7364db5-5324-45dc-bba7-13c44cfc9efb\") " pod="openstack/nova-api-0" Oct 08 21:03:38 crc kubenswrapper[4669]: I1008 21:03:38.489826 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7364db5-5324-45dc-bba7-13c44cfc9efb-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"d7364db5-5324-45dc-bba7-13c44cfc9efb\") " pod="openstack/nova-api-0" Oct 08 21:03:38 crc kubenswrapper[4669]: I1008 21:03:38.492012 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7364db5-5324-45dc-bba7-13c44cfc9efb-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d7364db5-5324-45dc-bba7-13c44cfc9efb\") " pod="openstack/nova-api-0" Oct 08 21:03:38 crc kubenswrapper[4669]: I1008 21:03:38.495165 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7364db5-5324-45dc-bba7-13c44cfc9efb-public-tls-certs\") pod \"nova-api-0\" (UID: \"d7364db5-5324-45dc-bba7-13c44cfc9efb\") " pod="openstack/nova-api-0" Oct 08 21:03:38 crc kubenswrapper[4669]: I1008 21:03:38.512048 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n88v\" (UniqueName: \"kubernetes.io/projected/d7364db5-5324-45dc-bba7-13c44cfc9efb-kube-api-access-6n88v\") pod \"nova-api-0\" (UID: \"d7364db5-5324-45dc-bba7-13c44cfc9efb\") " pod="openstack/nova-api-0" Oct 08 21:03:38 crc kubenswrapper[4669]: I1008 21:03:38.542559 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 08 21:03:38 crc kubenswrapper[4669]: I1008 21:03:38.871922 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5e6cd0ba-8231-4bc6-bab5-83f4b8740c01","Type":"ContainerStarted","Data":"d2f2c6f83cfc1708652108e7e43f74e863bb5e1b0fa2c06910ef026b3be38451"} Oct 08 21:03:38 crc kubenswrapper[4669]: I1008 21:03:38.872267 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5e6cd0ba-8231-4bc6-bab5-83f4b8740c01","Type":"ContainerStarted","Data":"231c91f9cc21832589745b7b6f4506db4d88c5104c0ff7adcda45bb1bd08a6a5"} Oct 08 21:03:39 crc kubenswrapper[4669]: I1008 21:03:39.037134 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 08 21:03:39 crc kubenswrapper[4669]: I1008 21:03:39.343967 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62" path="/var/lib/kubelet/pods/8becc6b6-ff40-4c3e-8f2e-23b11bd9ac62/volumes" Oct 08 21:03:39 crc kubenswrapper[4669]: I1008 21:03:39.451196 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 08 21:03:39 crc kubenswrapper[4669]: I1008 21:03:39.468050 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 08 21:03:39 crc kubenswrapper[4669]: I1008 21:03:39.885763 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5e6cd0ba-8231-4bc6-bab5-83f4b8740c01","Type":"ContainerStarted","Data":"046e78b33087a99ec003e66810abf45afaa566a8f33758f0a7dcdccd777a7f48"} Oct 08 21:03:39 crc kubenswrapper[4669]: I1008 21:03:39.889654 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d7364db5-5324-45dc-bba7-13c44cfc9efb","Type":"ContainerStarted","Data":"36c406151432dc5afede7f1fc652def04671faf609c4dc1d12a5a8a054a0b8d9"} 
Oct 08 21:03:39 crc kubenswrapper[4669]: I1008 21:03:39.889749 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d7364db5-5324-45dc-bba7-13c44cfc9efb","Type":"ContainerStarted","Data":"0ec7972fe277248b40f0ed52d6cc2c7c71f13e550188244f32acbdb5a0e6d98c"} Oct 08 21:03:39 crc kubenswrapper[4669]: I1008 21:03:39.889772 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d7364db5-5324-45dc-bba7-13c44cfc9efb","Type":"ContainerStarted","Data":"81309adef2f566c8df769ccdd08b9db1015bc2dff39372755734748adf5bb1d3"} Oct 08 21:03:39 crc kubenswrapper[4669]: I1008 21:03:39.906402 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 08 21:03:39 crc kubenswrapper[4669]: I1008 21:03:39.911786 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.911758705 podStartE2EDuration="1.911758705s" podCreationTimestamp="2025-10-08 21:03:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:03:39.906912233 +0000 UTC m=+1139.599722946" watchObservedRunningTime="2025-10-08 21:03:39.911758705 +0000 UTC m=+1139.604569398" Oct 08 21:03:40 crc kubenswrapper[4669]: I1008 21:03:40.082637 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-8p7zt"] Oct 08 21:03:40 crc kubenswrapper[4669]: I1008 21:03:40.084552 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-8p7zt" Oct 08 21:03:40 crc kubenswrapper[4669]: I1008 21:03:40.086936 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 08 21:03:40 crc kubenswrapper[4669]: I1008 21:03:40.087208 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 08 21:03:40 crc kubenswrapper[4669]: I1008 21:03:40.096966 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-8p7zt"] Oct 08 21:03:40 crc kubenswrapper[4669]: I1008 21:03:40.208183 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-8p7zt\" (UID: \"7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c\") " pod="openstack/nova-cell1-cell-mapping-8p7zt" Oct 08 21:03:40 crc kubenswrapper[4669]: I1008 21:03:40.208283 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwt9w\" (UniqueName: \"kubernetes.io/projected/7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c-kube-api-access-qwt9w\") pod \"nova-cell1-cell-mapping-8p7zt\" (UID: \"7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c\") " pod="openstack/nova-cell1-cell-mapping-8p7zt" Oct 08 21:03:40 crc kubenswrapper[4669]: I1008 21:03:40.208312 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c-config-data\") pod \"nova-cell1-cell-mapping-8p7zt\" (UID: \"7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c\") " pod="openstack/nova-cell1-cell-mapping-8p7zt" Oct 08 21:03:40 crc kubenswrapper[4669]: I1008 21:03:40.208356 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c-scripts\") pod \"nova-cell1-cell-mapping-8p7zt\" (UID: \"7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c\") " pod="openstack/nova-cell1-cell-mapping-8p7zt" Oct 08 21:03:40 crc kubenswrapper[4669]: I1008 21:03:40.310629 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c-scripts\") pod \"nova-cell1-cell-mapping-8p7zt\" (UID: \"7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c\") " pod="openstack/nova-cell1-cell-mapping-8p7zt" Oct 08 21:03:40 crc kubenswrapper[4669]: I1008 21:03:40.310754 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-8p7zt\" (UID: \"7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c\") " pod="openstack/nova-cell1-cell-mapping-8p7zt" Oct 08 21:03:40 crc kubenswrapper[4669]: I1008 21:03:40.310851 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwt9w\" (UniqueName: \"kubernetes.io/projected/7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c-kube-api-access-qwt9w\") pod \"nova-cell1-cell-mapping-8p7zt\" (UID: \"7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c\") " pod="openstack/nova-cell1-cell-mapping-8p7zt" Oct 08 21:03:40 crc kubenswrapper[4669]: I1008 21:03:40.310877 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c-config-data\") pod \"nova-cell1-cell-mapping-8p7zt\" (UID: \"7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c\") " pod="openstack/nova-cell1-cell-mapping-8p7zt" Oct 08 21:03:40 crc kubenswrapper[4669]: I1008 21:03:40.316074 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c-scripts\") pod \"nova-cell1-cell-mapping-8p7zt\" (UID: \"7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c\") " pod="openstack/nova-cell1-cell-mapping-8p7zt" Oct 08 21:03:40 crc kubenswrapper[4669]: I1008 21:03:40.316297 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-8p7zt\" (UID: \"7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c\") " pod="openstack/nova-cell1-cell-mapping-8p7zt" Oct 08 21:03:40 crc kubenswrapper[4669]: I1008 21:03:40.316516 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c-config-data\") pod \"nova-cell1-cell-mapping-8p7zt\" (UID: \"7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c\") " pod="openstack/nova-cell1-cell-mapping-8p7zt" Oct 08 21:03:40 crc kubenswrapper[4669]: I1008 21:03:40.328831 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwt9w\" (UniqueName: \"kubernetes.io/projected/7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c-kube-api-access-qwt9w\") pod \"nova-cell1-cell-mapping-8p7zt\" (UID: \"7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c\") " pod="openstack/nova-cell1-cell-mapping-8p7zt" Oct 08 21:03:40 crc kubenswrapper[4669]: I1008 21:03:40.403184 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-8p7zt" Oct 08 21:03:40 crc kubenswrapper[4669]: I1008 21:03:40.895902 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-8p7zt"] Oct 08 21:03:41 crc kubenswrapper[4669]: I1008 21:03:41.889722 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c7b6c5df9-vtsv5" Oct 08 21:03:41 crc kubenswrapper[4669]: I1008 21:03:41.905516 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-8p7zt" event={"ID":"7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c","Type":"ContainerStarted","Data":"abd9f109a524eacf76eb809d428126fda858bae591a1abadc18528b1791e5682"} Oct 08 21:03:41 crc kubenswrapper[4669]: I1008 21:03:41.905896 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-8p7zt" event={"ID":"7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c","Type":"ContainerStarted","Data":"2cda689cfec5361b77388c8e873fc72deb3d4691295e4e5eb9cee5636a184631"} Oct 08 21:03:41 crc kubenswrapper[4669]: I1008 21:03:41.909806 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5e6cd0ba-8231-4bc6-bab5-83f4b8740c01","Type":"ContainerStarted","Data":"6390a09ce2912edcade7052e6dd4df37e2e962b5e0ad02104d3be73a540ccfe8"} Oct 08 21:03:41 crc kubenswrapper[4669]: I1008 21:03:41.932488 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.12366083 podStartE2EDuration="5.932472487s" podCreationTimestamp="2025-10-08 21:03:36 +0000 UTC" firstStartedPulling="2025-10-08 21:03:37.025780278 +0000 UTC m=+1136.718590961" lastFinishedPulling="2025-10-08 21:03:40.834591945 +0000 UTC m=+1140.527402618" observedRunningTime="2025-10-08 21:03:41.92965972 +0000 UTC m=+1141.622470393" watchObservedRunningTime="2025-10-08 21:03:41.932472487 +0000 UTC m=+1141.625283160" Oct 08 21:03:41 crc kubenswrapper[4669]: 
I1008 21:03:41.951745 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-8p7zt" podStartSLOduration=1.951725772 podStartE2EDuration="1.951725772s" podCreationTimestamp="2025-10-08 21:03:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:03:41.944515726 +0000 UTC m=+1141.637326399" watchObservedRunningTime="2025-10-08 21:03:41.951725772 +0000 UTC m=+1141.644536455" Oct 08 21:03:41 crc kubenswrapper[4669]: I1008 21:03:41.963036 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-mvqwc"] Oct 08 21:03:41 crc kubenswrapper[4669]: I1008 21:03:41.963262 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-865f5d856f-mvqwc" podUID="b0e2762f-41e9-48e0-aa45-0827f176c311" containerName="dnsmasq-dns" containerID="cri-o://f1cbd5195b48c4af0ade1df1306ba404182668d34f81fe72cca2d900d63cff75" gracePeriod=10 Oct 08 21:03:42 crc kubenswrapper[4669]: I1008 21:03:42.547570 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-mvqwc" Oct 08 21:03:42 crc kubenswrapper[4669]: I1008 21:03:42.656720 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b0e2762f-41e9-48e0-aa45-0827f176c311-ovsdbserver-sb\") pod \"b0e2762f-41e9-48e0-aa45-0827f176c311\" (UID: \"b0e2762f-41e9-48e0-aa45-0827f176c311\") " Oct 08 21:03:42 crc kubenswrapper[4669]: I1008 21:03:42.656836 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87kq6\" (UniqueName: \"kubernetes.io/projected/b0e2762f-41e9-48e0-aa45-0827f176c311-kube-api-access-87kq6\") pod \"b0e2762f-41e9-48e0-aa45-0827f176c311\" (UID: \"b0e2762f-41e9-48e0-aa45-0827f176c311\") " Oct 08 21:03:42 crc kubenswrapper[4669]: I1008 21:03:42.656881 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0e2762f-41e9-48e0-aa45-0827f176c311-ovsdbserver-nb\") pod \"b0e2762f-41e9-48e0-aa45-0827f176c311\" (UID: \"b0e2762f-41e9-48e0-aa45-0827f176c311\") " Oct 08 21:03:42 crc kubenswrapper[4669]: I1008 21:03:42.656996 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b0e2762f-41e9-48e0-aa45-0827f176c311-dns-swift-storage-0\") pod \"b0e2762f-41e9-48e0-aa45-0827f176c311\" (UID: \"b0e2762f-41e9-48e0-aa45-0827f176c311\") " Oct 08 21:03:42 crc kubenswrapper[4669]: I1008 21:03:42.657058 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0e2762f-41e9-48e0-aa45-0827f176c311-dns-svc\") pod \"b0e2762f-41e9-48e0-aa45-0827f176c311\" (UID: \"b0e2762f-41e9-48e0-aa45-0827f176c311\") " Oct 08 21:03:42 crc kubenswrapper[4669]: I1008 21:03:42.657119 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/b0e2762f-41e9-48e0-aa45-0827f176c311-config\") pod \"b0e2762f-41e9-48e0-aa45-0827f176c311\" (UID: \"b0e2762f-41e9-48e0-aa45-0827f176c311\") " Oct 08 21:03:42 crc kubenswrapper[4669]: I1008 21:03:42.662863 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0e2762f-41e9-48e0-aa45-0827f176c311-kube-api-access-87kq6" (OuterVolumeSpecName: "kube-api-access-87kq6") pod "b0e2762f-41e9-48e0-aa45-0827f176c311" (UID: "b0e2762f-41e9-48e0-aa45-0827f176c311"). InnerVolumeSpecName "kube-api-access-87kq6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:03:42 crc kubenswrapper[4669]: I1008 21:03:42.707179 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0e2762f-41e9-48e0-aa45-0827f176c311-config" (OuterVolumeSpecName: "config") pod "b0e2762f-41e9-48e0-aa45-0827f176c311" (UID: "b0e2762f-41e9-48e0-aa45-0827f176c311"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:03:42 crc kubenswrapper[4669]: I1008 21:03:42.710008 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0e2762f-41e9-48e0-aa45-0827f176c311-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b0e2762f-41e9-48e0-aa45-0827f176c311" (UID: "b0e2762f-41e9-48e0-aa45-0827f176c311"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:03:42 crc kubenswrapper[4669]: I1008 21:03:42.710911 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0e2762f-41e9-48e0-aa45-0827f176c311-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b0e2762f-41e9-48e0-aa45-0827f176c311" (UID: "b0e2762f-41e9-48e0-aa45-0827f176c311"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:03:42 crc kubenswrapper[4669]: I1008 21:03:42.716728 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0e2762f-41e9-48e0-aa45-0827f176c311-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b0e2762f-41e9-48e0-aa45-0827f176c311" (UID: "b0e2762f-41e9-48e0-aa45-0827f176c311"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:03:42 crc kubenswrapper[4669]: I1008 21:03:42.732100 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0e2762f-41e9-48e0-aa45-0827f176c311-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b0e2762f-41e9-48e0-aa45-0827f176c311" (UID: "b0e2762f-41e9-48e0-aa45-0827f176c311"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:03:42 crc kubenswrapper[4669]: I1008 21:03:42.759798 4669 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b0e2762f-41e9-48e0-aa45-0827f176c311-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:42 crc kubenswrapper[4669]: I1008 21:03:42.759842 4669 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0e2762f-41e9-48e0-aa45-0827f176c311-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:42 crc kubenswrapper[4669]: I1008 21:03:42.759852 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0e2762f-41e9-48e0-aa45-0827f176c311-config\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:42 crc kubenswrapper[4669]: I1008 21:03:42.759860 4669 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b0e2762f-41e9-48e0-aa45-0827f176c311-ovsdbserver-sb\") on node \"crc\" 
DevicePath \"\"" Oct 08 21:03:42 crc kubenswrapper[4669]: I1008 21:03:42.759869 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87kq6\" (UniqueName: \"kubernetes.io/projected/b0e2762f-41e9-48e0-aa45-0827f176c311-kube-api-access-87kq6\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:42 crc kubenswrapper[4669]: I1008 21:03:42.759881 4669 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0e2762f-41e9-48e0-aa45-0827f176c311-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:42 crc kubenswrapper[4669]: I1008 21:03:42.924754 4669 generic.go:334] "Generic (PLEG): container finished" podID="b0e2762f-41e9-48e0-aa45-0827f176c311" containerID="f1cbd5195b48c4af0ade1df1306ba404182668d34f81fe72cca2d900d63cff75" exitCode=0 Oct 08 21:03:42 crc kubenswrapper[4669]: I1008 21:03:42.926107 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-mvqwc" Oct 08 21:03:42 crc kubenswrapper[4669]: I1008 21:03:42.926669 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-mvqwc" event={"ID":"b0e2762f-41e9-48e0-aa45-0827f176c311","Type":"ContainerDied","Data":"f1cbd5195b48c4af0ade1df1306ba404182668d34f81fe72cca2d900d63cff75"} Oct 08 21:03:42 crc kubenswrapper[4669]: I1008 21:03:42.926726 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-mvqwc" event={"ID":"b0e2762f-41e9-48e0-aa45-0827f176c311","Type":"ContainerDied","Data":"79f80c4428b31e99cdedc2068a8bfa06dbe8550b7bde2eede660a1e3989a39d5"} Oct 08 21:03:42 crc kubenswrapper[4669]: I1008 21:03:42.926756 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 08 21:03:42 crc kubenswrapper[4669]: I1008 21:03:42.926947 4669 scope.go:117] "RemoveContainer" containerID="f1cbd5195b48c4af0ade1df1306ba404182668d34f81fe72cca2d900d63cff75" Oct 08 21:03:42 crc 
kubenswrapper[4669]: I1008 21:03:42.957604 4669 scope.go:117] "RemoveContainer" containerID="21d1f11f5a343c6701b32a406c30e8e15e5828d8d73781ff65c3a435f38aa818" Oct 08 21:03:42 crc kubenswrapper[4669]: I1008 21:03:42.980755 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-mvqwc"] Oct 08 21:03:42 crc kubenswrapper[4669]: I1008 21:03:42.987599 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-mvqwc"] Oct 08 21:03:43 crc kubenswrapper[4669]: I1008 21:03:43.000688 4669 scope.go:117] "RemoveContainer" containerID="f1cbd5195b48c4af0ade1df1306ba404182668d34f81fe72cca2d900d63cff75" Oct 08 21:03:43 crc kubenswrapper[4669]: E1008 21:03:43.001138 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1cbd5195b48c4af0ade1df1306ba404182668d34f81fe72cca2d900d63cff75\": container with ID starting with f1cbd5195b48c4af0ade1df1306ba404182668d34f81fe72cca2d900d63cff75 not found: ID does not exist" containerID="f1cbd5195b48c4af0ade1df1306ba404182668d34f81fe72cca2d900d63cff75" Oct 08 21:03:43 crc kubenswrapper[4669]: I1008 21:03:43.001166 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1cbd5195b48c4af0ade1df1306ba404182668d34f81fe72cca2d900d63cff75"} err="failed to get container status \"f1cbd5195b48c4af0ade1df1306ba404182668d34f81fe72cca2d900d63cff75\": rpc error: code = NotFound desc = could not find container \"f1cbd5195b48c4af0ade1df1306ba404182668d34f81fe72cca2d900d63cff75\": container with ID starting with f1cbd5195b48c4af0ade1df1306ba404182668d34f81fe72cca2d900d63cff75 not found: ID does not exist" Oct 08 21:03:43 crc kubenswrapper[4669]: I1008 21:03:43.001189 4669 scope.go:117] "RemoveContainer" containerID="21d1f11f5a343c6701b32a406c30e8e15e5828d8d73781ff65c3a435f38aa818" Oct 08 21:03:43 crc kubenswrapper[4669]: E1008 21:03:43.001709 4669 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21d1f11f5a343c6701b32a406c30e8e15e5828d8d73781ff65c3a435f38aa818\": container with ID starting with 21d1f11f5a343c6701b32a406c30e8e15e5828d8d73781ff65c3a435f38aa818 not found: ID does not exist" containerID="21d1f11f5a343c6701b32a406c30e8e15e5828d8d73781ff65c3a435f38aa818" Oct 08 21:03:43 crc kubenswrapper[4669]: I1008 21:03:43.001741 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21d1f11f5a343c6701b32a406c30e8e15e5828d8d73781ff65c3a435f38aa818"} err="failed to get container status \"21d1f11f5a343c6701b32a406c30e8e15e5828d8d73781ff65c3a435f38aa818\": rpc error: code = NotFound desc = could not find container \"21d1f11f5a343c6701b32a406c30e8e15e5828d8d73781ff65c3a435f38aa818\": container with ID starting with 21d1f11f5a343c6701b32a406c30e8e15e5828d8d73781ff65c3a435f38aa818 not found: ID does not exist" Oct 08 21:03:43 crc kubenswrapper[4669]: I1008 21:03:43.344881 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0e2762f-41e9-48e0-aa45-0827f176c311" path="/var/lib/kubelet/pods/b0e2762f-41e9-48e0-aa45-0827f176c311/volumes" Oct 08 21:03:45 crc kubenswrapper[4669]: I1008 21:03:45.956573 4669 generic.go:334] "Generic (PLEG): container finished" podID="7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c" containerID="abd9f109a524eacf76eb809d428126fda858bae591a1abadc18528b1791e5682" exitCode=0 Oct 08 21:03:45 crc kubenswrapper[4669]: I1008 21:03:45.956647 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-8p7zt" event={"ID":"7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c","Type":"ContainerDied","Data":"abd9f109a524eacf76eb809d428126fda858bae591a1abadc18528b1791e5682"} Oct 08 21:03:47 crc kubenswrapper[4669]: I1008 21:03:47.477924 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-8p7zt" Oct 08 21:03:47 crc kubenswrapper[4669]: I1008 21:03:47.564776 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c-scripts\") pod \"7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c\" (UID: \"7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c\") " Oct 08 21:03:47 crc kubenswrapper[4669]: I1008 21:03:47.564914 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwt9w\" (UniqueName: \"kubernetes.io/projected/7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c-kube-api-access-qwt9w\") pod \"7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c\" (UID: \"7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c\") " Oct 08 21:03:47 crc kubenswrapper[4669]: I1008 21:03:47.565114 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c-combined-ca-bundle\") pod \"7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c\" (UID: \"7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c\") " Oct 08 21:03:47 crc kubenswrapper[4669]: I1008 21:03:47.565288 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c-config-data\") pod \"7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c\" (UID: \"7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c\") " Oct 08 21:03:47 crc kubenswrapper[4669]: I1008 21:03:47.579901 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c-kube-api-access-qwt9w" (OuterVolumeSpecName: "kube-api-access-qwt9w") pod "7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c" (UID: "7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c"). InnerVolumeSpecName "kube-api-access-qwt9w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:03:47 crc kubenswrapper[4669]: I1008 21:03:47.579962 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c-scripts" (OuterVolumeSpecName: "scripts") pod "7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c" (UID: "7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:03:47 crc kubenswrapper[4669]: I1008 21:03:47.612814 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c-config-data" (OuterVolumeSpecName: "config-data") pod "7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c" (UID: "7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:03:47 crc kubenswrapper[4669]: I1008 21:03:47.613444 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c" (UID: "7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:03:47 crc kubenswrapper[4669]: I1008 21:03:47.667575 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwt9w\" (UniqueName: \"kubernetes.io/projected/7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c-kube-api-access-qwt9w\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:47 crc kubenswrapper[4669]: I1008 21:03:47.667835 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:47 crc kubenswrapper[4669]: I1008 21:03:47.667958 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:47 crc kubenswrapper[4669]: I1008 21:03:47.668045 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:47 crc kubenswrapper[4669]: I1008 21:03:47.981664 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-8p7zt" event={"ID":"7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c","Type":"ContainerDied","Data":"2cda689cfec5361b77388c8e873fc72deb3d4691295e4e5eb9cee5636a184631"} Oct 08 21:03:47 crc kubenswrapper[4669]: I1008 21:03:47.981701 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cda689cfec5361b77388c8e873fc72deb3d4691295e4e5eb9cee5636a184631" Oct 08 21:03:47 crc kubenswrapper[4669]: I1008 21:03:47.981767 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-8p7zt" Oct 08 21:03:48 crc kubenswrapper[4669]: I1008 21:03:48.265910 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 21:03:48 crc kubenswrapper[4669]: I1008 21:03:48.266202 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="b8283326-dc1d-4a16-b02e-c8d471367f25" containerName="nova-scheduler-scheduler" containerID="cri-o://c047c5d65e2e46f4521a2df058d6273cfa8a8570080c58dc0093afed8e9f50a7" gracePeriod=30 Oct 08 21:03:48 crc kubenswrapper[4669]: I1008 21:03:48.274627 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 08 21:03:48 crc kubenswrapper[4669]: I1008 21:03:48.275189 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d7364db5-5324-45dc-bba7-13c44cfc9efb" containerName="nova-api-log" containerID="cri-o://0ec7972fe277248b40f0ed52d6cc2c7c71f13e550188244f32acbdb5a0e6d98c" gracePeriod=30 Oct 08 21:03:48 crc kubenswrapper[4669]: I1008 21:03:48.275266 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d7364db5-5324-45dc-bba7-13c44cfc9efb" containerName="nova-api-api" containerID="cri-o://36c406151432dc5afede7f1fc652def04671faf609c4dc1d12a5a8a054a0b8d9" gracePeriod=30 Oct 08 21:03:48 crc kubenswrapper[4669]: I1008 21:03:48.299083 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 21:03:48 crc kubenswrapper[4669]: I1008 21:03:48.303914 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b4838748-5f6b-4ca4-a828-150c8213ce8e" containerName="nova-metadata-log" containerID="cri-o://307c4a31d0307311a14a41924910329c0a92f9d633afdfb4b3e46de6be1f5a0a" gracePeriod=30 Oct 08 21:03:48 crc kubenswrapper[4669]: I1008 21:03:48.304003 4669 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b4838748-5f6b-4ca4-a828-150c8213ce8e" containerName="nova-metadata-metadata" containerID="cri-o://8be69866a732ffea8191978e1828ed1cf02fae30b571748593020e5c32b89dce" gracePeriod=30 Oct 08 21:03:48 crc kubenswrapper[4669]: I1008 21:03:48.875571 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 08 21:03:48 crc kubenswrapper[4669]: I1008 21:03:48.990507 4669 generic.go:334] "Generic (PLEG): container finished" podID="d7364db5-5324-45dc-bba7-13c44cfc9efb" containerID="36c406151432dc5afede7f1fc652def04671faf609c4dc1d12a5a8a054a0b8d9" exitCode=0 Oct 08 21:03:48 crc kubenswrapper[4669]: I1008 21:03:48.990574 4669 generic.go:334] "Generic (PLEG): container finished" podID="d7364db5-5324-45dc-bba7-13c44cfc9efb" containerID="0ec7972fe277248b40f0ed52d6cc2c7c71f13e550188244f32acbdb5a0e6d98c" exitCode=143 Oct 08 21:03:48 crc kubenswrapper[4669]: I1008 21:03:48.990617 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d7364db5-5324-45dc-bba7-13c44cfc9efb","Type":"ContainerDied","Data":"36c406151432dc5afede7f1fc652def04671faf609c4dc1d12a5a8a054a0b8d9"} Oct 08 21:03:48 crc kubenswrapper[4669]: I1008 21:03:48.990643 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d7364db5-5324-45dc-bba7-13c44cfc9efb","Type":"ContainerDied","Data":"0ec7972fe277248b40f0ed52d6cc2c7c71f13e550188244f32acbdb5a0e6d98c"} Oct 08 21:03:48 crc kubenswrapper[4669]: I1008 21:03:48.990653 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d7364db5-5324-45dc-bba7-13c44cfc9efb","Type":"ContainerDied","Data":"81309adef2f566c8df769ccdd08b9db1015bc2dff39372755734748adf5bb1d3"} Oct 08 21:03:48 crc kubenswrapper[4669]: I1008 21:03:48.990667 4669 scope.go:117] "RemoveContainer" 
containerID="36c406151432dc5afede7f1fc652def04671faf609c4dc1d12a5a8a054a0b8d9" Oct 08 21:03:48 crc kubenswrapper[4669]: I1008 21:03:48.990766 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 08 21:03:48 crc kubenswrapper[4669]: I1008 21:03:48.991757 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7364db5-5324-45dc-bba7-13c44cfc9efb-internal-tls-certs\") pod \"d7364db5-5324-45dc-bba7-13c44cfc9efb\" (UID: \"d7364db5-5324-45dc-bba7-13c44cfc9efb\") " Oct 08 21:03:48 crc kubenswrapper[4669]: I1008 21:03:48.991891 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7364db5-5324-45dc-bba7-13c44cfc9efb-public-tls-certs\") pod \"d7364db5-5324-45dc-bba7-13c44cfc9efb\" (UID: \"d7364db5-5324-45dc-bba7-13c44cfc9efb\") " Oct 08 21:03:48 crc kubenswrapper[4669]: I1008 21:03:48.991955 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7364db5-5324-45dc-bba7-13c44cfc9efb-config-data\") pod \"d7364db5-5324-45dc-bba7-13c44cfc9efb\" (UID: \"d7364db5-5324-45dc-bba7-13c44cfc9efb\") " Oct 08 21:03:48 crc kubenswrapper[4669]: I1008 21:03:48.992086 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7364db5-5324-45dc-bba7-13c44cfc9efb-logs\") pod \"d7364db5-5324-45dc-bba7-13c44cfc9efb\" (UID: \"d7364db5-5324-45dc-bba7-13c44cfc9efb\") " Oct 08 21:03:48 crc kubenswrapper[4669]: I1008 21:03:48.992113 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6n88v\" (UniqueName: \"kubernetes.io/projected/d7364db5-5324-45dc-bba7-13c44cfc9efb-kube-api-access-6n88v\") pod \"d7364db5-5324-45dc-bba7-13c44cfc9efb\" (UID: 
\"d7364db5-5324-45dc-bba7-13c44cfc9efb\") " Oct 08 21:03:48 crc kubenswrapper[4669]: I1008 21:03:48.992135 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7364db5-5324-45dc-bba7-13c44cfc9efb-combined-ca-bundle\") pod \"d7364db5-5324-45dc-bba7-13c44cfc9efb\" (UID: \"d7364db5-5324-45dc-bba7-13c44cfc9efb\") " Oct 08 21:03:48 crc kubenswrapper[4669]: I1008 21:03:48.992485 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7364db5-5324-45dc-bba7-13c44cfc9efb-logs" (OuterVolumeSpecName: "logs") pod "d7364db5-5324-45dc-bba7-13c44cfc9efb" (UID: "d7364db5-5324-45dc-bba7-13c44cfc9efb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:03:48 crc kubenswrapper[4669]: I1008 21:03:48.994063 4669 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7364db5-5324-45dc-bba7-13c44cfc9efb-logs\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:48 crc kubenswrapper[4669]: I1008 21:03:48.995625 4669 generic.go:334] "Generic (PLEG): container finished" podID="b4838748-5f6b-4ca4-a828-150c8213ce8e" containerID="307c4a31d0307311a14a41924910329c0a92f9d633afdfb4b3e46de6be1f5a0a" exitCode=143 Oct 08 21:03:48 crc kubenswrapper[4669]: I1008 21:03:48.995688 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b4838748-5f6b-4ca4-a828-150c8213ce8e","Type":"ContainerDied","Data":"307c4a31d0307311a14a41924910329c0a92f9d633afdfb4b3e46de6be1f5a0a"} Oct 08 21:03:48 crc kubenswrapper[4669]: I1008 21:03:48.998065 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7364db5-5324-45dc-bba7-13c44cfc9efb-kube-api-access-6n88v" (OuterVolumeSpecName: "kube-api-access-6n88v") pod "d7364db5-5324-45dc-bba7-13c44cfc9efb" (UID: "d7364db5-5324-45dc-bba7-13c44cfc9efb"). 
InnerVolumeSpecName "kube-api-access-6n88v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.018079 4669 scope.go:117] "RemoveContainer" containerID="0ec7972fe277248b40f0ed52d6cc2c7c71f13e550188244f32acbdb5a0e6d98c" Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.025300 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7364db5-5324-45dc-bba7-13c44cfc9efb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d7364db5-5324-45dc-bba7-13c44cfc9efb" (UID: "d7364db5-5324-45dc-bba7-13c44cfc9efb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.035396 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7364db5-5324-45dc-bba7-13c44cfc9efb-config-data" (OuterVolumeSpecName: "config-data") pod "d7364db5-5324-45dc-bba7-13c44cfc9efb" (UID: "d7364db5-5324-45dc-bba7-13c44cfc9efb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.042607 4669 scope.go:117] "RemoveContainer" containerID="36c406151432dc5afede7f1fc652def04671faf609c4dc1d12a5a8a054a0b8d9" Oct 08 21:03:49 crc kubenswrapper[4669]: E1008 21:03:49.043045 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36c406151432dc5afede7f1fc652def04671faf609c4dc1d12a5a8a054a0b8d9\": container with ID starting with 36c406151432dc5afede7f1fc652def04671faf609c4dc1d12a5a8a054a0b8d9 not found: ID does not exist" containerID="36c406151432dc5afede7f1fc652def04671faf609c4dc1d12a5a8a054a0b8d9" Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.043086 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36c406151432dc5afede7f1fc652def04671faf609c4dc1d12a5a8a054a0b8d9"} err="failed to get container status \"36c406151432dc5afede7f1fc652def04671faf609c4dc1d12a5a8a054a0b8d9\": rpc error: code = NotFound desc = could not find container \"36c406151432dc5afede7f1fc652def04671faf609c4dc1d12a5a8a054a0b8d9\": container with ID starting with 36c406151432dc5afede7f1fc652def04671faf609c4dc1d12a5a8a054a0b8d9 not found: ID does not exist" Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.043108 4669 scope.go:117] "RemoveContainer" containerID="0ec7972fe277248b40f0ed52d6cc2c7c71f13e550188244f32acbdb5a0e6d98c" Oct 08 21:03:49 crc kubenswrapper[4669]: E1008 21:03:49.043402 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ec7972fe277248b40f0ed52d6cc2c7c71f13e550188244f32acbdb5a0e6d98c\": container with ID starting with 0ec7972fe277248b40f0ed52d6cc2c7c71f13e550188244f32acbdb5a0e6d98c not found: ID does not exist" containerID="0ec7972fe277248b40f0ed52d6cc2c7c71f13e550188244f32acbdb5a0e6d98c" Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.043426 
4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ec7972fe277248b40f0ed52d6cc2c7c71f13e550188244f32acbdb5a0e6d98c"} err="failed to get container status \"0ec7972fe277248b40f0ed52d6cc2c7c71f13e550188244f32acbdb5a0e6d98c\": rpc error: code = NotFound desc = could not find container \"0ec7972fe277248b40f0ed52d6cc2c7c71f13e550188244f32acbdb5a0e6d98c\": container with ID starting with 0ec7972fe277248b40f0ed52d6cc2c7c71f13e550188244f32acbdb5a0e6d98c not found: ID does not exist" Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.043439 4669 scope.go:117] "RemoveContainer" containerID="36c406151432dc5afede7f1fc652def04671faf609c4dc1d12a5a8a054a0b8d9" Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.043730 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36c406151432dc5afede7f1fc652def04671faf609c4dc1d12a5a8a054a0b8d9"} err="failed to get container status \"36c406151432dc5afede7f1fc652def04671faf609c4dc1d12a5a8a054a0b8d9\": rpc error: code = NotFound desc = could not find container \"36c406151432dc5afede7f1fc652def04671faf609c4dc1d12a5a8a054a0b8d9\": container with ID starting with 36c406151432dc5afede7f1fc652def04671faf609c4dc1d12a5a8a054a0b8d9 not found: ID does not exist" Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.043750 4669 scope.go:117] "RemoveContainer" containerID="0ec7972fe277248b40f0ed52d6cc2c7c71f13e550188244f32acbdb5a0e6d98c" Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.043964 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ec7972fe277248b40f0ed52d6cc2c7c71f13e550188244f32acbdb5a0e6d98c"} err="failed to get container status \"0ec7972fe277248b40f0ed52d6cc2c7c71f13e550188244f32acbdb5a0e6d98c\": rpc error: code = NotFound desc = could not find container \"0ec7972fe277248b40f0ed52d6cc2c7c71f13e550188244f32acbdb5a0e6d98c\": container with ID starting with 
0ec7972fe277248b40f0ed52d6cc2c7c71f13e550188244f32acbdb5a0e6d98c not found: ID does not exist" Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.062856 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7364db5-5324-45dc-bba7-13c44cfc9efb-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d7364db5-5324-45dc-bba7-13c44cfc9efb" (UID: "d7364db5-5324-45dc-bba7-13c44cfc9efb"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.063103 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7364db5-5324-45dc-bba7-13c44cfc9efb-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d7364db5-5324-45dc-bba7-13c44cfc9efb" (UID: "d7364db5-5324-45dc-bba7-13c44cfc9efb"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.096245 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6n88v\" (UniqueName: \"kubernetes.io/projected/d7364db5-5324-45dc-bba7-13c44cfc9efb-kube-api-access-6n88v\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.096288 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7364db5-5324-45dc-bba7-13c44cfc9efb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.096301 4669 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7364db5-5324-45dc-bba7-13c44cfc9efb-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.096312 4669 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d7364db5-5324-45dc-bba7-13c44cfc9efb-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.096322 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7364db5-5324-45dc-bba7-13c44cfc9efb-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.325695 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.342988 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.354190 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 08 21:03:49 crc kubenswrapper[4669]: E1008 21:03:49.354693 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7364db5-5324-45dc-bba7-13c44cfc9efb" containerName="nova-api-log" Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.354716 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7364db5-5324-45dc-bba7-13c44cfc9efb" containerName="nova-api-log" Oct 08 21:03:49 crc kubenswrapper[4669]: E1008 21:03:49.354726 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c" containerName="nova-manage" Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.354733 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c" containerName="nova-manage" Oct 08 21:03:49 crc kubenswrapper[4669]: E1008 21:03:49.354756 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7364db5-5324-45dc-bba7-13c44cfc9efb" containerName="nova-api-api" Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.354764 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7364db5-5324-45dc-bba7-13c44cfc9efb" containerName="nova-api-api" Oct 08 21:03:49 crc 
kubenswrapper[4669]: E1008 21:03:49.354776 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0e2762f-41e9-48e0-aa45-0827f176c311" containerName="dnsmasq-dns" Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.354785 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0e2762f-41e9-48e0-aa45-0827f176c311" containerName="dnsmasq-dns" Oct 08 21:03:49 crc kubenswrapper[4669]: E1008 21:03:49.354806 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0e2762f-41e9-48e0-aa45-0827f176c311" containerName="init" Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.354813 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0e2762f-41e9-48e0-aa45-0827f176c311" containerName="init" Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.355037 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c" containerName="nova-manage" Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.355071 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7364db5-5324-45dc-bba7-13c44cfc9efb" containerName="nova-api-api" Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.355082 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7364db5-5324-45dc-bba7-13c44cfc9efb" containerName="nova-api-log" Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.355095 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0e2762f-41e9-48e0-aa45-0827f176c311" containerName="dnsmasq-dns" Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.356436 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.359679 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.359838 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.360298 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.364654 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.400931 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/26677aac-fbac-4ec9-972c-e22c276549f2-internal-tls-certs\") pod \"nova-api-0\" (UID: \"26677aac-fbac-4ec9-972c-e22c276549f2\") " pod="openstack/nova-api-0" Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.401309 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/26677aac-fbac-4ec9-972c-e22c276549f2-public-tls-certs\") pod \"nova-api-0\" (UID: \"26677aac-fbac-4ec9-972c-e22c276549f2\") " pod="openstack/nova-api-0" Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.401357 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvp5r\" (UniqueName: \"kubernetes.io/projected/26677aac-fbac-4ec9-972c-e22c276549f2-kube-api-access-jvp5r\") pod \"nova-api-0\" (UID: \"26677aac-fbac-4ec9-972c-e22c276549f2\") " pod="openstack/nova-api-0" Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.401437 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/26677aac-fbac-4ec9-972c-e22c276549f2-logs\") pod \"nova-api-0\" (UID: \"26677aac-fbac-4ec9-972c-e22c276549f2\") " pod="openstack/nova-api-0" Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.401555 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26677aac-fbac-4ec9-972c-e22c276549f2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"26677aac-fbac-4ec9-972c-e22c276549f2\") " pod="openstack/nova-api-0" Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.401625 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26677aac-fbac-4ec9-972c-e22c276549f2-config-data\") pod \"nova-api-0\" (UID: \"26677aac-fbac-4ec9-972c-e22c276549f2\") " pod="openstack/nova-api-0" Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.503695 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26677aac-fbac-4ec9-972c-e22c276549f2-config-data\") pod \"nova-api-0\" (UID: \"26677aac-fbac-4ec9-972c-e22c276549f2\") " pod="openstack/nova-api-0" Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.503841 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/26677aac-fbac-4ec9-972c-e22c276549f2-internal-tls-certs\") pod \"nova-api-0\" (UID: \"26677aac-fbac-4ec9-972c-e22c276549f2\") " pod="openstack/nova-api-0" Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.503930 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/26677aac-fbac-4ec9-972c-e22c276549f2-public-tls-certs\") pod \"nova-api-0\" (UID: \"26677aac-fbac-4ec9-972c-e22c276549f2\") " pod="openstack/nova-api-0" Oct 08 
21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.503963 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvp5r\" (UniqueName: \"kubernetes.io/projected/26677aac-fbac-4ec9-972c-e22c276549f2-kube-api-access-jvp5r\") pod \"nova-api-0\" (UID: \"26677aac-fbac-4ec9-972c-e22c276549f2\") " pod="openstack/nova-api-0" Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.503997 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26677aac-fbac-4ec9-972c-e22c276549f2-logs\") pod \"nova-api-0\" (UID: \"26677aac-fbac-4ec9-972c-e22c276549f2\") " pod="openstack/nova-api-0" Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.504034 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26677aac-fbac-4ec9-972c-e22c276549f2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"26677aac-fbac-4ec9-972c-e22c276549f2\") " pod="openstack/nova-api-0" Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.504618 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26677aac-fbac-4ec9-972c-e22c276549f2-logs\") pod \"nova-api-0\" (UID: \"26677aac-fbac-4ec9-972c-e22c276549f2\") " pod="openstack/nova-api-0" Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.520161 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26677aac-fbac-4ec9-972c-e22c276549f2-config-data\") pod \"nova-api-0\" (UID: \"26677aac-fbac-4ec9-972c-e22c276549f2\") " pod="openstack/nova-api-0" Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.534087 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26677aac-fbac-4ec9-972c-e22c276549f2-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"26677aac-fbac-4ec9-972c-e22c276549f2\") " pod="openstack/nova-api-0" Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.538020 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvp5r\" (UniqueName: \"kubernetes.io/projected/26677aac-fbac-4ec9-972c-e22c276549f2-kube-api-access-jvp5r\") pod \"nova-api-0\" (UID: \"26677aac-fbac-4ec9-972c-e22c276549f2\") " pod="openstack/nova-api-0" Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.538093 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/26677aac-fbac-4ec9-972c-e22c276549f2-internal-tls-certs\") pod \"nova-api-0\" (UID: \"26677aac-fbac-4ec9-972c-e22c276549f2\") " pod="openstack/nova-api-0" Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.538393 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/26677aac-fbac-4ec9-972c-e22c276549f2-public-tls-certs\") pod \"nova-api-0\" (UID: \"26677aac-fbac-4ec9-972c-e22c276549f2\") " pod="openstack/nova-api-0" Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.673945 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 08 21:03:49 crc kubenswrapper[4669]: I1008 21:03:49.987084 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 21:03:50 crc kubenswrapper[4669]: I1008 21:03:50.011643 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8283326-dc1d-4a16-b02e-c8d471367f25-combined-ca-bundle\") pod \"b8283326-dc1d-4a16-b02e-c8d471367f25\" (UID: \"b8283326-dc1d-4a16-b02e-c8d471367f25\") " Oct 08 21:03:50 crc kubenswrapper[4669]: I1008 21:03:50.011757 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8283326-dc1d-4a16-b02e-c8d471367f25-config-data\") pod \"b8283326-dc1d-4a16-b02e-c8d471367f25\" (UID: \"b8283326-dc1d-4a16-b02e-c8d471367f25\") " Oct 08 21:03:50 crc kubenswrapper[4669]: I1008 21:03:50.011802 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tt9q8\" (UniqueName: \"kubernetes.io/projected/b8283326-dc1d-4a16-b02e-c8d471367f25-kube-api-access-tt9q8\") pod \"b8283326-dc1d-4a16-b02e-c8d471367f25\" (UID: \"b8283326-dc1d-4a16-b02e-c8d471367f25\") " Oct 08 21:03:50 crc kubenswrapper[4669]: I1008 21:03:50.018573 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8283326-dc1d-4a16-b02e-c8d471367f25-kube-api-access-tt9q8" (OuterVolumeSpecName: "kube-api-access-tt9q8") pod "b8283326-dc1d-4a16-b02e-c8d471367f25" (UID: "b8283326-dc1d-4a16-b02e-c8d471367f25"). InnerVolumeSpecName "kube-api-access-tt9q8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:03:50 crc kubenswrapper[4669]: I1008 21:03:50.020892 4669 generic.go:334] "Generic (PLEG): container finished" podID="b8283326-dc1d-4a16-b02e-c8d471367f25" containerID="c047c5d65e2e46f4521a2df058d6273cfa8a8570080c58dc0093afed8e9f50a7" exitCode=0 Oct 08 21:03:50 crc kubenswrapper[4669]: I1008 21:03:50.020954 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b8283326-dc1d-4a16-b02e-c8d471367f25","Type":"ContainerDied","Data":"c047c5d65e2e46f4521a2df058d6273cfa8a8570080c58dc0093afed8e9f50a7"} Oct 08 21:03:50 crc kubenswrapper[4669]: I1008 21:03:50.020982 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b8283326-dc1d-4a16-b02e-c8d471367f25","Type":"ContainerDied","Data":"0032bc649463905bf12b14fd888a7ea6b632d173e503e6e9816d6ff90920fb18"} Oct 08 21:03:50 crc kubenswrapper[4669]: I1008 21:03:50.020999 4669 scope.go:117] "RemoveContainer" containerID="c047c5d65e2e46f4521a2df058d6273cfa8a8570080c58dc0093afed8e9f50a7" Oct 08 21:03:50 crc kubenswrapper[4669]: I1008 21:03:50.021089 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 21:03:50 crc kubenswrapper[4669]: I1008 21:03:50.041734 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8283326-dc1d-4a16-b02e-c8d471367f25-config-data" (OuterVolumeSpecName: "config-data") pod "b8283326-dc1d-4a16-b02e-c8d471367f25" (UID: "b8283326-dc1d-4a16-b02e-c8d471367f25"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:03:50 crc kubenswrapper[4669]: I1008 21:03:50.049776 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8283326-dc1d-4a16-b02e-c8d471367f25-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b8283326-dc1d-4a16-b02e-c8d471367f25" (UID: "b8283326-dc1d-4a16-b02e-c8d471367f25"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:03:50 crc kubenswrapper[4669]: I1008 21:03:50.050048 4669 scope.go:117] "RemoveContainer" containerID="c047c5d65e2e46f4521a2df058d6273cfa8a8570080c58dc0093afed8e9f50a7" Oct 08 21:03:50 crc kubenswrapper[4669]: E1008 21:03:50.050457 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c047c5d65e2e46f4521a2df058d6273cfa8a8570080c58dc0093afed8e9f50a7\": container with ID starting with c047c5d65e2e46f4521a2df058d6273cfa8a8570080c58dc0093afed8e9f50a7 not found: ID does not exist" containerID="c047c5d65e2e46f4521a2df058d6273cfa8a8570080c58dc0093afed8e9f50a7" Oct 08 21:03:50 crc kubenswrapper[4669]: I1008 21:03:50.050518 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c047c5d65e2e46f4521a2df058d6273cfa8a8570080c58dc0093afed8e9f50a7"} err="failed to get container status \"c047c5d65e2e46f4521a2df058d6273cfa8a8570080c58dc0093afed8e9f50a7\": rpc error: code = NotFound desc = could not find container \"c047c5d65e2e46f4521a2df058d6273cfa8a8570080c58dc0093afed8e9f50a7\": container with ID starting with c047c5d65e2e46f4521a2df058d6273cfa8a8570080c58dc0093afed8e9f50a7 not found: ID does not exist" Oct 08 21:03:50 crc kubenswrapper[4669]: I1008 21:03:50.114025 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8283326-dc1d-4a16-b02e-c8d471367f25-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Oct 08 21:03:50 crc kubenswrapper[4669]: I1008 21:03:50.114062 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8283326-dc1d-4a16-b02e-c8d471367f25-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:50 crc kubenswrapper[4669]: I1008 21:03:50.114071 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tt9q8\" (UniqueName: \"kubernetes.io/projected/b8283326-dc1d-4a16-b02e-c8d471367f25-kube-api-access-tt9q8\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:50 crc kubenswrapper[4669]: I1008 21:03:50.156664 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 08 21:03:50 crc kubenswrapper[4669]: I1008 21:03:50.358604 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 21:03:50 crc kubenswrapper[4669]: I1008 21:03:50.367952 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 21:03:50 crc kubenswrapper[4669]: I1008 21:03:50.385672 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 21:03:50 crc kubenswrapper[4669]: E1008 21:03:50.386116 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8283326-dc1d-4a16-b02e-c8d471367f25" containerName="nova-scheduler-scheduler" Oct 08 21:03:50 crc kubenswrapper[4669]: I1008 21:03:50.386133 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8283326-dc1d-4a16-b02e-c8d471367f25" containerName="nova-scheduler-scheduler" Oct 08 21:03:50 crc kubenswrapper[4669]: I1008 21:03:50.386278 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8283326-dc1d-4a16-b02e-c8d471367f25" containerName="nova-scheduler-scheduler" Oct 08 21:03:50 crc kubenswrapper[4669]: I1008 21:03:50.386905 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 21:03:50 crc kubenswrapper[4669]: I1008 21:03:50.389999 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 08 21:03:50 crc kubenswrapper[4669]: I1008 21:03:50.395256 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 21:03:50 crc kubenswrapper[4669]: I1008 21:03:50.522280 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89cdc120-36f5-4203-b064-4300a8249a64-config-data\") pod \"nova-scheduler-0\" (UID: \"89cdc120-36f5-4203-b064-4300a8249a64\") " pod="openstack/nova-scheduler-0" Oct 08 21:03:50 crc kubenswrapper[4669]: I1008 21:03:50.522353 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktvmj\" (UniqueName: \"kubernetes.io/projected/89cdc120-36f5-4203-b064-4300a8249a64-kube-api-access-ktvmj\") pod \"nova-scheduler-0\" (UID: \"89cdc120-36f5-4203-b064-4300a8249a64\") " pod="openstack/nova-scheduler-0" Oct 08 21:03:50 crc kubenswrapper[4669]: I1008 21:03:50.522541 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89cdc120-36f5-4203-b064-4300a8249a64-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"89cdc120-36f5-4203-b064-4300a8249a64\") " pod="openstack/nova-scheduler-0" Oct 08 21:03:50 crc kubenswrapper[4669]: I1008 21:03:50.624817 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89cdc120-36f5-4203-b064-4300a8249a64-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"89cdc120-36f5-4203-b064-4300a8249a64\") " pod="openstack/nova-scheduler-0" Oct 08 21:03:50 crc kubenswrapper[4669]: I1008 21:03:50.624992 4669 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89cdc120-36f5-4203-b064-4300a8249a64-config-data\") pod \"nova-scheduler-0\" (UID: \"89cdc120-36f5-4203-b064-4300a8249a64\") " pod="openstack/nova-scheduler-0" Oct 08 21:03:50 crc kubenswrapper[4669]: I1008 21:03:50.625033 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktvmj\" (UniqueName: \"kubernetes.io/projected/89cdc120-36f5-4203-b064-4300a8249a64-kube-api-access-ktvmj\") pod \"nova-scheduler-0\" (UID: \"89cdc120-36f5-4203-b064-4300a8249a64\") " pod="openstack/nova-scheduler-0" Oct 08 21:03:50 crc kubenswrapper[4669]: I1008 21:03:50.629750 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89cdc120-36f5-4203-b064-4300a8249a64-config-data\") pod \"nova-scheduler-0\" (UID: \"89cdc120-36f5-4203-b064-4300a8249a64\") " pod="openstack/nova-scheduler-0" Oct 08 21:03:50 crc kubenswrapper[4669]: I1008 21:03:50.630127 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89cdc120-36f5-4203-b064-4300a8249a64-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"89cdc120-36f5-4203-b064-4300a8249a64\") " pod="openstack/nova-scheduler-0" Oct 08 21:03:50 crc kubenswrapper[4669]: I1008 21:03:50.640730 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktvmj\" (UniqueName: \"kubernetes.io/projected/89cdc120-36f5-4203-b064-4300a8249a64-kube-api-access-ktvmj\") pod \"nova-scheduler-0\" (UID: \"89cdc120-36f5-4203-b064-4300a8249a64\") " pod="openstack/nova-scheduler-0" Oct 08 21:03:50 crc kubenswrapper[4669]: I1008 21:03:50.811569 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 21:03:51 crc kubenswrapper[4669]: I1008 21:03:51.036614 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"26677aac-fbac-4ec9-972c-e22c276549f2","Type":"ContainerStarted","Data":"e5a61bdecd5e1a47c8315fae87fdaaeb1f4799dde3624a53436ea2d3cc34dd9a"} Oct 08 21:03:51 crc kubenswrapper[4669]: I1008 21:03:51.036662 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"26677aac-fbac-4ec9-972c-e22c276549f2","Type":"ContainerStarted","Data":"f49492a00c3d84b8cd526457339abfd8bf357143adcac69d414d0661e5a983cc"} Oct 08 21:03:51 crc kubenswrapper[4669]: I1008 21:03:51.036675 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"26677aac-fbac-4ec9-972c-e22c276549f2","Type":"ContainerStarted","Data":"b61edab097275a5e1b9b5fdc4615d4e151a298870fc715c97f01117943f6a102"} Oct 08 21:03:51 crc kubenswrapper[4669]: I1008 21:03:51.067467 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.067442934 podStartE2EDuration="2.067442934s" podCreationTimestamp="2025-10-08 21:03:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:03:51.057427311 +0000 UTC m=+1150.750237984" watchObservedRunningTime="2025-10-08 21:03:51.067442934 +0000 UTC m=+1150.760253617" Oct 08 21:03:51 crc kubenswrapper[4669]: I1008 21:03:51.278955 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 21:03:51 crc kubenswrapper[4669]: I1008 21:03:51.348831 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8283326-dc1d-4a16-b02e-c8d471367f25" path="/var/lib/kubelet/pods/b8283326-dc1d-4a16-b02e-c8d471367f25/volumes" Oct 08 21:03:51 crc kubenswrapper[4669]: I1008 21:03:51.353313 4669 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="d7364db5-5324-45dc-bba7-13c44cfc9efb" path="/var/lib/kubelet/pods/d7364db5-5324-45dc-bba7-13c44cfc9efb/volumes" Oct 08 21:03:51 crc kubenswrapper[4669]: I1008 21:03:51.445254 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="b4838748-5f6b-4ca4-a828-150c8213ce8e" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.196:8775/\": read tcp 10.217.0.2:35040->10.217.0.196:8775: read: connection reset by peer" Oct 08 21:03:51 crc kubenswrapper[4669]: I1008 21:03:51.445246 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="b4838748-5f6b-4ca4-a828-150c8213ce8e" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.196:8775/\": read tcp 10.217.0.2:35038->10.217.0.196:8775: read: connection reset by peer" Oct 08 21:03:51 crc kubenswrapper[4669]: I1008 21:03:51.891254 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.049215 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4838748-5f6b-4ca4-a828-150c8213ce8e-logs\") pod \"b4838748-5f6b-4ca4-a828-150c8213ce8e\" (UID: \"b4838748-5f6b-4ca4-a828-150c8213ce8e\") " Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.049292 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4838748-5f6b-4ca4-a828-150c8213ce8e-config-data\") pod \"b4838748-5f6b-4ca4-a828-150c8213ce8e\" (UID: \"b4838748-5f6b-4ca4-a828-150c8213ce8e\") " Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.049380 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4838748-5f6b-4ca4-a828-150c8213ce8e-combined-ca-bundle\") pod \"b4838748-5f6b-4ca4-a828-150c8213ce8e\" (UID: \"b4838748-5f6b-4ca4-a828-150c8213ce8e\") " Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.049473 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lccd5\" (UniqueName: \"kubernetes.io/projected/b4838748-5f6b-4ca4-a828-150c8213ce8e-kube-api-access-lccd5\") pod \"b4838748-5f6b-4ca4-a828-150c8213ce8e\" (UID: \"b4838748-5f6b-4ca4-a828-150c8213ce8e\") " Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.049636 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4838748-5f6b-4ca4-a828-150c8213ce8e-nova-metadata-tls-certs\") pod \"b4838748-5f6b-4ca4-a828-150c8213ce8e\" (UID: \"b4838748-5f6b-4ca4-a828-150c8213ce8e\") " Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.049793 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/b4838748-5f6b-4ca4-a828-150c8213ce8e-logs" (OuterVolumeSpecName: "logs") pod "b4838748-5f6b-4ca4-a828-150c8213ce8e" (UID: "b4838748-5f6b-4ca4-a828-150c8213ce8e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.050440 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"89cdc120-36f5-4203-b064-4300a8249a64","Type":"ContainerStarted","Data":"07cba9845f169cc432ed6aef43735779a56bc187f44ac335225fe47dd576d727"} Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.050480 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"89cdc120-36f5-4203-b064-4300a8249a64","Type":"ContainerStarted","Data":"aa8ad24322df8f8ffc8773b0d64fa04622834103aa32f5c994e3e6a16177fc94"} Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.055440 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4838748-5f6b-4ca4-a828-150c8213ce8e-kube-api-access-lccd5" (OuterVolumeSpecName: "kube-api-access-lccd5") pod "b4838748-5f6b-4ca4-a828-150c8213ce8e" (UID: "b4838748-5f6b-4ca4-a828-150c8213ce8e"). InnerVolumeSpecName "kube-api-access-lccd5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.060482 4669 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4838748-5f6b-4ca4-a828-150c8213ce8e-logs\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.060538 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lccd5\" (UniqueName: \"kubernetes.io/projected/b4838748-5f6b-4ca4-a828-150c8213ce8e-kube-api-access-lccd5\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.061978 4669 generic.go:334] "Generic (PLEG): container finished" podID="b4838748-5f6b-4ca4-a828-150c8213ce8e" containerID="8be69866a732ffea8191978e1828ed1cf02fae30b571748593020e5c32b89dce" exitCode=0 Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.062011 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.062037 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b4838748-5f6b-4ca4-a828-150c8213ce8e","Type":"ContainerDied","Data":"8be69866a732ffea8191978e1828ed1cf02fae30b571748593020e5c32b89dce"} Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.062069 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b4838748-5f6b-4ca4-a828-150c8213ce8e","Type":"ContainerDied","Data":"0c5f35b0706309ac6ac71f67cc6e94012eedef7b880accb9a0a36d1d2d4210b1"} Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.062088 4669 scope.go:117] "RemoveContainer" containerID="8be69866a732ffea8191978e1828ed1cf02fae30b571748593020e5c32b89dce" Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.073710 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.073686973 
podStartE2EDuration="2.073686973s" podCreationTimestamp="2025-10-08 21:03:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:03:52.06588942 +0000 UTC m=+1151.758700093" watchObservedRunningTime="2025-10-08 21:03:52.073686973 +0000 UTC m=+1151.766497646" Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.086670 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4838748-5f6b-4ca4-a828-150c8213ce8e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b4838748-5f6b-4ca4-a828-150c8213ce8e" (UID: "b4838748-5f6b-4ca4-a828-150c8213ce8e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.087069 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4838748-5f6b-4ca4-a828-150c8213ce8e-config-data" (OuterVolumeSpecName: "config-data") pod "b4838748-5f6b-4ca4-a828-150c8213ce8e" (UID: "b4838748-5f6b-4ca4-a828-150c8213ce8e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.124159 4669 scope.go:117] "RemoveContainer" containerID="307c4a31d0307311a14a41924910329c0a92f9d633afdfb4b3e46de6be1f5a0a" Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.126915 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4838748-5f6b-4ca4-a828-150c8213ce8e-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "b4838748-5f6b-4ca4-a828-150c8213ce8e" (UID: "b4838748-5f6b-4ca4-a828-150c8213ce8e"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.154265 4669 scope.go:117] "RemoveContainer" containerID="8be69866a732ffea8191978e1828ed1cf02fae30b571748593020e5c32b89dce" Oct 08 21:03:52 crc kubenswrapper[4669]: E1008 21:03:52.164169 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8be69866a732ffea8191978e1828ed1cf02fae30b571748593020e5c32b89dce\": container with ID starting with 8be69866a732ffea8191978e1828ed1cf02fae30b571748593020e5c32b89dce not found: ID does not exist" containerID="8be69866a732ffea8191978e1828ed1cf02fae30b571748593020e5c32b89dce" Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.164436 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8be69866a732ffea8191978e1828ed1cf02fae30b571748593020e5c32b89dce"} err="failed to get container status \"8be69866a732ffea8191978e1828ed1cf02fae30b571748593020e5c32b89dce\": rpc error: code = NotFound desc = could not find container \"8be69866a732ffea8191978e1828ed1cf02fae30b571748593020e5c32b89dce\": container with ID starting with 8be69866a732ffea8191978e1828ed1cf02fae30b571748593020e5c32b89dce not found: ID does not exist" Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.164557 4669 scope.go:117] "RemoveContainer" containerID="307c4a31d0307311a14a41924910329c0a92f9d633afdfb4b3e46de6be1f5a0a" Oct 08 21:03:52 crc kubenswrapper[4669]: E1008 21:03:52.165428 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"307c4a31d0307311a14a41924910329c0a92f9d633afdfb4b3e46de6be1f5a0a\": container with ID starting with 307c4a31d0307311a14a41924910329c0a92f9d633afdfb4b3e46de6be1f5a0a not found: ID does not exist" containerID="307c4a31d0307311a14a41924910329c0a92f9d633afdfb4b3e46de6be1f5a0a" Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.165485 
4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"307c4a31d0307311a14a41924910329c0a92f9d633afdfb4b3e46de6be1f5a0a"} err="failed to get container status \"307c4a31d0307311a14a41924910329c0a92f9d633afdfb4b3e46de6be1f5a0a\": rpc error: code = NotFound desc = could not find container \"307c4a31d0307311a14a41924910329c0a92f9d633afdfb4b3e46de6be1f5a0a\": container with ID starting with 307c4a31d0307311a14a41924910329c0a92f9d633afdfb4b3e46de6be1f5a0a not found: ID does not exist" Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.165885 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4838748-5f6b-4ca4-a828-150c8213ce8e-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.165943 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4838748-5f6b-4ca4-a828-150c8213ce8e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.165961 4669 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4838748-5f6b-4ca4-a828-150c8213ce8e-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.466032 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.478704 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.490210 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 08 21:03:52 crc kubenswrapper[4669]: E1008 21:03:52.490757 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4838748-5f6b-4ca4-a828-150c8213ce8e" 
containerName="nova-metadata-log" Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.490779 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4838748-5f6b-4ca4-a828-150c8213ce8e" containerName="nova-metadata-log" Oct 08 21:03:52 crc kubenswrapper[4669]: E1008 21:03:52.490804 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4838748-5f6b-4ca4-a828-150c8213ce8e" containerName="nova-metadata-metadata" Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.490812 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4838748-5f6b-4ca4-a828-150c8213ce8e" containerName="nova-metadata-metadata" Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.491037 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4838748-5f6b-4ca4-a828-150c8213ce8e" containerName="nova-metadata-log" Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.491068 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4838748-5f6b-4ca4-a828-150c8213ce8e" containerName="nova-metadata-metadata" Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.492266 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.499695 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.541540 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.541712 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.587774 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f57d146-c85b-4beb-a614-1ea878e175b4-logs\") pod \"nova-metadata-0\" (UID: \"0f57d146-c85b-4beb-a614-1ea878e175b4\") " pod="openstack/nova-metadata-0" Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.588106 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f57d146-c85b-4beb-a614-1ea878e175b4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0f57d146-c85b-4beb-a614-1ea878e175b4\") " pod="openstack/nova-metadata-0" Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.588335 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5h7p\" (UniqueName: \"kubernetes.io/projected/0f57d146-c85b-4beb-a614-1ea878e175b4-kube-api-access-q5h7p\") pod \"nova-metadata-0\" (UID: \"0f57d146-c85b-4beb-a614-1ea878e175b4\") " pod="openstack/nova-metadata-0" Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.588411 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f57d146-c85b-4beb-a614-1ea878e175b4-config-data\") pod \"nova-metadata-0\" 
(UID: \"0f57d146-c85b-4beb-a614-1ea878e175b4\") " pod="openstack/nova-metadata-0" Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.588459 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f57d146-c85b-4beb-a614-1ea878e175b4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0f57d146-c85b-4beb-a614-1ea878e175b4\") " pod="openstack/nova-metadata-0" Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.689597 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5h7p\" (UniqueName: \"kubernetes.io/projected/0f57d146-c85b-4beb-a614-1ea878e175b4-kube-api-access-q5h7p\") pod \"nova-metadata-0\" (UID: \"0f57d146-c85b-4beb-a614-1ea878e175b4\") " pod="openstack/nova-metadata-0" Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.689891 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f57d146-c85b-4beb-a614-1ea878e175b4-config-data\") pod \"nova-metadata-0\" (UID: \"0f57d146-c85b-4beb-a614-1ea878e175b4\") " pod="openstack/nova-metadata-0" Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.690008 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f57d146-c85b-4beb-a614-1ea878e175b4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0f57d146-c85b-4beb-a614-1ea878e175b4\") " pod="openstack/nova-metadata-0" Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.690157 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f57d146-c85b-4beb-a614-1ea878e175b4-logs\") pod \"nova-metadata-0\" (UID: \"0f57d146-c85b-4beb-a614-1ea878e175b4\") " pod="openstack/nova-metadata-0" Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.690389 4669 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f57d146-c85b-4beb-a614-1ea878e175b4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0f57d146-c85b-4beb-a614-1ea878e175b4\") " pod="openstack/nova-metadata-0" Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.690788 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f57d146-c85b-4beb-a614-1ea878e175b4-logs\") pod \"nova-metadata-0\" (UID: \"0f57d146-c85b-4beb-a614-1ea878e175b4\") " pod="openstack/nova-metadata-0" Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.694442 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f57d146-c85b-4beb-a614-1ea878e175b4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0f57d146-c85b-4beb-a614-1ea878e175b4\") " pod="openstack/nova-metadata-0" Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.695049 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f57d146-c85b-4beb-a614-1ea878e175b4-config-data\") pod \"nova-metadata-0\" (UID: \"0f57d146-c85b-4beb-a614-1ea878e175b4\") " pod="openstack/nova-metadata-0" Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.705298 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5h7p\" (UniqueName: \"kubernetes.io/projected/0f57d146-c85b-4beb-a614-1ea878e175b4-kube-api-access-q5h7p\") pod \"nova-metadata-0\" (UID: \"0f57d146-c85b-4beb-a614-1ea878e175b4\") " pod="openstack/nova-metadata-0" Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.705337 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f57d146-c85b-4beb-a614-1ea878e175b4-nova-metadata-tls-certs\") pod 
\"nova-metadata-0\" (UID: \"0f57d146-c85b-4beb-a614-1ea878e175b4\") " pod="openstack/nova-metadata-0" Oct 08 21:03:52 crc kubenswrapper[4669]: I1008 21:03:52.860877 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 21:03:53 crc kubenswrapper[4669]: I1008 21:03:53.342627 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4838748-5f6b-4ca4-a828-150c8213ce8e" path="/var/lib/kubelet/pods/b4838748-5f6b-4ca4-a828-150c8213ce8e/volumes" Oct 08 21:03:53 crc kubenswrapper[4669]: I1008 21:03:53.383069 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 21:03:54 crc kubenswrapper[4669]: I1008 21:03:54.082238 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0f57d146-c85b-4beb-a614-1ea878e175b4","Type":"ContainerStarted","Data":"3e19bd811734f6baa8554bf6fda120efdbef3cec1d15b5483f3a0c0e9a909160"} Oct 08 21:03:54 crc kubenswrapper[4669]: I1008 21:03:54.082760 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0f57d146-c85b-4beb-a614-1ea878e175b4","Type":"ContainerStarted","Data":"fed9204a712ac5d61f6221bc3b93c811ad279efcfd224ac4e4618c3efa1919b6"} Oct 08 21:03:54 crc kubenswrapper[4669]: I1008 21:03:54.082825 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0f57d146-c85b-4beb-a614-1ea878e175b4","Type":"ContainerStarted","Data":"07c2eb7a4f2bcb872cd53dee816d73c5bdc9075ee77f6a0f9b57fd02336aff82"} Oct 08 21:03:54 crc kubenswrapper[4669]: I1008 21:03:54.120240 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.12022152 podStartE2EDuration="2.12022152s" podCreationTimestamp="2025-10-08 21:03:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-08 21:03:54.111331217 +0000 UTC m=+1153.804141890" watchObservedRunningTime="2025-10-08 21:03:54.12022152 +0000 UTC m=+1153.813032193" Oct 08 21:03:55 crc kubenswrapper[4669]: I1008 21:03:55.811905 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 08 21:03:57 crc kubenswrapper[4669]: I1008 21:03:57.861933 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 08 21:03:57 crc kubenswrapper[4669]: I1008 21:03:57.862338 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 08 21:03:59 crc kubenswrapper[4669]: I1008 21:03:59.674045 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 08 21:03:59 crc kubenswrapper[4669]: I1008 21:03:59.674091 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 08 21:04:00 crc kubenswrapper[4669]: I1008 21:04:00.686702 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="26677aac-fbac-4ec9-972c-e22c276549f2" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.206:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 08 21:04:00 crc kubenswrapper[4669]: I1008 21:04:00.686714 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="26677aac-fbac-4ec9-972c-e22c276549f2" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.206:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 08 21:04:00 crc kubenswrapper[4669]: I1008 21:04:00.812314 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 08 21:04:00 crc kubenswrapper[4669]: I1008 21:04:00.846018 4669 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 08 21:04:01 crc kubenswrapper[4669]: I1008 21:04:01.207989 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 08 21:04:02 crc kubenswrapper[4669]: I1008 21:04:02.862275 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 08 21:04:02 crc kubenswrapper[4669]: I1008 21:04:02.863231 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 08 21:04:03 crc kubenswrapper[4669]: I1008 21:04:03.877836 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="0f57d146-c85b-4beb-a614-1ea878e175b4" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.208:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 08 21:04:03 crc kubenswrapper[4669]: I1008 21:04:03.877836 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="0f57d146-c85b-4beb-a614-1ea878e175b4" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.208:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 08 21:04:06 crc kubenswrapper[4669]: I1008 21:04:06.530514 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 08 21:04:09 crc kubenswrapper[4669]: I1008 21:04:09.680413 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 08 21:04:09 crc kubenswrapper[4669]: I1008 21:04:09.681415 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 08 21:04:09 crc kubenswrapper[4669]: I1008 21:04:09.688053 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 08 21:04:09 
crc kubenswrapper[4669]: I1008 21:04:09.691314 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 08 21:04:10 crc kubenswrapper[4669]: I1008 21:04:10.257240 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 08 21:04:10 crc kubenswrapper[4669]: I1008 21:04:10.270964 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 08 21:04:12 crc kubenswrapper[4669]: I1008 21:04:12.871989 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 08 21:04:12 crc kubenswrapper[4669]: I1008 21:04:12.874010 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 08 21:04:12 crc kubenswrapper[4669]: I1008 21:04:12.887234 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 08 21:04:13 crc kubenswrapper[4669]: I1008 21:04:13.313663 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 08 21:04:21 crc kubenswrapper[4669]: I1008 21:04:21.507786 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 21:04:22 crc kubenswrapper[4669]: I1008 21:04:22.443124 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 08 21:04:25 crc kubenswrapper[4669]: I1008 21:04:25.601419 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="a63d3545-a64d-4c9a-9198-bf11fc782cc6" containerName="rabbitmq" containerID="cri-o://24c4c8a6872491732e4bea46f87649abfc6c7c74749c4b05fd8c0d0abecff3fb" gracePeriod=604796 Oct 08 21:04:26 crc kubenswrapper[4669]: I1008 21:04:26.064266 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" 
podUID="b4f648df-77f7-4480-8b46-3f776880db17" containerName="rabbitmq" containerID="cri-o://9fd8fb3e04505e8ce2e2875bca7667bbfaaef9a8494247534bd6d63df3691a48" gracePeriod=604797 Oct 08 21:04:28 crc kubenswrapper[4669]: I1008 21:04:28.144149 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="a63d3545-a64d-4c9a-9198-bf11fc782cc6" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.100:5671: connect: connection refused" Oct 08 21:04:28 crc kubenswrapper[4669]: I1008 21:04:28.455703 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="b4f648df-77f7-4480-8b46-3f776880db17" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.393817 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.501395 4669 generic.go:334] "Generic (PLEG): container finished" podID="b4f648df-77f7-4480-8b46-3f776880db17" containerID="9fd8fb3e04505e8ce2e2875bca7667bbfaaef9a8494247534bd6d63df3691a48" exitCode=0 Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.501473 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b4f648df-77f7-4480-8b46-3f776880db17","Type":"ContainerDied","Data":"9fd8fb3e04505e8ce2e2875bca7667bbfaaef9a8494247534bd6d63df3691a48"} Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.503798 4669 generic.go:334] "Generic (PLEG): container finished" podID="a63d3545-a64d-4c9a-9198-bf11fc782cc6" containerID="24c4c8a6872491732e4bea46f87649abfc6c7c74749c4b05fd8c0d0abecff3fb" exitCode=0 Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.503843 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"a63d3545-a64d-4c9a-9198-bf11fc782cc6","Type":"ContainerDied","Data":"24c4c8a6872491732e4bea46f87649abfc6c7c74749c4b05fd8c0d0abecff3fb"} Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.503873 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a63d3545-a64d-4c9a-9198-bf11fc782cc6","Type":"ContainerDied","Data":"ea510e3f21b6ae24cc778d1fc614b2526a2beb3039e7dcbadc25581d5d0b5656"} Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.503891 4669 scope.go:117] "RemoveContainer" containerID="24c4c8a6872491732e4bea46f87649abfc6c7c74749c4b05fd8c0d0abecff3fb" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.504014 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.554428 4669 scope.go:117] "RemoveContainer" containerID="75c93ee37543d1d800805efc4adf4932d8c7190981cfcea370d27111c6ead9ab" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.578768 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a63d3545-a64d-4c9a-9198-bf11fc782cc6-rabbitmq-plugins\") pod \"a63d3545-a64d-4c9a-9198-bf11fc782cc6\" (UID: \"a63d3545-a64d-4c9a-9198-bf11fc782cc6\") " Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.578835 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a63d3545-a64d-4c9a-9198-bf11fc782cc6-server-conf\") pod \"a63d3545-a64d-4c9a-9198-bf11fc782cc6\" (UID: \"a63d3545-a64d-4c9a-9198-bf11fc782cc6\") " Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.578892 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a63d3545-a64d-4c9a-9198-bf11fc782cc6-pod-info\") pod \"a63d3545-a64d-4c9a-9198-bf11fc782cc6\" (UID: 
\"a63d3545-a64d-4c9a-9198-bf11fc782cc6\") " Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.578928 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5l5w7\" (UniqueName: \"kubernetes.io/projected/a63d3545-a64d-4c9a-9198-bf11fc782cc6-kube-api-access-5l5w7\") pod \"a63d3545-a64d-4c9a-9198-bf11fc782cc6\" (UID: \"a63d3545-a64d-4c9a-9198-bf11fc782cc6\") " Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.578977 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"a63d3545-a64d-4c9a-9198-bf11fc782cc6\" (UID: \"a63d3545-a64d-4c9a-9198-bf11fc782cc6\") " Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.579036 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a63d3545-a64d-4c9a-9198-bf11fc782cc6-erlang-cookie-secret\") pod \"a63d3545-a64d-4c9a-9198-bf11fc782cc6\" (UID: \"a63d3545-a64d-4c9a-9198-bf11fc782cc6\") " Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.579067 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a63d3545-a64d-4c9a-9198-bf11fc782cc6-rabbitmq-tls\") pod \"a63d3545-a64d-4c9a-9198-bf11fc782cc6\" (UID: \"a63d3545-a64d-4c9a-9198-bf11fc782cc6\") " Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.579165 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a63d3545-a64d-4c9a-9198-bf11fc782cc6-rabbitmq-confd\") pod \"a63d3545-a64d-4c9a-9198-bf11fc782cc6\" (UID: \"a63d3545-a64d-4c9a-9198-bf11fc782cc6\") " Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.579220 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/a63d3545-a64d-4c9a-9198-bf11fc782cc6-plugins-conf\") pod \"a63d3545-a64d-4c9a-9198-bf11fc782cc6\" (UID: \"a63d3545-a64d-4c9a-9198-bf11fc782cc6\") " Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.579250 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a63d3545-a64d-4c9a-9198-bf11fc782cc6-config-data\") pod \"a63d3545-a64d-4c9a-9198-bf11fc782cc6\" (UID: \"a63d3545-a64d-4c9a-9198-bf11fc782cc6\") " Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.579306 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a63d3545-a64d-4c9a-9198-bf11fc782cc6-rabbitmq-erlang-cookie\") pod \"a63d3545-a64d-4c9a-9198-bf11fc782cc6\" (UID: \"a63d3545-a64d-4c9a-9198-bf11fc782cc6\") " Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.580314 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a63d3545-a64d-4c9a-9198-bf11fc782cc6-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "a63d3545-a64d-4c9a-9198-bf11fc782cc6" (UID: "a63d3545-a64d-4c9a-9198-bf11fc782cc6"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.581009 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a63d3545-a64d-4c9a-9198-bf11fc782cc6-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "a63d3545-a64d-4c9a-9198-bf11fc782cc6" (UID: "a63d3545-a64d-4c9a-9198-bf11fc782cc6"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.581391 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a63d3545-a64d-4c9a-9198-bf11fc782cc6-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "a63d3545-a64d-4c9a-9198-bf11fc782cc6" (UID: "a63d3545-a64d-4c9a-9198-bf11fc782cc6"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.585591 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "persistence") pod "a63d3545-a64d-4c9a-9198-bf11fc782cc6" (UID: "a63d3545-a64d-4c9a-9198-bf11fc782cc6"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.594746 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/a63d3545-a64d-4c9a-9198-bf11fc782cc6-pod-info" (OuterVolumeSpecName: "pod-info") pod "a63d3545-a64d-4c9a-9198-bf11fc782cc6" (UID: "a63d3545-a64d-4c9a-9198-bf11fc782cc6"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.597105 4669 scope.go:117] "RemoveContainer" containerID="24c4c8a6872491732e4bea46f87649abfc6c7c74749c4b05fd8c0d0abecff3fb" Oct 08 21:04:32 crc kubenswrapper[4669]: E1008 21:04:32.600038 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24c4c8a6872491732e4bea46f87649abfc6c7c74749c4b05fd8c0d0abecff3fb\": container with ID starting with 24c4c8a6872491732e4bea46f87649abfc6c7c74749c4b05fd8c0d0abecff3fb not found: ID does not exist" containerID="24c4c8a6872491732e4bea46f87649abfc6c7c74749c4b05fd8c0d0abecff3fb" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.600101 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24c4c8a6872491732e4bea46f87649abfc6c7c74749c4b05fd8c0d0abecff3fb"} err="failed to get container status \"24c4c8a6872491732e4bea46f87649abfc6c7c74749c4b05fd8c0d0abecff3fb\": rpc error: code = NotFound desc = could not find container \"24c4c8a6872491732e4bea46f87649abfc6c7c74749c4b05fd8c0d0abecff3fb\": container with ID starting with 24c4c8a6872491732e4bea46f87649abfc6c7c74749c4b05fd8c0d0abecff3fb not found: ID does not exist" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.600136 4669 scope.go:117] "RemoveContainer" containerID="75c93ee37543d1d800805efc4adf4932d8c7190981cfcea370d27111c6ead9ab" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.601755 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a63d3545-a64d-4c9a-9198-bf11fc782cc6-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "a63d3545-a64d-4c9a-9198-bf11fc782cc6" (UID: "a63d3545-a64d-4c9a-9198-bf11fc782cc6"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.602929 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a63d3545-a64d-4c9a-9198-bf11fc782cc6-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "a63d3545-a64d-4c9a-9198-bf11fc782cc6" (UID: "a63d3545-a64d-4c9a-9198-bf11fc782cc6"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:04:32 crc kubenswrapper[4669]: E1008 21:04:32.605719 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75c93ee37543d1d800805efc4adf4932d8c7190981cfcea370d27111c6ead9ab\": container with ID starting with 75c93ee37543d1d800805efc4adf4932d8c7190981cfcea370d27111c6ead9ab not found: ID does not exist" containerID="75c93ee37543d1d800805efc4adf4932d8c7190981cfcea370d27111c6ead9ab" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.605778 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75c93ee37543d1d800805efc4adf4932d8c7190981cfcea370d27111c6ead9ab"} err="failed to get container status \"75c93ee37543d1d800805efc4adf4932d8c7190981cfcea370d27111c6ead9ab\": rpc error: code = NotFound desc = could not find container \"75c93ee37543d1d800805efc4adf4932d8c7190981cfcea370d27111c6ead9ab\": container with ID starting with 75c93ee37543d1d800805efc4adf4932d8c7190981cfcea370d27111c6ead9ab not found: ID does not exist" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.617133 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a63d3545-a64d-4c9a-9198-bf11fc782cc6-kube-api-access-5l5w7" (OuterVolumeSpecName: "kube-api-access-5l5w7") pod "a63d3545-a64d-4c9a-9198-bf11fc782cc6" (UID: "a63d3545-a64d-4c9a-9198-bf11fc782cc6"). InnerVolumeSpecName "kube-api-access-5l5w7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.650643 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.663128 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a63d3545-a64d-4c9a-9198-bf11fc782cc6-config-data" (OuterVolumeSpecName: "config-data") pod "a63d3545-a64d-4c9a-9198-bf11fc782cc6" (UID: "a63d3545-a64d-4c9a-9198-bf11fc782cc6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.681878 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"b4f648df-77f7-4480-8b46-3f776880db17\" (UID: \"b4f648df-77f7-4480-8b46-3f776880db17\") " Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.688000 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b4f648df-77f7-4480-8b46-3f776880db17-rabbitmq-plugins\") pod \"b4f648df-77f7-4480-8b46-3f776880db17\" (UID: \"b4f648df-77f7-4480-8b46-3f776880db17\") " Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.688249 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b4f648df-77f7-4480-8b46-3f776880db17-server-conf\") pod \"b4f648df-77f7-4480-8b46-3f776880db17\" (UID: \"b4f648df-77f7-4480-8b46-3f776880db17\") " Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.688368 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b4f648df-77f7-4480-8b46-3f776880db17-rabbitmq-confd\") pod 
\"b4f648df-77f7-4480-8b46-3f776880db17\" (UID: \"b4f648df-77f7-4480-8b46-3f776880db17\") " Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.688451 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b4f648df-77f7-4480-8b46-3f776880db17-config-data\") pod \"b4f648df-77f7-4480-8b46-3f776880db17\" (UID: \"b4f648df-77f7-4480-8b46-3f776880db17\") " Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.688577 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b4f648df-77f7-4480-8b46-3f776880db17-plugins-conf\") pod \"b4f648df-77f7-4480-8b46-3f776880db17\" (UID: \"b4f648df-77f7-4480-8b46-3f776880db17\") " Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.688711 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b4f648df-77f7-4480-8b46-3f776880db17-rabbitmq-tls\") pod \"b4f648df-77f7-4480-8b46-3f776880db17\" (UID: \"b4f648df-77f7-4480-8b46-3f776880db17\") " Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.688794 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b4f648df-77f7-4480-8b46-3f776880db17-erlang-cookie-secret\") pod \"b4f648df-77f7-4480-8b46-3f776880db17\" (UID: \"b4f648df-77f7-4480-8b46-3f776880db17\") " Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.690491 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b4f648df-77f7-4480-8b46-3f776880db17-rabbitmq-erlang-cookie\") pod \"b4f648df-77f7-4480-8b46-3f776880db17\" (UID: \"b4f648df-77f7-4480-8b46-3f776880db17\") " Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.690624 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b4f648df-77f7-4480-8b46-3f776880db17-pod-info\") pod \"b4f648df-77f7-4480-8b46-3f776880db17\" (UID: \"b4f648df-77f7-4480-8b46-3f776880db17\") " Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.690745 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzdxt\" (UniqueName: \"kubernetes.io/projected/b4f648df-77f7-4480-8b46-3f776880db17-kube-api-access-qzdxt\") pod \"b4f648df-77f7-4480-8b46-3f776880db17\" (UID: \"b4f648df-77f7-4480-8b46-3f776880db17\") " Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.691441 4669 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a63d3545-a64d-4c9a-9198-bf11fc782cc6-pod-info\") on node \"crc\" DevicePath \"\"" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.691940 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5l5w7\" (UniqueName: \"kubernetes.io/projected/a63d3545-a64d-4c9a-9198-bf11fc782cc6-kube-api-access-5l5w7\") on node \"crc\" DevicePath \"\"" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.692048 4669 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.692111 4669 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a63d3545-a64d-4c9a-9198-bf11fc782cc6-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.692434 4669 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a63d3545-a64d-4c9a-9198-bf11fc782cc6-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.692608 4669 
reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a63d3545-a64d-4c9a-9198-bf11fc782cc6-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.692677 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a63d3545-a64d-4c9a-9198-bf11fc782cc6-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.692757 4669 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a63d3545-a64d-4c9a-9198-bf11fc782cc6-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.697485 4669 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a63d3545-a64d-4c9a-9198-bf11fc782cc6-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.688917 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4f648df-77f7-4480-8b46-3f776880db17-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "b4f648df-77f7-4480-8b46-3f776880db17" (UID: "b4f648df-77f7-4480-8b46-3f776880db17"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.691651 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "persistence") pod "b4f648df-77f7-4480-8b46-3f776880db17" (UID: "b4f648df-77f7-4480-8b46-3f776880db17"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.697983 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4f648df-77f7-4480-8b46-3f776880db17-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "b4f648df-77f7-4480-8b46-3f776880db17" (UID: "b4f648df-77f7-4480-8b46-3f776880db17"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.699117 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4f648df-77f7-4480-8b46-3f776880db17-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "b4f648df-77f7-4480-8b46-3f776880db17" (UID: "b4f648df-77f7-4480-8b46-3f776880db17"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.705260 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4f648df-77f7-4480-8b46-3f776880db17-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "b4f648df-77f7-4480-8b46-3f776880db17" (UID: "b4f648df-77f7-4480-8b46-3f776880db17"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.707282 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a63d3545-a64d-4c9a-9198-bf11fc782cc6-server-conf" (OuterVolumeSpecName: "server-conf") pod "a63d3545-a64d-4c9a-9198-bf11fc782cc6" (UID: "a63d3545-a64d-4c9a-9198-bf11fc782cc6"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.714031 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4f648df-77f7-4480-8b46-3f776880db17-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "b4f648df-77f7-4480-8b46-3f776880db17" (UID: "b4f648df-77f7-4480-8b46-3f776880db17"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.719331 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/b4f648df-77f7-4480-8b46-3f776880db17-pod-info" (OuterVolumeSpecName: "pod-info") pod "b4f648df-77f7-4480-8b46-3f776880db17" (UID: "b4f648df-77f7-4480-8b46-3f776880db17"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.721946 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4f648df-77f7-4480-8b46-3f776880db17-kube-api-access-qzdxt" (OuterVolumeSpecName: "kube-api-access-qzdxt") pod "b4f648df-77f7-4480-8b46-3f776880db17" (UID: "b4f648df-77f7-4480-8b46-3f776880db17"). InnerVolumeSpecName "kube-api-access-qzdxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.743598 4669 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.744014 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4f648df-77f7-4480-8b46-3f776880db17-config-data" (OuterVolumeSpecName: "config-data") pod "b4f648df-77f7-4480-8b46-3f776880db17" (UID: "b4f648df-77f7-4480-8b46-3f776880db17"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.767639 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a63d3545-a64d-4c9a-9198-bf11fc782cc6-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "a63d3545-a64d-4c9a-9198-bf11fc782cc6" (UID: "a63d3545-a64d-4c9a-9198-bf11fc782cc6"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.799496 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzdxt\" (UniqueName: \"kubernetes.io/projected/b4f648df-77f7-4480-8b46-3f776880db17-kube-api-access-qzdxt\") on node \"crc\" DevicePath \"\"" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.799573 4669 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.799590 4669 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b4f648df-77f7-4480-8b46-3f776880db17-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.799604 4669 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a63d3545-a64d-4c9a-9198-bf11fc782cc6-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.799615 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b4f648df-77f7-4480-8b46-3f776880db17-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.799625 4669 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/b4f648df-77f7-4480-8b46-3f776880db17-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.799635 4669 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b4f648df-77f7-4480-8b46-3f776880db17-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.799646 4669 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b4f648df-77f7-4480-8b46-3f776880db17-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.799657 4669 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a63d3545-a64d-4c9a-9198-bf11fc782cc6-server-conf\") on node \"crc\" DevicePath \"\"" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.799670 4669 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b4f648df-77f7-4480-8b46-3f776880db17-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.799684 4669 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b4f648df-77f7-4480-8b46-3f776880db17-pod-info\") on node \"crc\" DevicePath \"\"" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.799695 4669 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.822624 4669 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.844841 4669 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.846339 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4f648df-77f7-4480-8b46-3f776880db17-server-conf" (OuterVolumeSpecName: "server-conf") pod "b4f648df-77f7-4480-8b46-3f776880db17" (UID: "b4f648df-77f7-4480-8b46-3f776880db17"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.855476 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4f648df-77f7-4480-8b46-3f776880db17-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "b4f648df-77f7-4480-8b46-3f776880db17" (UID: "b4f648df-77f7-4480-8b46-3f776880db17"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.867797 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.882037 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 21:04:32 crc kubenswrapper[4669]: E1008 21:04:32.882654 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4f648df-77f7-4480-8b46-3f776880db17" containerName="setup-container" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.882669 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4f648df-77f7-4480-8b46-3f776880db17" containerName="setup-container" Oct 08 21:04:32 crc kubenswrapper[4669]: E1008 21:04:32.882678 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a63d3545-a64d-4c9a-9198-bf11fc782cc6" containerName="setup-container" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.882684 4669 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a63d3545-a64d-4c9a-9198-bf11fc782cc6" containerName="setup-container" Oct 08 21:04:32 crc kubenswrapper[4669]: E1008 21:04:32.882708 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a63d3545-a64d-4c9a-9198-bf11fc782cc6" containerName="rabbitmq" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.882714 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="a63d3545-a64d-4c9a-9198-bf11fc782cc6" containerName="rabbitmq" Oct 08 21:04:32 crc kubenswrapper[4669]: E1008 21:04:32.882727 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4f648df-77f7-4480-8b46-3f776880db17" containerName="rabbitmq" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.882733 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4f648df-77f7-4480-8b46-3f776880db17" containerName="rabbitmq" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.882903 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4f648df-77f7-4480-8b46-3f776880db17" containerName="rabbitmq" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.882920 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="a63d3545-a64d-4c9a-9198-bf11fc782cc6" containerName="rabbitmq" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.883876 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.890476 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.890664 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-zwm8p" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.890693 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.890743 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.890824 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.890869 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.891010 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.891926 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.909658 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0bfeeb02-715e-4358-802c-ce7ed6721a30-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0bfeeb02-715e-4358-802c-ce7ed6721a30\") " pod="openstack/rabbitmq-server-0" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.909767 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/0bfeeb02-715e-4358-802c-ce7ed6721a30-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0bfeeb02-715e-4358-802c-ce7ed6721a30\") " pod="openstack/rabbitmq-server-0" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.909898 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0bfeeb02-715e-4358-802c-ce7ed6721a30-config-data\") pod \"rabbitmq-server-0\" (UID: \"0bfeeb02-715e-4358-802c-ce7ed6721a30\") " pod="openstack/rabbitmq-server-0" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.909998 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0bfeeb02-715e-4358-802c-ce7ed6721a30-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0bfeeb02-715e-4358-802c-ce7ed6721a30\") " pod="openstack/rabbitmq-server-0" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.910046 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0bfeeb02-715e-4358-802c-ce7ed6721a30-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0bfeeb02-715e-4358-802c-ce7ed6721a30\") " pod="openstack/rabbitmq-server-0" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.910070 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"0bfeeb02-715e-4358-802c-ce7ed6721a30\") " pod="openstack/rabbitmq-server-0" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.910121 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0bfeeb02-715e-4358-802c-ce7ed6721a30-pod-info\") pod \"rabbitmq-server-0\" (UID: 
\"0bfeeb02-715e-4358-802c-ce7ed6721a30\") " pod="openstack/rabbitmq-server-0" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.910289 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0bfeeb02-715e-4358-802c-ce7ed6721a30-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0bfeeb02-715e-4358-802c-ce7ed6721a30\") " pod="openstack/rabbitmq-server-0" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.910317 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0bfeeb02-715e-4358-802c-ce7ed6721a30-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0bfeeb02-715e-4358-802c-ce7ed6721a30\") " pod="openstack/rabbitmq-server-0" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.910359 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0bfeeb02-715e-4358-802c-ce7ed6721a30-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0bfeeb02-715e-4358-802c-ce7ed6721a30\") " pod="openstack/rabbitmq-server-0" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.910443 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddzdt\" (UniqueName: \"kubernetes.io/projected/0bfeeb02-715e-4358-802c-ce7ed6721a30-kube-api-access-ddzdt\") pod \"rabbitmq-server-0\" (UID: \"0bfeeb02-715e-4358-802c-ce7ed6721a30\") " pod="openstack/rabbitmq-server-0" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.910583 4669 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.910600 4669 reconciler_common.go:293] 
"Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b4f648df-77f7-4480-8b46-3f776880db17-server-conf\") on node \"crc\" DevicePath \"\"" Oct 08 21:04:32 crc kubenswrapper[4669]: I1008 21:04:32.910613 4669 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b4f648df-77f7-4480-8b46-3f776880db17-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.012936 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0bfeeb02-715e-4358-802c-ce7ed6721a30-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0bfeeb02-715e-4358-802c-ce7ed6721a30\") " pod="openstack/rabbitmq-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.012993 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0bfeeb02-715e-4358-802c-ce7ed6721a30-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0bfeeb02-715e-4358-802c-ce7ed6721a30\") " pod="openstack/rabbitmq-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.013022 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"0bfeeb02-715e-4358-802c-ce7ed6721a30\") " pod="openstack/rabbitmq-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.013057 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0bfeeb02-715e-4358-802c-ce7ed6721a30-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0bfeeb02-715e-4358-802c-ce7ed6721a30\") " pod="openstack/rabbitmq-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.013137 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0bfeeb02-715e-4358-802c-ce7ed6721a30-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0bfeeb02-715e-4358-802c-ce7ed6721a30\") " pod="openstack/rabbitmq-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.013160 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0bfeeb02-715e-4358-802c-ce7ed6721a30-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0bfeeb02-715e-4358-802c-ce7ed6721a30\") " pod="openstack/rabbitmq-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.013193 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0bfeeb02-715e-4358-802c-ce7ed6721a30-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0bfeeb02-715e-4358-802c-ce7ed6721a30\") " pod="openstack/rabbitmq-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.013232 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddzdt\" (UniqueName: \"kubernetes.io/projected/0bfeeb02-715e-4358-802c-ce7ed6721a30-kube-api-access-ddzdt\") pod \"rabbitmq-server-0\" (UID: \"0bfeeb02-715e-4358-802c-ce7ed6721a30\") " pod="openstack/rabbitmq-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.013291 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0bfeeb02-715e-4358-802c-ce7ed6721a30-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0bfeeb02-715e-4358-802c-ce7ed6721a30\") " pod="openstack/rabbitmq-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.013329 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0bfeeb02-715e-4358-802c-ce7ed6721a30-erlang-cookie-secret\") pod 
\"rabbitmq-server-0\" (UID: \"0bfeeb02-715e-4358-802c-ce7ed6721a30\") " pod="openstack/rabbitmq-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.013376 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0bfeeb02-715e-4358-802c-ce7ed6721a30-config-data\") pod \"rabbitmq-server-0\" (UID: \"0bfeeb02-715e-4358-802c-ce7ed6721a30\") " pod="openstack/rabbitmq-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.013921 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0bfeeb02-715e-4358-802c-ce7ed6721a30-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0bfeeb02-715e-4358-802c-ce7ed6721a30\") " pod="openstack/rabbitmq-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.014011 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0bfeeb02-715e-4358-802c-ce7ed6721a30-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0bfeeb02-715e-4358-802c-ce7ed6721a30\") " pod="openstack/rabbitmq-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.014215 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0bfeeb02-715e-4358-802c-ce7ed6721a30-config-data\") pod \"rabbitmq-server-0\" (UID: \"0bfeeb02-715e-4358-802c-ce7ed6721a30\") " pod="openstack/rabbitmq-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.014356 4669 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"0bfeeb02-715e-4358-802c-ce7ed6721a30\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/rabbitmq-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.014458 4669 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0bfeeb02-715e-4358-802c-ce7ed6721a30-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0bfeeb02-715e-4358-802c-ce7ed6721a30\") " pod="openstack/rabbitmq-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.020442 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0bfeeb02-715e-4358-802c-ce7ed6721a30-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0bfeeb02-715e-4358-802c-ce7ed6721a30\") " pod="openstack/rabbitmq-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.022929 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0bfeeb02-715e-4358-802c-ce7ed6721a30-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0bfeeb02-715e-4358-802c-ce7ed6721a30\") " pod="openstack/rabbitmq-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.023192 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0bfeeb02-715e-4358-802c-ce7ed6721a30-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0bfeeb02-715e-4358-802c-ce7ed6721a30\") " pod="openstack/rabbitmq-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.024564 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0bfeeb02-715e-4358-802c-ce7ed6721a30-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0bfeeb02-715e-4358-802c-ce7ed6721a30\") " pod="openstack/rabbitmq-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.028584 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0bfeeb02-715e-4358-802c-ce7ed6721a30-pod-info\") pod 
\"rabbitmq-server-0\" (UID: \"0bfeeb02-715e-4358-802c-ce7ed6721a30\") " pod="openstack/rabbitmq-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.035694 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddzdt\" (UniqueName: \"kubernetes.io/projected/0bfeeb02-715e-4358-802c-ce7ed6721a30-kube-api-access-ddzdt\") pod \"rabbitmq-server-0\" (UID: \"0bfeeb02-715e-4358-802c-ce7ed6721a30\") " pod="openstack/rabbitmq-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.050601 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"0bfeeb02-715e-4358-802c-ce7ed6721a30\") " pod="openstack/rabbitmq-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.297300 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.359897 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a63d3545-a64d-4c9a-9198-bf11fc782cc6" path="/var/lib/kubelet/pods/a63d3545-a64d-4c9a-9198-bf11fc782cc6/volumes" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.515975 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b4f648df-77f7-4480-8b46-3f776880db17","Type":"ContainerDied","Data":"9f3f60c96a2147ab91100ddcdad902c4ffab6669f24bc9e5d263269918f80e82"} Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.516024 4669 scope.go:117] "RemoveContainer" containerID="9fd8fb3e04505e8ce2e2875bca7667bbfaaef9a8494247534bd6d63df3691a48" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.516136 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.547845 4669 scope.go:117] "RemoveContainer" containerID="cd3fd70fc2d5d3707c6cba8aa9868955a85e808c3f7e26a6168a2bbcffde2d37" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.556794 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.592361 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.619991 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.622110 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.625359 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.625600 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.625783 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.625853 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.625815 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-xqglp" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.626349 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 
21:04:33.631737 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.635362 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.755476 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 21:04:33 crc kubenswrapper[4669]: W1008 21:04:33.763957 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bfeeb02_715e_4358_802c_ce7ed6721a30.slice/crio-d7764addc2eb4a3370d565d0e42ebd96397b44763bee8ce916f9fdf5fa2dcbb6 WatchSource:0}: Error finding container d7764addc2eb4a3370d565d0e42ebd96397b44763bee8ce916f9fdf5fa2dcbb6: Status 404 returned error can't find the container with id d7764addc2eb4a3370d565d0e42ebd96397b44763bee8ce916f9fdf5fa2dcbb6 Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.828553 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8e35f189-cd14-4892-a4d6-25a23a2ae04c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e35f189-cd14-4892-a4d6-25a23a2ae04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.828652 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8e35f189-cd14-4892-a4d6-25a23a2ae04c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e35f189-cd14-4892-a4d6-25a23a2ae04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.828695 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/8e35f189-cd14-4892-a4d6-25a23a2ae04c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e35f189-cd14-4892-a4d6-25a23a2ae04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.828733 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8e35f189-cd14-4892-a4d6-25a23a2ae04c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e35f189-cd14-4892-a4d6-25a23a2ae04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.828765 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8e35f189-cd14-4892-a4d6-25a23a2ae04c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e35f189-cd14-4892-a4d6-25a23a2ae04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.828838 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8e35f189-cd14-4892-a4d6-25a23a2ae04c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e35f189-cd14-4892-a4d6-25a23a2ae04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.828887 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8e35f189-cd14-4892-a4d6-25a23a2ae04c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e35f189-cd14-4892-a4d6-25a23a2ae04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.829130 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lk4t\" (UniqueName: 
\"kubernetes.io/projected/8e35f189-cd14-4892-a4d6-25a23a2ae04c-kube-api-access-2lk4t\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e35f189-cd14-4892-a4d6-25a23a2ae04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.829191 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e35f189-cd14-4892-a4d6-25a23a2ae04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.829212 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8e35f189-cd14-4892-a4d6-25a23a2ae04c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e35f189-cd14-4892-a4d6-25a23a2ae04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.829245 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8e35f189-cd14-4892-a4d6-25a23a2ae04c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e35f189-cd14-4892-a4d6-25a23a2ae04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.931048 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8e35f189-cd14-4892-a4d6-25a23a2ae04c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e35f189-cd14-4892-a4d6-25a23a2ae04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.931338 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/8e35f189-cd14-4892-a4d6-25a23a2ae04c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e35f189-cd14-4892-a4d6-25a23a2ae04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.931464 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8e35f189-cd14-4892-a4d6-25a23a2ae04c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e35f189-cd14-4892-a4d6-25a23a2ae04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.931598 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8e35f189-cd14-4892-a4d6-25a23a2ae04c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e35f189-cd14-4892-a4d6-25a23a2ae04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.931719 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8e35f189-cd14-4892-a4d6-25a23a2ae04c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e35f189-cd14-4892-a4d6-25a23a2ae04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.931803 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8e35f189-cd14-4892-a4d6-25a23a2ae04c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e35f189-cd14-4892-a4d6-25a23a2ae04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.931825 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8e35f189-cd14-4892-a4d6-25a23a2ae04c-rabbitmq-confd\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"8e35f189-cd14-4892-a4d6-25a23a2ae04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.932025 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8e35f189-cd14-4892-a4d6-25a23a2ae04c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e35f189-cd14-4892-a4d6-25a23a2ae04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.932159 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lk4t\" (UniqueName: \"kubernetes.io/projected/8e35f189-cd14-4892-a4d6-25a23a2ae04c-kube-api-access-2lk4t\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e35f189-cd14-4892-a4d6-25a23a2ae04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.932270 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e35f189-cd14-4892-a4d6-25a23a2ae04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.932370 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8e35f189-cd14-4892-a4d6-25a23a2ae04c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e35f189-cd14-4892-a4d6-25a23a2ae04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.932465 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8e35f189-cd14-4892-a4d6-25a23a2ae04c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e35f189-cd14-4892-a4d6-25a23a2ae04c\") " 
pod="openstack/rabbitmq-cell1-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.932283 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8e35f189-cd14-4892-a4d6-25a23a2ae04c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e35f189-cd14-4892-a4d6-25a23a2ae04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.932935 4669 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e35f189-cd14-4892-a4d6-25a23a2ae04c\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-cell1-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.933059 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8e35f189-cd14-4892-a4d6-25a23a2ae04c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e35f189-cd14-4892-a4d6-25a23a2ae04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.933278 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8e35f189-cd14-4892-a4d6-25a23a2ae04c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e35f189-cd14-4892-a4d6-25a23a2ae04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.935452 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8e35f189-cd14-4892-a4d6-25a23a2ae04c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e35f189-cd14-4892-a4d6-25a23a2ae04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.936779 4669 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8e35f189-cd14-4892-a4d6-25a23a2ae04c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e35f189-cd14-4892-a4d6-25a23a2ae04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.937491 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8e35f189-cd14-4892-a4d6-25a23a2ae04c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e35f189-cd14-4892-a4d6-25a23a2ae04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.938219 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8e35f189-cd14-4892-a4d6-25a23a2ae04c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e35f189-cd14-4892-a4d6-25a23a2ae04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.940679 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8e35f189-cd14-4892-a4d6-25a23a2ae04c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e35f189-cd14-4892-a4d6-25a23a2ae04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.964650 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lk4t\" (UniqueName: \"kubernetes.io/projected/8e35f189-cd14-4892-a4d6-25a23a2ae04c-kube-api-access-2lk4t\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e35f189-cd14-4892-a4d6-25a23a2ae04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 21:04:33 crc kubenswrapper[4669]: I1008 21:04:33.976120 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e35f189-cd14-4892-a4d6-25a23a2ae04c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 21:04:34 crc kubenswrapper[4669]: I1008 21:04:34.256348 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 08 21:04:34 crc kubenswrapper[4669]: I1008 21:04:34.531855 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0bfeeb02-715e-4358-802c-ce7ed6721a30","Type":"ContainerStarted","Data":"d7764addc2eb4a3370d565d0e42ebd96397b44763bee8ce916f9fdf5fa2dcbb6"} Oct 08 21:04:34 crc kubenswrapper[4669]: I1008 21:04:34.713734 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 08 21:04:34 crc kubenswrapper[4669]: W1008 21:04:34.717923 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e35f189_cd14_4892_a4d6_25a23a2ae04c.slice/crio-c53f8d8b6aa87e30f29a88123116933f11a5de37d5ef069bc8c1424718fb32d3 WatchSource:0}: Error finding container c53f8d8b6aa87e30f29a88123116933f11a5de37d5ef069bc8c1424718fb32d3: Status 404 returned error can't find the container with id c53f8d8b6aa87e30f29a88123116933f11a5de37d5ef069bc8c1424718fb32d3 Oct 08 21:04:35 crc kubenswrapper[4669]: I1008 21:04:35.341605 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4f648df-77f7-4480-8b46-3f776880db17" path="/var/lib/kubelet/pods/b4f648df-77f7-4480-8b46-3f776880db17/volumes" Oct 08 21:04:35 crc kubenswrapper[4669]: I1008 21:04:35.552303 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0bfeeb02-715e-4358-802c-ce7ed6721a30","Type":"ContainerStarted","Data":"8dd67041eccd39e6c9d7389e40108803469b81b881a6c2f349674dc7694cc7f1"} Oct 08 21:04:35 crc kubenswrapper[4669]: I1008 21:04:35.556725 4669 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8e35f189-cd14-4892-a4d6-25a23a2ae04c","Type":"ContainerStarted","Data":"c53f8d8b6aa87e30f29a88123116933f11a5de37d5ef069bc8c1424718fb32d3"} Oct 08 21:04:36 crc kubenswrapper[4669]: I1008 21:04:36.225511 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-ctfws"] Oct 08 21:04:36 crc kubenswrapper[4669]: I1008 21:04:36.228200 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-ctfws" Oct 08 21:04:36 crc kubenswrapper[4669]: I1008 21:04:36.229746 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Oct 08 21:04:36 crc kubenswrapper[4669]: I1008 21:04:36.236407 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-ctfws"] Oct 08 21:04:36 crc kubenswrapper[4669]: I1008 21:04:36.381358 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcd07ca1-05cc-45d9-b173-864dedb0f6bd-config\") pod \"dnsmasq-dns-5576978c7c-ctfws\" (UID: \"bcd07ca1-05cc-45d9-b173-864dedb0f6bd\") " pod="openstack/dnsmasq-dns-5576978c7c-ctfws" Oct 08 21:04:36 crc kubenswrapper[4669]: I1008 21:04:36.381437 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cxcm\" (UniqueName: \"kubernetes.io/projected/bcd07ca1-05cc-45d9-b173-864dedb0f6bd-kube-api-access-4cxcm\") pod \"dnsmasq-dns-5576978c7c-ctfws\" (UID: \"bcd07ca1-05cc-45d9-b173-864dedb0f6bd\") " pod="openstack/dnsmasq-dns-5576978c7c-ctfws" Oct 08 21:04:36 crc kubenswrapper[4669]: I1008 21:04:36.381461 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bcd07ca1-05cc-45d9-b173-864dedb0f6bd-ovsdbserver-sb\") pod 
\"dnsmasq-dns-5576978c7c-ctfws\" (UID: \"bcd07ca1-05cc-45d9-b173-864dedb0f6bd\") " pod="openstack/dnsmasq-dns-5576978c7c-ctfws" Oct 08 21:04:36 crc kubenswrapper[4669]: I1008 21:04:36.381502 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bcd07ca1-05cc-45d9-b173-864dedb0f6bd-ovsdbserver-nb\") pod \"dnsmasq-dns-5576978c7c-ctfws\" (UID: \"bcd07ca1-05cc-45d9-b173-864dedb0f6bd\") " pod="openstack/dnsmasq-dns-5576978c7c-ctfws" Oct 08 21:04:36 crc kubenswrapper[4669]: I1008 21:04:36.381553 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/bcd07ca1-05cc-45d9-b173-864dedb0f6bd-openstack-edpm-ipam\") pod \"dnsmasq-dns-5576978c7c-ctfws\" (UID: \"bcd07ca1-05cc-45d9-b173-864dedb0f6bd\") " pod="openstack/dnsmasq-dns-5576978c7c-ctfws" Oct 08 21:04:36 crc kubenswrapper[4669]: I1008 21:04:36.381797 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bcd07ca1-05cc-45d9-b173-864dedb0f6bd-dns-swift-storage-0\") pod \"dnsmasq-dns-5576978c7c-ctfws\" (UID: \"bcd07ca1-05cc-45d9-b173-864dedb0f6bd\") " pod="openstack/dnsmasq-dns-5576978c7c-ctfws" Oct 08 21:04:36 crc kubenswrapper[4669]: I1008 21:04:36.381964 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bcd07ca1-05cc-45d9-b173-864dedb0f6bd-dns-svc\") pod \"dnsmasq-dns-5576978c7c-ctfws\" (UID: \"bcd07ca1-05cc-45d9-b173-864dedb0f6bd\") " pod="openstack/dnsmasq-dns-5576978c7c-ctfws" Oct 08 21:04:36 crc kubenswrapper[4669]: I1008 21:04:36.483957 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/bcd07ca1-05cc-45d9-b173-864dedb0f6bd-dns-swift-storage-0\") pod \"dnsmasq-dns-5576978c7c-ctfws\" (UID: \"bcd07ca1-05cc-45d9-b173-864dedb0f6bd\") " pod="openstack/dnsmasq-dns-5576978c7c-ctfws" Oct 08 21:04:36 crc kubenswrapper[4669]: I1008 21:04:36.484335 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bcd07ca1-05cc-45d9-b173-864dedb0f6bd-dns-svc\") pod \"dnsmasq-dns-5576978c7c-ctfws\" (UID: \"bcd07ca1-05cc-45d9-b173-864dedb0f6bd\") " pod="openstack/dnsmasq-dns-5576978c7c-ctfws" Oct 08 21:04:36 crc kubenswrapper[4669]: I1008 21:04:36.484433 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcd07ca1-05cc-45d9-b173-864dedb0f6bd-config\") pod \"dnsmasq-dns-5576978c7c-ctfws\" (UID: \"bcd07ca1-05cc-45d9-b173-864dedb0f6bd\") " pod="openstack/dnsmasq-dns-5576978c7c-ctfws" Oct 08 21:04:36 crc kubenswrapper[4669]: I1008 21:04:36.484515 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cxcm\" (UniqueName: \"kubernetes.io/projected/bcd07ca1-05cc-45d9-b173-864dedb0f6bd-kube-api-access-4cxcm\") pod \"dnsmasq-dns-5576978c7c-ctfws\" (UID: \"bcd07ca1-05cc-45d9-b173-864dedb0f6bd\") " pod="openstack/dnsmasq-dns-5576978c7c-ctfws" Oct 08 21:04:36 crc kubenswrapper[4669]: I1008 21:04:36.484557 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bcd07ca1-05cc-45d9-b173-864dedb0f6bd-ovsdbserver-sb\") pod \"dnsmasq-dns-5576978c7c-ctfws\" (UID: \"bcd07ca1-05cc-45d9-b173-864dedb0f6bd\") " pod="openstack/dnsmasq-dns-5576978c7c-ctfws" Oct 08 21:04:36 crc kubenswrapper[4669]: I1008 21:04:36.484610 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/bcd07ca1-05cc-45d9-b173-864dedb0f6bd-ovsdbserver-nb\") pod \"dnsmasq-dns-5576978c7c-ctfws\" (UID: \"bcd07ca1-05cc-45d9-b173-864dedb0f6bd\") " pod="openstack/dnsmasq-dns-5576978c7c-ctfws" Oct 08 21:04:36 crc kubenswrapper[4669]: I1008 21:04:36.484657 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/bcd07ca1-05cc-45d9-b173-864dedb0f6bd-openstack-edpm-ipam\") pod \"dnsmasq-dns-5576978c7c-ctfws\" (UID: \"bcd07ca1-05cc-45d9-b173-864dedb0f6bd\") " pod="openstack/dnsmasq-dns-5576978c7c-ctfws" Oct 08 21:04:36 crc kubenswrapper[4669]: I1008 21:04:36.485131 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bcd07ca1-05cc-45d9-b173-864dedb0f6bd-dns-swift-storage-0\") pod \"dnsmasq-dns-5576978c7c-ctfws\" (UID: \"bcd07ca1-05cc-45d9-b173-864dedb0f6bd\") " pod="openstack/dnsmasq-dns-5576978c7c-ctfws" Oct 08 21:04:36 crc kubenswrapper[4669]: I1008 21:04:36.485435 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bcd07ca1-05cc-45d9-b173-864dedb0f6bd-dns-svc\") pod \"dnsmasq-dns-5576978c7c-ctfws\" (UID: \"bcd07ca1-05cc-45d9-b173-864dedb0f6bd\") " pod="openstack/dnsmasq-dns-5576978c7c-ctfws" Oct 08 21:04:36 crc kubenswrapper[4669]: I1008 21:04:36.485642 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/bcd07ca1-05cc-45d9-b173-864dedb0f6bd-openstack-edpm-ipam\") pod \"dnsmasq-dns-5576978c7c-ctfws\" (UID: \"bcd07ca1-05cc-45d9-b173-864dedb0f6bd\") " pod="openstack/dnsmasq-dns-5576978c7c-ctfws" Oct 08 21:04:36 crc kubenswrapper[4669]: I1008 21:04:36.486352 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/bcd07ca1-05cc-45d9-b173-864dedb0f6bd-ovsdbserver-nb\") pod \"dnsmasq-dns-5576978c7c-ctfws\" (UID: \"bcd07ca1-05cc-45d9-b173-864dedb0f6bd\") " pod="openstack/dnsmasq-dns-5576978c7c-ctfws" Oct 08 21:04:36 crc kubenswrapper[4669]: I1008 21:04:36.486635 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcd07ca1-05cc-45d9-b173-864dedb0f6bd-config\") pod \"dnsmasq-dns-5576978c7c-ctfws\" (UID: \"bcd07ca1-05cc-45d9-b173-864dedb0f6bd\") " pod="openstack/dnsmasq-dns-5576978c7c-ctfws" Oct 08 21:04:36 crc kubenswrapper[4669]: I1008 21:04:36.486892 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bcd07ca1-05cc-45d9-b173-864dedb0f6bd-ovsdbserver-sb\") pod \"dnsmasq-dns-5576978c7c-ctfws\" (UID: \"bcd07ca1-05cc-45d9-b173-864dedb0f6bd\") " pod="openstack/dnsmasq-dns-5576978c7c-ctfws" Oct 08 21:04:36 crc kubenswrapper[4669]: I1008 21:04:36.509064 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cxcm\" (UniqueName: \"kubernetes.io/projected/bcd07ca1-05cc-45d9-b173-864dedb0f6bd-kube-api-access-4cxcm\") pod \"dnsmasq-dns-5576978c7c-ctfws\" (UID: \"bcd07ca1-05cc-45d9-b173-864dedb0f6bd\") " pod="openstack/dnsmasq-dns-5576978c7c-ctfws" Oct 08 21:04:36 crc kubenswrapper[4669]: I1008 21:04:36.557001 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-ctfws" Oct 08 21:04:36 crc kubenswrapper[4669]: I1008 21:04:36.567402 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8e35f189-cd14-4892-a4d6-25a23a2ae04c","Type":"ContainerStarted","Data":"73ab5dd69026545de8539c60efb04267461aab73798217b184138a48ebe9a17e"} Oct 08 21:04:37 crc kubenswrapper[4669]: I1008 21:04:37.036200 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-ctfws"] Oct 08 21:04:37 crc kubenswrapper[4669]: I1008 21:04:37.580701 4669 generic.go:334] "Generic (PLEG): container finished" podID="bcd07ca1-05cc-45d9-b173-864dedb0f6bd" containerID="8fd9a8d33cef85ebfa3672a30a8bf76d3990febbd778549ffae5ebe2eb57b504" exitCode=0 Oct 08 21:04:37 crc kubenswrapper[4669]: I1008 21:04:37.580757 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-ctfws" event={"ID":"bcd07ca1-05cc-45d9-b173-864dedb0f6bd","Type":"ContainerDied","Data":"8fd9a8d33cef85ebfa3672a30a8bf76d3990febbd778549ffae5ebe2eb57b504"} Oct 08 21:04:37 crc kubenswrapper[4669]: I1008 21:04:37.581307 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-ctfws" event={"ID":"bcd07ca1-05cc-45d9-b173-864dedb0f6bd","Type":"ContainerStarted","Data":"8af645e4ba3c961cc05c16d6580dfff50c15fbbb747ecd64339fad16eb9394e1"} Oct 08 21:04:38 crc kubenswrapper[4669]: I1008 21:04:38.595679 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-ctfws" event={"ID":"bcd07ca1-05cc-45d9-b173-864dedb0f6bd","Type":"ContainerStarted","Data":"cc6d016e6f6a40632d7e0e55f4c078368d8a1ea771bae810fa75d044e36c8519"} Oct 08 21:04:38 crc kubenswrapper[4669]: I1008 21:04:38.596033 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5576978c7c-ctfws" Oct 08 21:04:38 crc kubenswrapper[4669]: I1008 21:04:38.631726 4669 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5576978c7c-ctfws" podStartSLOduration=2.631707021 podStartE2EDuration="2.631707021s" podCreationTimestamp="2025-10-08 21:04:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:04:38.623880108 +0000 UTC m=+1198.316690821" watchObservedRunningTime="2025-10-08 21:04:38.631707021 +0000 UTC m=+1198.324517694" Oct 08 21:04:46 crc kubenswrapper[4669]: I1008 21:04:46.558763 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5576978c7c-ctfws" Oct 08 21:04:46 crc kubenswrapper[4669]: I1008 21:04:46.641679 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-vtsv5"] Oct 08 21:04:46 crc kubenswrapper[4669]: I1008 21:04:46.642010 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c7b6c5df9-vtsv5" podUID="052a11d2-f410-40af-a1e8-a61ba3203811" containerName="dnsmasq-dns" containerID="cri-o://7f488c53f2ccdd32c5afc8349ffe5b921ff99e2bd3e64e510d07a92b55b65afe" gracePeriod=10 Oct 08 21:04:46 crc kubenswrapper[4669]: I1008 21:04:46.800276 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8c6f6df99-czgxg"] Oct 08 21:04:46 crc kubenswrapper[4669]: I1008 21:04:46.801877 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8c6f6df99-czgxg" Oct 08 21:04:46 crc kubenswrapper[4669]: I1008 21:04:46.808352 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8c6f6df99-czgxg"] Oct 08 21:04:46 crc kubenswrapper[4669]: I1008 21:04:46.921869 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2626e058-7115-4198-91ab-19e6f98dfc89-ovsdbserver-sb\") pod \"dnsmasq-dns-8c6f6df99-czgxg\" (UID: \"2626e058-7115-4198-91ab-19e6f98dfc89\") " pod="openstack/dnsmasq-dns-8c6f6df99-czgxg" Oct 08 21:04:46 crc kubenswrapper[4669]: I1008 21:04:46.921904 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2626e058-7115-4198-91ab-19e6f98dfc89-dns-swift-storage-0\") pod \"dnsmasq-dns-8c6f6df99-czgxg\" (UID: \"2626e058-7115-4198-91ab-19e6f98dfc89\") " pod="openstack/dnsmasq-dns-8c6f6df99-czgxg" Oct 08 21:04:46 crc kubenswrapper[4669]: I1008 21:04:46.921950 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2626e058-7115-4198-91ab-19e6f98dfc89-config\") pod \"dnsmasq-dns-8c6f6df99-czgxg\" (UID: \"2626e058-7115-4198-91ab-19e6f98dfc89\") " pod="openstack/dnsmasq-dns-8c6f6df99-czgxg" Oct 08 21:04:46 crc kubenswrapper[4669]: I1008 21:04:46.921975 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2626e058-7115-4198-91ab-19e6f98dfc89-openstack-edpm-ipam\") pod \"dnsmasq-dns-8c6f6df99-czgxg\" (UID: \"2626e058-7115-4198-91ab-19e6f98dfc89\") " pod="openstack/dnsmasq-dns-8c6f6df99-czgxg" Oct 08 21:04:46 crc kubenswrapper[4669]: I1008 21:04:46.922130 4669 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt9wq\" (UniqueName: \"kubernetes.io/projected/2626e058-7115-4198-91ab-19e6f98dfc89-kube-api-access-rt9wq\") pod \"dnsmasq-dns-8c6f6df99-czgxg\" (UID: \"2626e058-7115-4198-91ab-19e6f98dfc89\") " pod="openstack/dnsmasq-dns-8c6f6df99-czgxg" Oct 08 21:04:46 crc kubenswrapper[4669]: I1008 21:04:46.922427 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2626e058-7115-4198-91ab-19e6f98dfc89-dns-svc\") pod \"dnsmasq-dns-8c6f6df99-czgxg\" (UID: \"2626e058-7115-4198-91ab-19e6f98dfc89\") " pod="openstack/dnsmasq-dns-8c6f6df99-czgxg" Oct 08 21:04:46 crc kubenswrapper[4669]: I1008 21:04:46.922481 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2626e058-7115-4198-91ab-19e6f98dfc89-ovsdbserver-nb\") pod \"dnsmasq-dns-8c6f6df99-czgxg\" (UID: \"2626e058-7115-4198-91ab-19e6f98dfc89\") " pod="openstack/dnsmasq-dns-8c6f6df99-czgxg" Oct 08 21:04:47 crc kubenswrapper[4669]: I1008 21:04:47.024716 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2626e058-7115-4198-91ab-19e6f98dfc89-dns-svc\") pod \"dnsmasq-dns-8c6f6df99-czgxg\" (UID: \"2626e058-7115-4198-91ab-19e6f98dfc89\") " pod="openstack/dnsmasq-dns-8c6f6df99-czgxg" Oct 08 21:04:47 crc kubenswrapper[4669]: I1008 21:04:47.024764 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2626e058-7115-4198-91ab-19e6f98dfc89-ovsdbserver-nb\") pod \"dnsmasq-dns-8c6f6df99-czgxg\" (UID: \"2626e058-7115-4198-91ab-19e6f98dfc89\") " pod="openstack/dnsmasq-dns-8c6f6df99-czgxg" Oct 08 21:04:47 crc kubenswrapper[4669]: I1008 21:04:47.024843 4669 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2626e058-7115-4198-91ab-19e6f98dfc89-ovsdbserver-sb\") pod \"dnsmasq-dns-8c6f6df99-czgxg\" (UID: \"2626e058-7115-4198-91ab-19e6f98dfc89\") " pod="openstack/dnsmasq-dns-8c6f6df99-czgxg" Oct 08 21:04:47 crc kubenswrapper[4669]: I1008 21:04:47.024889 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2626e058-7115-4198-91ab-19e6f98dfc89-dns-swift-storage-0\") pod \"dnsmasq-dns-8c6f6df99-czgxg\" (UID: \"2626e058-7115-4198-91ab-19e6f98dfc89\") " pod="openstack/dnsmasq-dns-8c6f6df99-czgxg" Oct 08 21:04:47 crc kubenswrapper[4669]: I1008 21:04:47.024931 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2626e058-7115-4198-91ab-19e6f98dfc89-config\") pod \"dnsmasq-dns-8c6f6df99-czgxg\" (UID: \"2626e058-7115-4198-91ab-19e6f98dfc89\") " pod="openstack/dnsmasq-dns-8c6f6df99-czgxg" Oct 08 21:04:47 crc kubenswrapper[4669]: I1008 21:04:47.024957 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2626e058-7115-4198-91ab-19e6f98dfc89-openstack-edpm-ipam\") pod \"dnsmasq-dns-8c6f6df99-czgxg\" (UID: \"2626e058-7115-4198-91ab-19e6f98dfc89\") " pod="openstack/dnsmasq-dns-8c6f6df99-czgxg" Oct 08 21:04:47 crc kubenswrapper[4669]: I1008 21:04:47.024988 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rt9wq\" (UniqueName: \"kubernetes.io/projected/2626e058-7115-4198-91ab-19e6f98dfc89-kube-api-access-rt9wq\") pod \"dnsmasq-dns-8c6f6df99-czgxg\" (UID: \"2626e058-7115-4198-91ab-19e6f98dfc89\") " pod="openstack/dnsmasq-dns-8c6f6df99-czgxg" Oct 08 21:04:47 crc kubenswrapper[4669]: I1008 21:04:47.025511 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2626e058-7115-4198-91ab-19e6f98dfc89-dns-svc\") pod \"dnsmasq-dns-8c6f6df99-czgxg\" (UID: \"2626e058-7115-4198-91ab-19e6f98dfc89\") " pod="openstack/dnsmasq-dns-8c6f6df99-czgxg" Oct 08 21:04:47 crc kubenswrapper[4669]: I1008 21:04:47.025772 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2626e058-7115-4198-91ab-19e6f98dfc89-ovsdbserver-nb\") pod \"dnsmasq-dns-8c6f6df99-czgxg\" (UID: \"2626e058-7115-4198-91ab-19e6f98dfc89\") " pod="openstack/dnsmasq-dns-8c6f6df99-czgxg" Oct 08 21:04:47 crc kubenswrapper[4669]: I1008 21:04:47.025771 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2626e058-7115-4198-91ab-19e6f98dfc89-openstack-edpm-ipam\") pod \"dnsmasq-dns-8c6f6df99-czgxg\" (UID: \"2626e058-7115-4198-91ab-19e6f98dfc89\") " pod="openstack/dnsmasq-dns-8c6f6df99-czgxg" Oct 08 21:04:47 crc kubenswrapper[4669]: I1008 21:04:47.025925 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2626e058-7115-4198-91ab-19e6f98dfc89-config\") pod \"dnsmasq-dns-8c6f6df99-czgxg\" (UID: \"2626e058-7115-4198-91ab-19e6f98dfc89\") " pod="openstack/dnsmasq-dns-8c6f6df99-czgxg" Oct 08 21:04:47 crc kubenswrapper[4669]: I1008 21:04:47.026097 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2626e058-7115-4198-91ab-19e6f98dfc89-ovsdbserver-sb\") pod \"dnsmasq-dns-8c6f6df99-czgxg\" (UID: \"2626e058-7115-4198-91ab-19e6f98dfc89\") " pod="openstack/dnsmasq-dns-8c6f6df99-czgxg" Oct 08 21:04:47 crc kubenswrapper[4669]: I1008 21:04:47.027139 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2626e058-7115-4198-91ab-19e6f98dfc89-dns-swift-storage-0\") 
pod \"dnsmasq-dns-8c6f6df99-czgxg\" (UID: \"2626e058-7115-4198-91ab-19e6f98dfc89\") " pod="openstack/dnsmasq-dns-8c6f6df99-czgxg" Oct 08 21:04:47 crc kubenswrapper[4669]: I1008 21:04:47.048578 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt9wq\" (UniqueName: \"kubernetes.io/projected/2626e058-7115-4198-91ab-19e6f98dfc89-kube-api-access-rt9wq\") pod \"dnsmasq-dns-8c6f6df99-czgxg\" (UID: \"2626e058-7115-4198-91ab-19e6f98dfc89\") " pod="openstack/dnsmasq-dns-8c6f6df99-czgxg" Oct 08 21:04:47 crc kubenswrapper[4669]: I1008 21:04:47.118065 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8c6f6df99-czgxg" Oct 08 21:04:47 crc kubenswrapper[4669]: I1008 21:04:47.316391 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-vtsv5" Oct 08 21:04:47 crc kubenswrapper[4669]: I1008 21:04:47.495035 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/052a11d2-f410-40af-a1e8-a61ba3203811-config\") pod \"052a11d2-f410-40af-a1e8-a61ba3203811\" (UID: \"052a11d2-f410-40af-a1e8-a61ba3203811\") " Oct 08 21:04:47 crc kubenswrapper[4669]: I1008 21:04:47.495075 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9gwk\" (UniqueName: \"kubernetes.io/projected/052a11d2-f410-40af-a1e8-a61ba3203811-kube-api-access-b9gwk\") pod \"052a11d2-f410-40af-a1e8-a61ba3203811\" (UID: \"052a11d2-f410-40af-a1e8-a61ba3203811\") " Oct 08 21:04:47 crc kubenswrapper[4669]: I1008 21:04:47.495103 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/052a11d2-f410-40af-a1e8-a61ba3203811-ovsdbserver-sb\") pod \"052a11d2-f410-40af-a1e8-a61ba3203811\" (UID: \"052a11d2-f410-40af-a1e8-a61ba3203811\") " Oct 08 21:04:47 crc kubenswrapper[4669]: 
I1008 21:04:47.495850 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/052a11d2-f410-40af-a1e8-a61ba3203811-dns-swift-storage-0\") pod \"052a11d2-f410-40af-a1e8-a61ba3203811\" (UID: \"052a11d2-f410-40af-a1e8-a61ba3203811\") " Oct 08 21:04:47 crc kubenswrapper[4669]: I1008 21:04:47.495911 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/052a11d2-f410-40af-a1e8-a61ba3203811-dns-svc\") pod \"052a11d2-f410-40af-a1e8-a61ba3203811\" (UID: \"052a11d2-f410-40af-a1e8-a61ba3203811\") " Oct 08 21:04:47 crc kubenswrapper[4669]: I1008 21:04:47.495935 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/052a11d2-f410-40af-a1e8-a61ba3203811-ovsdbserver-nb\") pod \"052a11d2-f410-40af-a1e8-a61ba3203811\" (UID: \"052a11d2-f410-40af-a1e8-a61ba3203811\") " Oct 08 21:04:47 crc kubenswrapper[4669]: I1008 21:04:47.502157 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/052a11d2-f410-40af-a1e8-a61ba3203811-kube-api-access-b9gwk" (OuterVolumeSpecName: "kube-api-access-b9gwk") pod "052a11d2-f410-40af-a1e8-a61ba3203811" (UID: "052a11d2-f410-40af-a1e8-a61ba3203811"). InnerVolumeSpecName "kube-api-access-b9gwk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:04:47 crc kubenswrapper[4669]: I1008 21:04:47.551112 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/052a11d2-f410-40af-a1e8-a61ba3203811-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "052a11d2-f410-40af-a1e8-a61ba3203811" (UID: "052a11d2-f410-40af-a1e8-a61ba3203811"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:04:47 crc kubenswrapper[4669]: I1008 21:04:47.553345 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/052a11d2-f410-40af-a1e8-a61ba3203811-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "052a11d2-f410-40af-a1e8-a61ba3203811" (UID: "052a11d2-f410-40af-a1e8-a61ba3203811"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:04:47 crc kubenswrapper[4669]: I1008 21:04:47.556015 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/052a11d2-f410-40af-a1e8-a61ba3203811-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "052a11d2-f410-40af-a1e8-a61ba3203811" (UID: "052a11d2-f410-40af-a1e8-a61ba3203811"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:04:47 crc kubenswrapper[4669]: I1008 21:04:47.570726 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/052a11d2-f410-40af-a1e8-a61ba3203811-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "052a11d2-f410-40af-a1e8-a61ba3203811" (UID: "052a11d2-f410-40af-a1e8-a61ba3203811"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:04:47 crc kubenswrapper[4669]: I1008 21:04:47.576091 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/052a11d2-f410-40af-a1e8-a61ba3203811-config" (OuterVolumeSpecName: "config") pod "052a11d2-f410-40af-a1e8-a61ba3203811" (UID: "052a11d2-f410-40af-a1e8-a61ba3203811"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:04:47 crc kubenswrapper[4669]: I1008 21:04:47.598125 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/052a11d2-f410-40af-a1e8-a61ba3203811-config\") on node \"crc\" DevicePath \"\"" Oct 08 21:04:47 crc kubenswrapper[4669]: I1008 21:04:47.598153 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9gwk\" (UniqueName: \"kubernetes.io/projected/052a11d2-f410-40af-a1e8-a61ba3203811-kube-api-access-b9gwk\") on node \"crc\" DevicePath \"\"" Oct 08 21:04:47 crc kubenswrapper[4669]: I1008 21:04:47.598165 4669 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/052a11d2-f410-40af-a1e8-a61ba3203811-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 21:04:47 crc kubenswrapper[4669]: I1008 21:04:47.598175 4669 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/052a11d2-f410-40af-a1e8-a61ba3203811-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 08 21:04:47 crc kubenswrapper[4669]: I1008 21:04:47.598186 4669 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/052a11d2-f410-40af-a1e8-a61ba3203811-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 21:04:47 crc kubenswrapper[4669]: I1008 21:04:47.598196 4669 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/052a11d2-f410-40af-a1e8-a61ba3203811-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 21:04:47 crc kubenswrapper[4669]: I1008 21:04:47.715374 4669 generic.go:334] "Generic (PLEG): container finished" podID="052a11d2-f410-40af-a1e8-a61ba3203811" containerID="7f488c53f2ccdd32c5afc8349ffe5b921ff99e2bd3e64e510d07a92b55b65afe" exitCode=0 Oct 08 21:04:47 crc kubenswrapper[4669]: I1008 21:04:47.715418 4669 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-vtsv5" event={"ID":"052a11d2-f410-40af-a1e8-a61ba3203811","Type":"ContainerDied","Data":"7f488c53f2ccdd32c5afc8349ffe5b921ff99e2bd3e64e510d07a92b55b65afe"} Oct 08 21:04:47 crc kubenswrapper[4669]: I1008 21:04:47.715447 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-vtsv5" event={"ID":"052a11d2-f410-40af-a1e8-a61ba3203811","Type":"ContainerDied","Data":"4a4036d6c2065510b1a323407764d7279a1180bc1970d7fde7a4b19627229a24"} Oct 08 21:04:47 crc kubenswrapper[4669]: I1008 21:04:47.715471 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-vtsv5" Oct 08 21:04:47 crc kubenswrapper[4669]: I1008 21:04:47.715475 4669 scope.go:117] "RemoveContainer" containerID="7f488c53f2ccdd32c5afc8349ffe5b921ff99e2bd3e64e510d07a92b55b65afe" Oct 08 21:04:47 crc kubenswrapper[4669]: I1008 21:04:47.755700 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-vtsv5"] Oct 08 21:04:47 crc kubenswrapper[4669]: I1008 21:04:47.764945 4669 scope.go:117] "RemoveContainer" containerID="a685143e6189efb0633803e8231b2dce66a98f35913917e460885a85175ddda4" Oct 08 21:04:47 crc kubenswrapper[4669]: I1008 21:04:47.766648 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-vtsv5"] Oct 08 21:04:47 crc kubenswrapper[4669]: I1008 21:04:47.821433 4669 scope.go:117] "RemoveContainer" containerID="7f488c53f2ccdd32c5afc8349ffe5b921ff99e2bd3e64e510d07a92b55b65afe" Oct 08 21:04:47 crc kubenswrapper[4669]: E1008 21:04:47.826158 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f488c53f2ccdd32c5afc8349ffe5b921ff99e2bd3e64e510d07a92b55b65afe\": container with ID starting with 7f488c53f2ccdd32c5afc8349ffe5b921ff99e2bd3e64e510d07a92b55b65afe not found: ID does not exist" 
containerID="7f488c53f2ccdd32c5afc8349ffe5b921ff99e2bd3e64e510d07a92b55b65afe" Oct 08 21:04:47 crc kubenswrapper[4669]: I1008 21:04:47.826226 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f488c53f2ccdd32c5afc8349ffe5b921ff99e2bd3e64e510d07a92b55b65afe"} err="failed to get container status \"7f488c53f2ccdd32c5afc8349ffe5b921ff99e2bd3e64e510d07a92b55b65afe\": rpc error: code = NotFound desc = could not find container \"7f488c53f2ccdd32c5afc8349ffe5b921ff99e2bd3e64e510d07a92b55b65afe\": container with ID starting with 7f488c53f2ccdd32c5afc8349ffe5b921ff99e2bd3e64e510d07a92b55b65afe not found: ID does not exist" Oct 08 21:04:47 crc kubenswrapper[4669]: I1008 21:04:47.826270 4669 scope.go:117] "RemoveContainer" containerID="a685143e6189efb0633803e8231b2dce66a98f35913917e460885a85175ddda4" Oct 08 21:04:47 crc kubenswrapper[4669]: E1008 21:04:47.826810 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a685143e6189efb0633803e8231b2dce66a98f35913917e460885a85175ddda4\": container with ID starting with a685143e6189efb0633803e8231b2dce66a98f35913917e460885a85175ddda4 not found: ID does not exist" containerID="a685143e6189efb0633803e8231b2dce66a98f35913917e460885a85175ddda4" Oct 08 21:04:47 crc kubenswrapper[4669]: I1008 21:04:47.826869 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a685143e6189efb0633803e8231b2dce66a98f35913917e460885a85175ddda4"} err="failed to get container status \"a685143e6189efb0633803e8231b2dce66a98f35913917e460885a85175ddda4\": rpc error: code = NotFound desc = could not find container \"a685143e6189efb0633803e8231b2dce66a98f35913917e460885a85175ddda4\": container with ID starting with a685143e6189efb0633803e8231b2dce66a98f35913917e460885a85175ddda4 not found: ID does not exist" Oct 08 21:04:47 crc kubenswrapper[4669]: I1008 21:04:47.946053 4669 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8c6f6df99-czgxg"] Oct 08 21:04:48 crc kubenswrapper[4669]: I1008 21:04:48.730409 4669 generic.go:334] "Generic (PLEG): container finished" podID="2626e058-7115-4198-91ab-19e6f98dfc89" containerID="1e3b2c5b33477dd9d71273a58a9897513c73443203cc95ee15003129efbd64fd" exitCode=0 Oct 08 21:04:48 crc kubenswrapper[4669]: I1008 21:04:48.730575 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8c6f6df99-czgxg" event={"ID":"2626e058-7115-4198-91ab-19e6f98dfc89","Type":"ContainerDied","Data":"1e3b2c5b33477dd9d71273a58a9897513c73443203cc95ee15003129efbd64fd"} Oct 08 21:04:48 crc kubenswrapper[4669]: I1008 21:04:48.730819 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8c6f6df99-czgxg" event={"ID":"2626e058-7115-4198-91ab-19e6f98dfc89","Type":"ContainerStarted","Data":"1fcaea4f26ddec75d5032fa56c6ea7734ed9835e494a33e6a97d1ec63623fa11"} Oct 08 21:04:49 crc kubenswrapper[4669]: I1008 21:04:49.353295 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="052a11d2-f410-40af-a1e8-a61ba3203811" path="/var/lib/kubelet/pods/052a11d2-f410-40af-a1e8-a61ba3203811/volumes" Oct 08 21:04:49 crc kubenswrapper[4669]: I1008 21:04:49.752373 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8c6f6df99-czgxg" event={"ID":"2626e058-7115-4198-91ab-19e6f98dfc89","Type":"ContainerStarted","Data":"eef83e3c5138eef38cc86094de8b6cdbaa44ef831d49c34f253c82ed69792229"} Oct 08 21:04:49 crc kubenswrapper[4669]: I1008 21:04:49.752615 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8c6f6df99-czgxg" Oct 08 21:04:49 crc kubenswrapper[4669]: I1008 21:04:49.779431 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8c6f6df99-czgxg" podStartSLOduration=3.779410294 podStartE2EDuration="3.779410294s" podCreationTimestamp="2025-10-08 21:04:46 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:04:49.775324822 +0000 UTC m=+1209.468135535" watchObservedRunningTime="2025-10-08 21:04:49.779410294 +0000 UTC m=+1209.472220977" Oct 08 21:04:51 crc kubenswrapper[4669]: I1008 21:04:51.890789 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5c7b6c5df9-vtsv5" podUID="052a11d2-f410-40af-a1e8-a61ba3203811" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.202:5353: i/o timeout" Oct 08 21:04:57 crc kubenswrapper[4669]: I1008 21:04:57.122811 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8c6f6df99-czgxg" Oct 08 21:04:57 crc kubenswrapper[4669]: I1008 21:04:57.183473 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-ctfws"] Oct 08 21:04:57 crc kubenswrapper[4669]: I1008 21:04:57.183709 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5576978c7c-ctfws" podUID="bcd07ca1-05cc-45d9-b173-864dedb0f6bd" containerName="dnsmasq-dns" containerID="cri-o://cc6d016e6f6a40632d7e0e55f4c078368d8a1ea771bae810fa75d044e36c8519" gracePeriod=10 Oct 08 21:04:57 crc kubenswrapper[4669]: I1008 21:04:57.788046 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-ctfws" Oct 08 21:04:57 crc kubenswrapper[4669]: I1008 21:04:57.839306 4669 generic.go:334] "Generic (PLEG): container finished" podID="bcd07ca1-05cc-45d9-b173-864dedb0f6bd" containerID="cc6d016e6f6a40632d7e0e55f4c078368d8a1ea771bae810fa75d044e36c8519" exitCode=0 Oct 08 21:04:57 crc kubenswrapper[4669]: I1008 21:04:57.839361 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-ctfws" event={"ID":"bcd07ca1-05cc-45d9-b173-864dedb0f6bd","Type":"ContainerDied","Data":"cc6d016e6f6a40632d7e0e55f4c078368d8a1ea771bae810fa75d044e36c8519"} Oct 08 21:04:57 crc kubenswrapper[4669]: I1008 21:04:57.839397 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-ctfws" event={"ID":"bcd07ca1-05cc-45d9-b173-864dedb0f6bd","Type":"ContainerDied","Data":"8af645e4ba3c961cc05c16d6580dfff50c15fbbb747ecd64339fad16eb9394e1"} Oct 08 21:04:57 crc kubenswrapper[4669]: I1008 21:04:57.839418 4669 scope.go:117] "RemoveContainer" containerID="cc6d016e6f6a40632d7e0e55f4c078368d8a1ea771bae810fa75d044e36c8519" Oct 08 21:04:57 crc kubenswrapper[4669]: I1008 21:04:57.839350 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-ctfws" Oct 08 21:04:57 crc kubenswrapper[4669]: I1008 21:04:57.895283 4669 scope.go:117] "RemoveContainer" containerID="8fd9a8d33cef85ebfa3672a30a8bf76d3990febbd778549ffae5ebe2eb57b504" Oct 08 21:04:57 crc kubenswrapper[4669]: I1008 21:04:57.916509 4669 scope.go:117] "RemoveContainer" containerID="cc6d016e6f6a40632d7e0e55f4c078368d8a1ea771bae810fa75d044e36c8519" Oct 08 21:04:57 crc kubenswrapper[4669]: E1008 21:04:57.917305 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc6d016e6f6a40632d7e0e55f4c078368d8a1ea771bae810fa75d044e36c8519\": container with ID starting with cc6d016e6f6a40632d7e0e55f4c078368d8a1ea771bae810fa75d044e36c8519 not found: ID does not exist" containerID="cc6d016e6f6a40632d7e0e55f4c078368d8a1ea771bae810fa75d044e36c8519" Oct 08 21:04:57 crc kubenswrapper[4669]: I1008 21:04:57.917363 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc6d016e6f6a40632d7e0e55f4c078368d8a1ea771bae810fa75d044e36c8519"} err="failed to get container status \"cc6d016e6f6a40632d7e0e55f4c078368d8a1ea771bae810fa75d044e36c8519\": rpc error: code = NotFound desc = could not find container \"cc6d016e6f6a40632d7e0e55f4c078368d8a1ea771bae810fa75d044e36c8519\": container with ID starting with cc6d016e6f6a40632d7e0e55f4c078368d8a1ea771bae810fa75d044e36c8519 not found: ID does not exist" Oct 08 21:04:57 crc kubenswrapper[4669]: I1008 21:04:57.917399 4669 scope.go:117] "RemoveContainer" containerID="8fd9a8d33cef85ebfa3672a30a8bf76d3990febbd778549ffae5ebe2eb57b504" Oct 08 21:04:57 crc kubenswrapper[4669]: E1008 21:04:57.917866 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fd9a8d33cef85ebfa3672a30a8bf76d3990febbd778549ffae5ebe2eb57b504\": container with ID starting with 
8fd9a8d33cef85ebfa3672a30a8bf76d3990febbd778549ffae5ebe2eb57b504 not found: ID does not exist" containerID="8fd9a8d33cef85ebfa3672a30a8bf76d3990febbd778549ffae5ebe2eb57b504" Oct 08 21:04:57 crc kubenswrapper[4669]: I1008 21:04:57.917899 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fd9a8d33cef85ebfa3672a30a8bf76d3990febbd778549ffae5ebe2eb57b504"} err="failed to get container status \"8fd9a8d33cef85ebfa3672a30a8bf76d3990febbd778549ffae5ebe2eb57b504\": rpc error: code = NotFound desc = could not find container \"8fd9a8d33cef85ebfa3672a30a8bf76d3990febbd778549ffae5ebe2eb57b504\": container with ID starting with 8fd9a8d33cef85ebfa3672a30a8bf76d3990febbd778549ffae5ebe2eb57b504 not found: ID does not exist" Oct 08 21:04:57 crc kubenswrapper[4669]: I1008 21:04:57.953302 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bcd07ca1-05cc-45d9-b173-864dedb0f6bd-dns-svc\") pod \"bcd07ca1-05cc-45d9-b173-864dedb0f6bd\" (UID: \"bcd07ca1-05cc-45d9-b173-864dedb0f6bd\") " Oct 08 21:04:57 crc kubenswrapper[4669]: I1008 21:04:57.954457 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bcd07ca1-05cc-45d9-b173-864dedb0f6bd-dns-swift-storage-0\") pod \"bcd07ca1-05cc-45d9-b173-864dedb0f6bd\" (UID: \"bcd07ca1-05cc-45d9-b173-864dedb0f6bd\") " Oct 08 21:04:57 crc kubenswrapper[4669]: I1008 21:04:57.954796 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cxcm\" (UniqueName: \"kubernetes.io/projected/bcd07ca1-05cc-45d9-b173-864dedb0f6bd-kube-api-access-4cxcm\") pod \"bcd07ca1-05cc-45d9-b173-864dedb0f6bd\" (UID: \"bcd07ca1-05cc-45d9-b173-864dedb0f6bd\") " Oct 08 21:04:57 crc kubenswrapper[4669]: I1008 21:04:57.954970 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/bcd07ca1-05cc-45d9-b173-864dedb0f6bd-config\") pod \"bcd07ca1-05cc-45d9-b173-864dedb0f6bd\" (UID: \"bcd07ca1-05cc-45d9-b173-864dedb0f6bd\") " Oct 08 21:04:57 crc kubenswrapper[4669]: I1008 21:04:57.955072 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bcd07ca1-05cc-45d9-b173-864dedb0f6bd-ovsdbserver-nb\") pod \"bcd07ca1-05cc-45d9-b173-864dedb0f6bd\" (UID: \"bcd07ca1-05cc-45d9-b173-864dedb0f6bd\") " Oct 08 21:04:57 crc kubenswrapper[4669]: I1008 21:04:57.955168 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/bcd07ca1-05cc-45d9-b173-864dedb0f6bd-openstack-edpm-ipam\") pod \"bcd07ca1-05cc-45d9-b173-864dedb0f6bd\" (UID: \"bcd07ca1-05cc-45d9-b173-864dedb0f6bd\") " Oct 08 21:04:57 crc kubenswrapper[4669]: I1008 21:04:57.955292 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bcd07ca1-05cc-45d9-b173-864dedb0f6bd-ovsdbserver-sb\") pod \"bcd07ca1-05cc-45d9-b173-864dedb0f6bd\" (UID: \"bcd07ca1-05cc-45d9-b173-864dedb0f6bd\") " Oct 08 21:04:57 crc kubenswrapper[4669]: I1008 21:04:57.960014 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcd07ca1-05cc-45d9-b173-864dedb0f6bd-kube-api-access-4cxcm" (OuterVolumeSpecName: "kube-api-access-4cxcm") pod "bcd07ca1-05cc-45d9-b173-864dedb0f6bd" (UID: "bcd07ca1-05cc-45d9-b173-864dedb0f6bd"). InnerVolumeSpecName "kube-api-access-4cxcm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:04:58 crc kubenswrapper[4669]: I1008 21:04:58.010524 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcd07ca1-05cc-45d9-b173-864dedb0f6bd-config" (OuterVolumeSpecName: "config") pod "bcd07ca1-05cc-45d9-b173-864dedb0f6bd" (UID: "bcd07ca1-05cc-45d9-b173-864dedb0f6bd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:04:58 crc kubenswrapper[4669]: I1008 21:04:58.012676 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcd07ca1-05cc-45d9-b173-864dedb0f6bd-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "bcd07ca1-05cc-45d9-b173-864dedb0f6bd" (UID: "bcd07ca1-05cc-45d9-b173-864dedb0f6bd"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:04:58 crc kubenswrapper[4669]: I1008 21:04:58.016604 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcd07ca1-05cc-45d9-b173-864dedb0f6bd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bcd07ca1-05cc-45d9-b173-864dedb0f6bd" (UID: "bcd07ca1-05cc-45d9-b173-864dedb0f6bd"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:04:58 crc kubenswrapper[4669]: I1008 21:04:58.017542 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcd07ca1-05cc-45d9-b173-864dedb0f6bd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bcd07ca1-05cc-45d9-b173-864dedb0f6bd" (UID: "bcd07ca1-05cc-45d9-b173-864dedb0f6bd"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:04:58 crc kubenswrapper[4669]: I1008 21:04:58.036350 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcd07ca1-05cc-45d9-b173-864dedb0f6bd-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "bcd07ca1-05cc-45d9-b173-864dedb0f6bd" (UID: "bcd07ca1-05cc-45d9-b173-864dedb0f6bd"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:04:58 crc kubenswrapper[4669]: I1008 21:04:58.052240 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcd07ca1-05cc-45d9-b173-864dedb0f6bd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bcd07ca1-05cc-45d9-b173-864dedb0f6bd" (UID: "bcd07ca1-05cc-45d9-b173-864dedb0f6bd"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:04:58 crc kubenswrapper[4669]: I1008 21:04:58.057509 4669 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bcd07ca1-05cc-45d9-b173-864dedb0f6bd-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 21:04:58 crc kubenswrapper[4669]: I1008 21:04:58.057555 4669 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bcd07ca1-05cc-45d9-b173-864dedb0f6bd-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 08 21:04:58 crc kubenswrapper[4669]: I1008 21:04:58.057572 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cxcm\" (UniqueName: \"kubernetes.io/projected/bcd07ca1-05cc-45d9-b173-864dedb0f6bd-kube-api-access-4cxcm\") on node \"crc\" DevicePath \"\"" Oct 08 21:04:58 crc kubenswrapper[4669]: I1008 21:04:58.057584 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcd07ca1-05cc-45d9-b173-864dedb0f6bd-config\") on node 
\"crc\" DevicePath \"\"" Oct 08 21:04:58 crc kubenswrapper[4669]: I1008 21:04:58.057595 4669 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bcd07ca1-05cc-45d9-b173-864dedb0f6bd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 21:04:58 crc kubenswrapper[4669]: I1008 21:04:58.057606 4669 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/bcd07ca1-05cc-45d9-b173-864dedb0f6bd-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 08 21:04:58 crc kubenswrapper[4669]: I1008 21:04:58.057618 4669 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bcd07ca1-05cc-45d9-b173-864dedb0f6bd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 21:04:58 crc kubenswrapper[4669]: I1008 21:04:58.176830 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-ctfws"] Oct 08 21:04:58 crc kubenswrapper[4669]: I1008 21:04:58.185847 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-ctfws"] Oct 08 21:04:59 crc kubenswrapper[4669]: I1008 21:04:59.349789 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcd07ca1-05cc-45d9-b173-864dedb0f6bd" path="/var/lib/kubelet/pods/bcd07ca1-05cc-45d9-b173-864dedb0f6bd/volumes" Oct 08 21:05:07 crc kubenswrapper[4669]: I1008 21:05:07.941323 4669 generic.go:334] "Generic (PLEG): container finished" podID="0bfeeb02-715e-4358-802c-ce7ed6721a30" containerID="8dd67041eccd39e6c9d7389e40108803469b81b881a6c2f349674dc7694cc7f1" exitCode=0 Oct 08 21:05:07 crc kubenswrapper[4669]: I1008 21:05:07.941428 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0bfeeb02-715e-4358-802c-ce7ed6721a30","Type":"ContainerDied","Data":"8dd67041eccd39e6c9d7389e40108803469b81b881a6c2f349674dc7694cc7f1"} Oct 08 21:05:08 crc 
kubenswrapper[4669]: I1008 21:05:08.957231 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0bfeeb02-715e-4358-802c-ce7ed6721a30","Type":"ContainerStarted","Data":"9a42be60971540cb3cdcc0bd777b57352f8615c979525dd18f66381669e04b6e"} Oct 08 21:05:08 crc kubenswrapper[4669]: I1008 21:05:08.958116 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 08 21:05:08 crc kubenswrapper[4669]: I1008 21:05:08.987586 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.98756211 podStartE2EDuration="36.98756211s" podCreationTimestamp="2025-10-08 21:04:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:05:08.981376301 +0000 UTC m=+1228.674186984" watchObservedRunningTime="2025-10-08 21:05:08.98756211 +0000 UTC m=+1228.680372783" Oct 08 21:05:09 crc kubenswrapper[4669]: I1008 21:05:09.967794 4669 generic.go:334] "Generic (PLEG): container finished" podID="8e35f189-cd14-4892-a4d6-25a23a2ae04c" containerID="73ab5dd69026545de8539c60efb04267461aab73798217b184138a48ebe9a17e" exitCode=0 Oct 08 21:05:09 crc kubenswrapper[4669]: I1008 21:05:09.967910 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8e35f189-cd14-4892-a4d6-25a23a2ae04c","Type":"ContainerDied","Data":"73ab5dd69026545de8539c60efb04267461aab73798217b184138a48ebe9a17e"} Oct 08 21:05:10 crc kubenswrapper[4669]: I1008 21:05:10.498768 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n4czl"] Oct 08 21:05:10 crc kubenswrapper[4669]: E1008 21:05:10.500275 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcd07ca1-05cc-45d9-b173-864dedb0f6bd" containerName="dnsmasq-dns" Oct 08 21:05:10 crc kubenswrapper[4669]: 
I1008 21:05:10.500294 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcd07ca1-05cc-45d9-b173-864dedb0f6bd" containerName="dnsmasq-dns" Oct 08 21:05:10 crc kubenswrapper[4669]: E1008 21:05:10.500324 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="052a11d2-f410-40af-a1e8-a61ba3203811" containerName="init" Oct 08 21:05:10 crc kubenswrapper[4669]: I1008 21:05:10.500330 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="052a11d2-f410-40af-a1e8-a61ba3203811" containerName="init" Oct 08 21:05:10 crc kubenswrapper[4669]: E1008 21:05:10.500361 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="052a11d2-f410-40af-a1e8-a61ba3203811" containerName="dnsmasq-dns" Oct 08 21:05:10 crc kubenswrapper[4669]: I1008 21:05:10.500369 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="052a11d2-f410-40af-a1e8-a61ba3203811" containerName="dnsmasq-dns" Oct 08 21:05:10 crc kubenswrapper[4669]: E1008 21:05:10.500393 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcd07ca1-05cc-45d9-b173-864dedb0f6bd" containerName="init" Oct 08 21:05:10 crc kubenswrapper[4669]: I1008 21:05:10.500399 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcd07ca1-05cc-45d9-b173-864dedb0f6bd" containerName="init" Oct 08 21:05:10 crc kubenswrapper[4669]: I1008 21:05:10.501326 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="052a11d2-f410-40af-a1e8-a61ba3203811" containerName="dnsmasq-dns" Oct 08 21:05:10 crc kubenswrapper[4669]: I1008 21:05:10.501370 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcd07ca1-05cc-45d9-b173-864dedb0f6bd" containerName="dnsmasq-dns" Oct 08 21:05:10 crc kubenswrapper[4669]: I1008 21:05:10.502390 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n4czl" Oct 08 21:05:10 crc kubenswrapper[4669]: I1008 21:05:10.525470 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 08 21:05:10 crc kubenswrapper[4669]: I1008 21:05:10.525610 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9d8p9" Oct 08 21:05:10 crc kubenswrapper[4669]: I1008 21:05:10.525834 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 08 21:05:10 crc kubenswrapper[4669]: I1008 21:05:10.525990 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 08 21:05:10 crc kubenswrapper[4669]: I1008 21:05:10.545593 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n4czl"] Oct 08 21:05:10 crc kubenswrapper[4669]: I1008 21:05:10.596264 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smhbz\" (UniqueName: \"kubernetes.io/projected/0bff2321-3d96-47bf-815e-7ab3cea9563a-kube-api-access-smhbz\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n4czl\" (UID: \"0bff2321-3d96-47bf-815e-7ab3cea9563a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n4czl" Oct 08 21:05:10 crc kubenswrapper[4669]: I1008 21:05:10.596443 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0bff2321-3d96-47bf-815e-7ab3cea9563a-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n4czl\" (UID: \"0bff2321-3d96-47bf-815e-7ab3cea9563a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n4czl" Oct 08 21:05:10 crc kubenswrapper[4669]: I1008 21:05:10.596501 4669 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0bff2321-3d96-47bf-815e-7ab3cea9563a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n4czl\" (UID: \"0bff2321-3d96-47bf-815e-7ab3cea9563a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n4czl" Oct 08 21:05:10 crc kubenswrapper[4669]: I1008 21:05:10.596545 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bff2321-3d96-47bf-815e-7ab3cea9563a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n4czl\" (UID: \"0bff2321-3d96-47bf-815e-7ab3cea9563a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n4czl" Oct 08 21:05:10 crc kubenswrapper[4669]: I1008 21:05:10.697910 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0bff2321-3d96-47bf-815e-7ab3cea9563a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n4czl\" (UID: \"0bff2321-3d96-47bf-815e-7ab3cea9563a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n4czl" Oct 08 21:05:10 crc kubenswrapper[4669]: I1008 21:05:10.697957 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bff2321-3d96-47bf-815e-7ab3cea9563a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n4czl\" (UID: \"0bff2321-3d96-47bf-815e-7ab3cea9563a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n4czl" Oct 08 21:05:10 crc kubenswrapper[4669]: I1008 21:05:10.698045 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smhbz\" (UniqueName: 
\"kubernetes.io/projected/0bff2321-3d96-47bf-815e-7ab3cea9563a-kube-api-access-smhbz\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n4czl\" (UID: \"0bff2321-3d96-47bf-815e-7ab3cea9563a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n4czl" Oct 08 21:05:10 crc kubenswrapper[4669]: I1008 21:05:10.698122 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0bff2321-3d96-47bf-815e-7ab3cea9563a-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n4czl\" (UID: \"0bff2321-3d96-47bf-815e-7ab3cea9563a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n4czl" Oct 08 21:05:10 crc kubenswrapper[4669]: I1008 21:05:10.702997 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bff2321-3d96-47bf-815e-7ab3cea9563a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n4czl\" (UID: \"0bff2321-3d96-47bf-815e-7ab3cea9563a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n4czl" Oct 08 21:05:10 crc kubenswrapper[4669]: I1008 21:05:10.703069 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0bff2321-3d96-47bf-815e-7ab3cea9563a-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n4czl\" (UID: \"0bff2321-3d96-47bf-815e-7ab3cea9563a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n4czl" Oct 08 21:05:10 crc kubenswrapper[4669]: I1008 21:05:10.714643 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0bff2321-3d96-47bf-815e-7ab3cea9563a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n4czl\" (UID: \"0bff2321-3d96-47bf-815e-7ab3cea9563a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n4czl" Oct 08 21:05:10 crc 
kubenswrapper[4669]: I1008 21:05:10.715144 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smhbz\" (UniqueName: \"kubernetes.io/projected/0bff2321-3d96-47bf-815e-7ab3cea9563a-kube-api-access-smhbz\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n4czl\" (UID: \"0bff2321-3d96-47bf-815e-7ab3cea9563a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n4czl" Oct 08 21:05:10 crc kubenswrapper[4669]: I1008 21:05:10.852289 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n4czl" Oct 08 21:05:10 crc kubenswrapper[4669]: I1008 21:05:10.981483 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8e35f189-cd14-4892-a4d6-25a23a2ae04c","Type":"ContainerStarted","Data":"f82aa2d9eb6bf7d3262d51e866267e2d702b2df5c3d36fc9f766451fb2010d50"} Oct 08 21:05:10 crc kubenswrapper[4669]: I1008 21:05:10.983398 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 08 21:05:11 crc kubenswrapper[4669]: I1008 21:05:11.014562 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.014517192 podStartE2EDuration="38.014517192s" podCreationTimestamp="2025-10-08 21:04:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:05:11.01004783 +0000 UTC m=+1230.702858513" watchObservedRunningTime="2025-10-08 21:05:11.014517192 +0000 UTC m=+1230.707327865" Oct 08 21:05:11 crc kubenswrapper[4669]: W1008 21:05:11.430753 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0bff2321_3d96_47bf_815e_7ab3cea9563a.slice/crio-9c4dd992b0566246e6390da81d8a3ff2dafe7e5a6774b7653c72e22f38155e7e 
WatchSource:0}: Error finding container 9c4dd992b0566246e6390da81d8a3ff2dafe7e5a6774b7653c72e22f38155e7e: Status 404 returned error can't find the container with id 9c4dd992b0566246e6390da81d8a3ff2dafe7e5a6774b7653c72e22f38155e7e Oct 08 21:05:11 crc kubenswrapper[4669]: I1008 21:05:11.431152 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n4czl"] Oct 08 21:05:11 crc kubenswrapper[4669]: I1008 21:05:11.434480 4669 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 21:05:11 crc kubenswrapper[4669]: I1008 21:05:11.995576 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n4czl" event={"ID":"0bff2321-3d96-47bf-815e-7ab3cea9563a","Type":"ContainerStarted","Data":"9c4dd992b0566246e6390da81d8a3ff2dafe7e5a6774b7653c72e22f38155e7e"} Oct 08 21:05:13 crc kubenswrapper[4669]: I1008 21:05:13.185674 4669 patch_prober.go:28] interesting pod/machine-config-daemon-hw2kf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 21:05:13 crc kubenswrapper[4669]: I1008 21:05:13.185910 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 21:05:21 crc kubenswrapper[4669]: I1008 21:05:21.127150 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n4czl" 
event={"ID":"0bff2321-3d96-47bf-815e-7ab3cea9563a","Type":"ContainerStarted","Data":"4d6bc29d34f292aeeff660a66c8ddd8ad05e3787341a9f4eb86c168e2d308686"} Oct 08 21:05:21 crc kubenswrapper[4669]: I1008 21:05:21.170603 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n4czl" podStartSLOduration=2.553636129 podStartE2EDuration="11.170576597s" podCreationTimestamp="2025-10-08 21:05:10 +0000 UTC" firstStartedPulling="2025-10-08 21:05:11.43414783 +0000 UTC m=+1231.126958503" lastFinishedPulling="2025-10-08 21:05:20.051088288 +0000 UTC m=+1239.743898971" observedRunningTime="2025-10-08 21:05:21.154018516 +0000 UTC m=+1240.846829249" watchObservedRunningTime="2025-10-08 21:05:21.170576597 +0000 UTC m=+1240.863387300" Oct 08 21:05:23 crc kubenswrapper[4669]: I1008 21:05:23.301782 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 08 21:05:24 crc kubenswrapper[4669]: I1008 21:05:24.258789 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 08 21:05:32 crc kubenswrapper[4669]: I1008 21:05:32.221682 4669 generic.go:334] "Generic (PLEG): container finished" podID="0bff2321-3d96-47bf-815e-7ab3cea9563a" containerID="4d6bc29d34f292aeeff660a66c8ddd8ad05e3787341a9f4eb86c168e2d308686" exitCode=0 Oct 08 21:05:32 crc kubenswrapper[4669]: I1008 21:05:32.221788 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n4czl" event={"ID":"0bff2321-3d96-47bf-815e-7ab3cea9563a","Type":"ContainerDied","Data":"4d6bc29d34f292aeeff660a66c8ddd8ad05e3787341a9f4eb86c168e2d308686"} Oct 08 21:05:33 crc kubenswrapper[4669]: I1008 21:05:33.633831 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n4czl" Oct 08 21:05:33 crc kubenswrapper[4669]: I1008 21:05:33.743981 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0bff2321-3d96-47bf-815e-7ab3cea9563a-ssh-key\") pod \"0bff2321-3d96-47bf-815e-7ab3cea9563a\" (UID: \"0bff2321-3d96-47bf-815e-7ab3cea9563a\") " Oct 08 21:05:33 crc kubenswrapper[4669]: I1008 21:05:33.744068 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0bff2321-3d96-47bf-815e-7ab3cea9563a-inventory\") pod \"0bff2321-3d96-47bf-815e-7ab3cea9563a\" (UID: \"0bff2321-3d96-47bf-815e-7ab3cea9563a\") " Oct 08 21:05:33 crc kubenswrapper[4669]: I1008 21:05:33.744200 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bff2321-3d96-47bf-815e-7ab3cea9563a-repo-setup-combined-ca-bundle\") pod \"0bff2321-3d96-47bf-815e-7ab3cea9563a\" (UID: \"0bff2321-3d96-47bf-815e-7ab3cea9563a\") " Oct 08 21:05:33 crc kubenswrapper[4669]: I1008 21:05:33.744262 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smhbz\" (UniqueName: \"kubernetes.io/projected/0bff2321-3d96-47bf-815e-7ab3cea9563a-kube-api-access-smhbz\") pod \"0bff2321-3d96-47bf-815e-7ab3cea9563a\" (UID: \"0bff2321-3d96-47bf-815e-7ab3cea9563a\") " Oct 08 21:05:33 crc kubenswrapper[4669]: I1008 21:05:33.752233 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bff2321-3d96-47bf-815e-7ab3cea9563a-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "0bff2321-3d96-47bf-815e-7ab3cea9563a" (UID: "0bff2321-3d96-47bf-815e-7ab3cea9563a"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:05:33 crc kubenswrapper[4669]: I1008 21:05:33.765030 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bff2321-3d96-47bf-815e-7ab3cea9563a-kube-api-access-smhbz" (OuterVolumeSpecName: "kube-api-access-smhbz") pod "0bff2321-3d96-47bf-815e-7ab3cea9563a" (UID: "0bff2321-3d96-47bf-815e-7ab3cea9563a"). InnerVolumeSpecName "kube-api-access-smhbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:05:33 crc kubenswrapper[4669]: I1008 21:05:33.779516 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bff2321-3d96-47bf-815e-7ab3cea9563a-inventory" (OuterVolumeSpecName: "inventory") pod "0bff2321-3d96-47bf-815e-7ab3cea9563a" (UID: "0bff2321-3d96-47bf-815e-7ab3cea9563a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:05:33 crc kubenswrapper[4669]: I1008 21:05:33.807734 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bff2321-3d96-47bf-815e-7ab3cea9563a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0bff2321-3d96-47bf-815e-7ab3cea9563a" (UID: "0bff2321-3d96-47bf-815e-7ab3cea9563a"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:05:33 crc kubenswrapper[4669]: I1008 21:05:33.846160 4669 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0bff2321-3d96-47bf-815e-7ab3cea9563a-inventory\") on node \"crc\" DevicePath \"\"" Oct 08 21:05:33 crc kubenswrapper[4669]: I1008 21:05:33.846205 4669 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bff2321-3d96-47bf-815e-7ab3cea9563a-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:05:33 crc kubenswrapper[4669]: I1008 21:05:33.846224 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smhbz\" (UniqueName: \"kubernetes.io/projected/0bff2321-3d96-47bf-815e-7ab3cea9563a-kube-api-access-smhbz\") on node \"crc\" DevicePath \"\"" Oct 08 21:05:33 crc kubenswrapper[4669]: I1008 21:05:33.846239 4669 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0bff2321-3d96-47bf-815e-7ab3cea9563a-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 08 21:05:34 crc kubenswrapper[4669]: I1008 21:05:34.239813 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n4czl" event={"ID":"0bff2321-3d96-47bf-815e-7ab3cea9563a","Type":"ContainerDied","Data":"9c4dd992b0566246e6390da81d8a3ff2dafe7e5a6774b7653c72e22f38155e7e"} Oct 08 21:05:34 crc kubenswrapper[4669]: I1008 21:05:34.239850 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c4dd992b0566246e6390da81d8a3ff2dafe7e5a6774b7653c72e22f38155e7e" Oct 08 21:05:34 crc kubenswrapper[4669]: I1008 21:05:34.239906 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n4czl" Oct 08 21:05:34 crc kubenswrapper[4669]: I1008 21:05:34.318096 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-b5rp4"] Oct 08 21:05:34 crc kubenswrapper[4669]: E1008 21:05:34.318573 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bff2321-3d96-47bf-815e-7ab3cea9563a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 08 21:05:34 crc kubenswrapper[4669]: I1008 21:05:34.318597 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bff2321-3d96-47bf-815e-7ab3cea9563a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 08 21:05:34 crc kubenswrapper[4669]: I1008 21:05:34.318841 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bff2321-3d96-47bf-815e-7ab3cea9563a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 08 21:05:34 crc kubenswrapper[4669]: I1008 21:05:34.319565 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-b5rp4" Oct 08 21:05:34 crc kubenswrapper[4669]: I1008 21:05:34.322579 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 08 21:05:34 crc kubenswrapper[4669]: I1008 21:05:34.322766 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 08 21:05:34 crc kubenswrapper[4669]: I1008 21:05:34.323710 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9d8p9" Oct 08 21:05:34 crc kubenswrapper[4669]: I1008 21:05:34.324191 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 08 21:05:34 crc kubenswrapper[4669]: I1008 21:05:34.335064 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-b5rp4"] Oct 08 21:05:34 crc kubenswrapper[4669]: I1008 21:05:34.355678 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/405955bf-c08c-4afd-9720-41adf4bebd19-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-b5rp4\" (UID: \"405955bf-c08c-4afd-9720-41adf4bebd19\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-b5rp4" Oct 08 21:05:34 crc kubenswrapper[4669]: I1008 21:05:34.355989 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/405955bf-c08c-4afd-9720-41adf4bebd19-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-b5rp4\" (UID: \"405955bf-c08c-4afd-9720-41adf4bebd19\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-b5rp4" Oct 08 21:05:34 crc kubenswrapper[4669]: I1008 21:05:34.356115 4669 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zq24\" (UniqueName: \"kubernetes.io/projected/405955bf-c08c-4afd-9720-41adf4bebd19-kube-api-access-7zq24\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-b5rp4\" (UID: \"405955bf-c08c-4afd-9720-41adf4bebd19\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-b5rp4" Oct 08 21:05:34 crc kubenswrapper[4669]: I1008 21:05:34.458506 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/405955bf-c08c-4afd-9720-41adf4bebd19-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-b5rp4\" (UID: \"405955bf-c08c-4afd-9720-41adf4bebd19\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-b5rp4" Oct 08 21:05:34 crc kubenswrapper[4669]: I1008 21:05:34.458755 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/405955bf-c08c-4afd-9720-41adf4bebd19-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-b5rp4\" (UID: \"405955bf-c08c-4afd-9720-41adf4bebd19\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-b5rp4" Oct 08 21:05:34 crc kubenswrapper[4669]: I1008 21:05:34.458804 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zq24\" (UniqueName: \"kubernetes.io/projected/405955bf-c08c-4afd-9720-41adf4bebd19-kube-api-access-7zq24\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-b5rp4\" (UID: \"405955bf-c08c-4afd-9720-41adf4bebd19\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-b5rp4" Oct 08 21:05:34 crc kubenswrapper[4669]: I1008 21:05:34.463905 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/405955bf-c08c-4afd-9720-41adf4bebd19-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-b5rp4\" (UID: \"405955bf-c08c-4afd-9720-41adf4bebd19\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-b5rp4" Oct 08 21:05:34 crc kubenswrapper[4669]: I1008 21:05:34.465469 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/405955bf-c08c-4afd-9720-41adf4bebd19-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-b5rp4\" (UID: \"405955bf-c08c-4afd-9720-41adf4bebd19\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-b5rp4" Oct 08 21:05:34 crc kubenswrapper[4669]: I1008 21:05:34.485724 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zq24\" (UniqueName: \"kubernetes.io/projected/405955bf-c08c-4afd-9720-41adf4bebd19-kube-api-access-7zq24\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-b5rp4\" (UID: \"405955bf-c08c-4afd-9720-41adf4bebd19\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-b5rp4" Oct 08 21:05:34 crc kubenswrapper[4669]: I1008 21:05:34.647405 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-b5rp4" Oct 08 21:05:35 crc kubenswrapper[4669]: I1008 21:05:35.012731 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-b5rp4"] Oct 08 21:05:35 crc kubenswrapper[4669]: W1008 21:05:35.024738 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod405955bf_c08c_4afd_9720_41adf4bebd19.slice/crio-ce36500ae509f41cb3c228e38c88fee623aedc7e61c583a3993a781a4994fc17 WatchSource:0}: Error finding container ce36500ae509f41cb3c228e38c88fee623aedc7e61c583a3993a781a4994fc17: Status 404 returned error can't find the container with id ce36500ae509f41cb3c228e38c88fee623aedc7e61c583a3993a781a4994fc17 Oct 08 21:05:35 crc kubenswrapper[4669]: I1008 21:05:35.251785 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-b5rp4" event={"ID":"405955bf-c08c-4afd-9720-41adf4bebd19","Type":"ContainerStarted","Data":"ce36500ae509f41cb3c228e38c88fee623aedc7e61c583a3993a781a4994fc17"} Oct 08 21:05:37 crc kubenswrapper[4669]: I1008 21:05:37.277442 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-b5rp4" event={"ID":"405955bf-c08c-4afd-9720-41adf4bebd19","Type":"ContainerStarted","Data":"4b44d9b103a92f04187a2706e19a602c4e96d31ba034d54bc995e12b17f1e48e"} Oct 08 21:05:37 crc kubenswrapper[4669]: I1008 21:05:37.301750 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-b5rp4" podStartSLOduration=2.292397056 podStartE2EDuration="3.301724635s" podCreationTimestamp="2025-10-08 21:05:34 +0000 UTC" firstStartedPulling="2025-10-08 21:05:35.026851916 +0000 UTC m=+1254.719662589" lastFinishedPulling="2025-10-08 21:05:36.036179485 +0000 UTC m=+1255.728990168" observedRunningTime="2025-10-08 
21:05:37.298173337 +0000 UTC m=+1256.990984020" watchObservedRunningTime="2025-10-08 21:05:37.301724635 +0000 UTC m=+1256.994535318" Oct 08 21:05:39 crc kubenswrapper[4669]: I1008 21:05:39.301983 4669 generic.go:334] "Generic (PLEG): container finished" podID="405955bf-c08c-4afd-9720-41adf4bebd19" containerID="4b44d9b103a92f04187a2706e19a602c4e96d31ba034d54bc995e12b17f1e48e" exitCode=0 Oct 08 21:05:39 crc kubenswrapper[4669]: I1008 21:05:39.302079 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-b5rp4" event={"ID":"405955bf-c08c-4afd-9720-41adf4bebd19","Type":"ContainerDied","Data":"4b44d9b103a92f04187a2706e19a602c4e96d31ba034d54bc995e12b17f1e48e"} Oct 08 21:05:40 crc kubenswrapper[4669]: I1008 21:05:40.800088 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-b5rp4" Oct 08 21:05:40 crc kubenswrapper[4669]: I1008 21:05:40.990662 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/405955bf-c08c-4afd-9720-41adf4bebd19-inventory\") pod \"405955bf-c08c-4afd-9720-41adf4bebd19\" (UID: \"405955bf-c08c-4afd-9720-41adf4bebd19\") " Oct 08 21:05:40 crc kubenswrapper[4669]: I1008 21:05:40.990944 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/405955bf-c08c-4afd-9720-41adf4bebd19-ssh-key\") pod \"405955bf-c08c-4afd-9720-41adf4bebd19\" (UID: \"405955bf-c08c-4afd-9720-41adf4bebd19\") " Oct 08 21:05:40 crc kubenswrapper[4669]: I1008 21:05:40.991076 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zq24\" (UniqueName: \"kubernetes.io/projected/405955bf-c08c-4afd-9720-41adf4bebd19-kube-api-access-7zq24\") pod \"405955bf-c08c-4afd-9720-41adf4bebd19\" (UID: \"405955bf-c08c-4afd-9720-41adf4bebd19\") " Oct 08 21:05:41 crc 
kubenswrapper[4669]: I1008 21:05:41.005745 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/405955bf-c08c-4afd-9720-41adf4bebd19-kube-api-access-7zq24" (OuterVolumeSpecName: "kube-api-access-7zq24") pod "405955bf-c08c-4afd-9720-41adf4bebd19" (UID: "405955bf-c08c-4afd-9720-41adf4bebd19"). InnerVolumeSpecName "kube-api-access-7zq24". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:05:41 crc kubenswrapper[4669]: I1008 21:05:41.027187 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/405955bf-c08c-4afd-9720-41adf4bebd19-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "405955bf-c08c-4afd-9720-41adf4bebd19" (UID: "405955bf-c08c-4afd-9720-41adf4bebd19"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:05:41 crc kubenswrapper[4669]: I1008 21:05:41.037071 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/405955bf-c08c-4afd-9720-41adf4bebd19-inventory" (OuterVolumeSpecName: "inventory") pod "405955bf-c08c-4afd-9720-41adf4bebd19" (UID: "405955bf-c08c-4afd-9720-41adf4bebd19"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:05:41 crc kubenswrapper[4669]: I1008 21:05:41.093514 4669 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/405955bf-c08c-4afd-9720-41adf4bebd19-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 08 21:05:41 crc kubenswrapper[4669]: I1008 21:05:41.093578 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zq24\" (UniqueName: \"kubernetes.io/projected/405955bf-c08c-4afd-9720-41adf4bebd19-kube-api-access-7zq24\") on node \"crc\" DevicePath \"\"" Oct 08 21:05:41 crc kubenswrapper[4669]: I1008 21:05:41.093593 4669 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/405955bf-c08c-4afd-9720-41adf4bebd19-inventory\") on node \"crc\" DevicePath \"\"" Oct 08 21:05:41 crc kubenswrapper[4669]: I1008 21:05:41.326659 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-b5rp4" event={"ID":"405955bf-c08c-4afd-9720-41adf4bebd19","Type":"ContainerDied","Data":"ce36500ae509f41cb3c228e38c88fee623aedc7e61c583a3993a781a4994fc17"} Oct 08 21:05:41 crc kubenswrapper[4669]: I1008 21:05:41.326701 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce36500ae509f41cb3c228e38c88fee623aedc7e61c583a3993a781a4994fc17" Oct 08 21:05:41 crc kubenswrapper[4669]: I1008 21:05:41.326725 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-b5rp4" Oct 08 21:05:41 crc kubenswrapper[4669]: I1008 21:05:41.398043 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s69cr"] Oct 08 21:05:41 crc kubenswrapper[4669]: E1008 21:05:41.399726 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="405955bf-c08c-4afd-9720-41adf4bebd19" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 08 21:05:41 crc kubenswrapper[4669]: I1008 21:05:41.399753 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="405955bf-c08c-4afd-9720-41adf4bebd19" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 08 21:05:41 crc kubenswrapper[4669]: I1008 21:05:41.399964 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="405955bf-c08c-4afd-9720-41adf4bebd19" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 08 21:05:41 crc kubenswrapper[4669]: I1008 21:05:41.400630 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s69cr" Oct 08 21:05:41 crc kubenswrapper[4669]: I1008 21:05:41.404156 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 08 21:05:41 crc kubenswrapper[4669]: I1008 21:05:41.404751 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9d8p9" Oct 08 21:05:41 crc kubenswrapper[4669]: I1008 21:05:41.404100 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 08 21:05:41 crc kubenswrapper[4669]: I1008 21:05:41.405001 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 08 21:05:41 crc kubenswrapper[4669]: I1008 21:05:41.431733 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s69cr"] Oct 08 21:05:41 crc kubenswrapper[4669]: I1008 21:05:41.500768 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llclc\" (UniqueName: \"kubernetes.io/projected/d54b9af7-032e-4b63-ada5-0cebab9e052d-kube-api-access-llclc\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s69cr\" (UID: \"d54b9af7-032e-4b63-ada5-0cebab9e052d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s69cr" Oct 08 21:05:41 crc kubenswrapper[4669]: I1008 21:05:41.500972 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d54b9af7-032e-4b63-ada5-0cebab9e052d-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s69cr\" (UID: \"d54b9af7-032e-4b63-ada5-0cebab9e052d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s69cr" Oct 08 21:05:41 crc kubenswrapper[4669]: I1008 21:05:41.501027 4669 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d54b9af7-032e-4b63-ada5-0cebab9e052d-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s69cr\" (UID: \"d54b9af7-032e-4b63-ada5-0cebab9e052d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s69cr" Oct 08 21:05:41 crc kubenswrapper[4669]: I1008 21:05:41.501073 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d54b9af7-032e-4b63-ada5-0cebab9e052d-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s69cr\" (UID: \"d54b9af7-032e-4b63-ada5-0cebab9e052d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s69cr" Oct 08 21:05:41 crc kubenswrapper[4669]: I1008 21:05:41.602406 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llclc\" (UniqueName: \"kubernetes.io/projected/d54b9af7-032e-4b63-ada5-0cebab9e052d-kube-api-access-llclc\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s69cr\" (UID: \"d54b9af7-032e-4b63-ada5-0cebab9e052d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s69cr" Oct 08 21:05:41 crc kubenswrapper[4669]: I1008 21:05:41.602479 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d54b9af7-032e-4b63-ada5-0cebab9e052d-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s69cr\" (UID: \"d54b9af7-032e-4b63-ada5-0cebab9e052d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s69cr" Oct 08 21:05:41 crc kubenswrapper[4669]: I1008 21:05:41.602500 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d54b9af7-032e-4b63-ada5-0cebab9e052d-bootstrap-combined-ca-bundle\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-s69cr\" (UID: \"d54b9af7-032e-4b63-ada5-0cebab9e052d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s69cr" Oct 08 21:05:41 crc kubenswrapper[4669]: I1008 21:05:41.602519 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d54b9af7-032e-4b63-ada5-0cebab9e052d-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s69cr\" (UID: \"d54b9af7-032e-4b63-ada5-0cebab9e052d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s69cr" Oct 08 21:05:41 crc kubenswrapper[4669]: I1008 21:05:41.606248 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d54b9af7-032e-4b63-ada5-0cebab9e052d-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s69cr\" (UID: \"d54b9af7-032e-4b63-ada5-0cebab9e052d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s69cr" Oct 08 21:05:41 crc kubenswrapper[4669]: I1008 21:05:41.608075 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d54b9af7-032e-4b63-ada5-0cebab9e052d-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s69cr\" (UID: \"d54b9af7-032e-4b63-ada5-0cebab9e052d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s69cr" Oct 08 21:05:41 crc kubenswrapper[4669]: I1008 21:05:41.611978 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d54b9af7-032e-4b63-ada5-0cebab9e052d-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s69cr\" (UID: \"d54b9af7-032e-4b63-ada5-0cebab9e052d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s69cr" Oct 08 21:05:41 crc kubenswrapper[4669]: I1008 21:05:41.628830 4669 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-llclc\" (UniqueName: \"kubernetes.io/projected/d54b9af7-032e-4b63-ada5-0cebab9e052d-kube-api-access-llclc\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s69cr\" (UID: \"d54b9af7-032e-4b63-ada5-0cebab9e052d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s69cr" Oct 08 21:05:41 crc kubenswrapper[4669]: I1008 21:05:41.724977 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s69cr" Oct 08 21:05:42 crc kubenswrapper[4669]: I1008 21:05:42.263280 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s69cr"] Oct 08 21:05:42 crc kubenswrapper[4669]: I1008 21:05:42.343910 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s69cr" event={"ID":"d54b9af7-032e-4b63-ada5-0cebab9e052d","Type":"ContainerStarted","Data":"3d95453f94050bba7257bce9f1de064b04bb110f613399483e72503e21161462"} Oct 08 21:05:43 crc kubenswrapper[4669]: I1008 21:05:43.185727 4669 patch_prober.go:28] interesting pod/machine-config-daemon-hw2kf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 21:05:43 crc kubenswrapper[4669]: I1008 21:05:43.186060 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 21:05:43 crc kubenswrapper[4669]: I1008 21:05:43.355582 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s69cr" 
event={"ID":"d54b9af7-032e-4b63-ada5-0cebab9e052d","Type":"ContainerStarted","Data":"39b28d758ce579aa8a57a854cfd1862dc80b74574bd60841917cb1659a7a1831"} Oct 08 21:05:43 crc kubenswrapper[4669]: I1008 21:05:43.380874 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s69cr" podStartSLOduration=1.860917492 podStartE2EDuration="2.380837657s" podCreationTimestamp="2025-10-08 21:05:41 +0000 UTC" firstStartedPulling="2025-10-08 21:05:42.268174832 +0000 UTC m=+1261.960985505" lastFinishedPulling="2025-10-08 21:05:42.788094957 +0000 UTC m=+1262.480905670" observedRunningTime="2025-10-08 21:05:43.37249631 +0000 UTC m=+1263.065307023" watchObservedRunningTime="2025-10-08 21:05:43.380837657 +0000 UTC m=+1263.073648360" Oct 08 21:05:56 crc kubenswrapper[4669]: I1008 21:05:56.510992 4669 scope.go:117] "RemoveContainer" containerID="c692662c852fdc96a22525d90a54f4c9658b9b0948b31cb34b7af6b3b8d42eb0" Oct 08 21:05:56 crc kubenswrapper[4669]: I1008 21:05:56.536376 4669 scope.go:117] "RemoveContainer" containerID="88b3c6fbf0c3c8a46df051f8123f25ba55c6300188ce4a9b648e554445b4a09b" Oct 08 21:06:13 crc kubenswrapper[4669]: I1008 21:06:13.185951 4669 patch_prober.go:28] interesting pod/machine-config-daemon-hw2kf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 21:06:13 crc kubenswrapper[4669]: I1008 21:06:13.186729 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 21:06:13 crc kubenswrapper[4669]: I1008 21:06:13.186807 4669 kubelet.go:2542] "SyncLoop 
(probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" Oct 08 21:06:13 crc kubenswrapper[4669]: I1008 21:06:13.187912 4669 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7d9c1843fe5022c993d21347e4a9434b43afdaabe05d722d7ab0e85541c9821e"} pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 21:06:13 crc kubenswrapper[4669]: I1008 21:06:13.188008 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" containerID="cri-o://7d9c1843fe5022c993d21347e4a9434b43afdaabe05d722d7ab0e85541c9821e" gracePeriod=600 Oct 08 21:06:13 crc kubenswrapper[4669]: I1008 21:06:13.728680 4669 generic.go:334] "Generic (PLEG): container finished" podID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerID="7d9c1843fe5022c993d21347e4a9434b43afdaabe05d722d7ab0e85541c9821e" exitCode=0 Oct 08 21:06:13 crc kubenswrapper[4669]: I1008 21:06:13.728757 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" event={"ID":"39c9bcf2-9580-4534-8c7e-886bd4aff469","Type":"ContainerDied","Data":"7d9c1843fe5022c993d21347e4a9434b43afdaabe05d722d7ab0e85541c9821e"} Oct 08 21:06:13 crc kubenswrapper[4669]: I1008 21:06:13.729070 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" event={"ID":"39c9bcf2-9580-4534-8c7e-886bd4aff469","Type":"ContainerStarted","Data":"92ec41f270c02e5372a81a61c641b036c617c32af9093736d30bfad2ba880074"} Oct 08 21:06:13 crc kubenswrapper[4669]: I1008 21:06:13.729103 4669 scope.go:117] "RemoveContainer" 
containerID="fc8abec09504bb79a99269d94867e82d2072a920f3270921c1c4a731ac29aaaf" Oct 08 21:06:56 crc kubenswrapper[4669]: I1008 21:06:56.602191 4669 scope.go:117] "RemoveContainer" containerID="0df0a205510b9f2322e2df5512943b74d26c823c6537d133620954f20dce5aaa" Oct 08 21:06:56 crc kubenswrapper[4669]: I1008 21:06:56.641675 4669 scope.go:117] "RemoveContainer" containerID="fa86118dfbbefa110fa99cfc57c08478773cb2edc4caf210613dda42aaf58e99" Oct 08 21:06:56 crc kubenswrapper[4669]: I1008 21:06:56.719241 4669 scope.go:117] "RemoveContainer" containerID="c87ef135ae1c63d7bce5bcc12b59351fee15690078ed663152d548cfed0c0be0" Oct 08 21:08:13 crc kubenswrapper[4669]: I1008 21:08:13.185039 4669 patch_prober.go:28] interesting pod/machine-config-daemon-hw2kf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 21:08:13 crc kubenswrapper[4669]: I1008 21:08:13.185622 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 21:08:18 crc kubenswrapper[4669]: I1008 21:08:18.615030 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9ffl4"] Oct 08 21:08:18 crc kubenswrapper[4669]: I1008 21:08:18.617569 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9ffl4" Oct 08 21:08:18 crc kubenswrapper[4669]: I1008 21:08:18.627381 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9ffl4"] Oct 08 21:08:18 crc kubenswrapper[4669]: I1008 21:08:18.671096 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmqll\" (UniqueName: \"kubernetes.io/projected/20e4f24b-72db-4677-b790-c7537d907711-kube-api-access-wmqll\") pod \"community-operators-9ffl4\" (UID: \"20e4f24b-72db-4677-b790-c7537d907711\") " pod="openshift-marketplace/community-operators-9ffl4" Oct 08 21:08:18 crc kubenswrapper[4669]: I1008 21:08:18.671196 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20e4f24b-72db-4677-b790-c7537d907711-catalog-content\") pod \"community-operators-9ffl4\" (UID: \"20e4f24b-72db-4677-b790-c7537d907711\") " pod="openshift-marketplace/community-operators-9ffl4" Oct 08 21:08:18 crc kubenswrapper[4669]: I1008 21:08:18.671221 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20e4f24b-72db-4677-b790-c7537d907711-utilities\") pod \"community-operators-9ffl4\" (UID: \"20e4f24b-72db-4677-b790-c7537d907711\") " pod="openshift-marketplace/community-operators-9ffl4" Oct 08 21:08:18 crc kubenswrapper[4669]: I1008 21:08:18.777433 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmqll\" (UniqueName: \"kubernetes.io/projected/20e4f24b-72db-4677-b790-c7537d907711-kube-api-access-wmqll\") pod \"community-operators-9ffl4\" (UID: \"20e4f24b-72db-4677-b790-c7537d907711\") " pod="openshift-marketplace/community-operators-9ffl4" Oct 08 21:08:18 crc kubenswrapper[4669]: I1008 21:08:18.777562 4669 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20e4f24b-72db-4677-b790-c7537d907711-catalog-content\") pod \"community-operators-9ffl4\" (UID: \"20e4f24b-72db-4677-b790-c7537d907711\") " pod="openshift-marketplace/community-operators-9ffl4" Oct 08 21:08:18 crc kubenswrapper[4669]: I1008 21:08:18.777600 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20e4f24b-72db-4677-b790-c7537d907711-utilities\") pod \"community-operators-9ffl4\" (UID: \"20e4f24b-72db-4677-b790-c7537d907711\") " pod="openshift-marketplace/community-operators-9ffl4" Oct 08 21:08:18 crc kubenswrapper[4669]: I1008 21:08:18.778332 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20e4f24b-72db-4677-b790-c7537d907711-utilities\") pod \"community-operators-9ffl4\" (UID: \"20e4f24b-72db-4677-b790-c7537d907711\") " pod="openshift-marketplace/community-operators-9ffl4" Oct 08 21:08:18 crc kubenswrapper[4669]: I1008 21:08:18.779062 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20e4f24b-72db-4677-b790-c7537d907711-catalog-content\") pod \"community-operators-9ffl4\" (UID: \"20e4f24b-72db-4677-b790-c7537d907711\") " pod="openshift-marketplace/community-operators-9ffl4" Oct 08 21:08:18 crc kubenswrapper[4669]: I1008 21:08:18.802511 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmqll\" (UniqueName: \"kubernetes.io/projected/20e4f24b-72db-4677-b790-c7537d907711-kube-api-access-wmqll\") pod \"community-operators-9ffl4\" (UID: \"20e4f24b-72db-4677-b790-c7537d907711\") " pod="openshift-marketplace/community-operators-9ffl4" Oct 08 21:08:18 crc kubenswrapper[4669]: I1008 21:08:18.945690 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9ffl4" Oct 08 21:08:19 crc kubenswrapper[4669]: I1008 21:08:19.315453 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9ffl4"] Oct 08 21:08:20 crc kubenswrapper[4669]: I1008 21:08:20.109773 4669 generic.go:334] "Generic (PLEG): container finished" podID="20e4f24b-72db-4677-b790-c7537d907711" containerID="8e1c907aafd6521250417ecc9e62dadd1aa0dc3a4b8a0b04692352eb26bd2908" exitCode=0 Oct 08 21:08:20 crc kubenswrapper[4669]: I1008 21:08:20.109836 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9ffl4" event={"ID":"20e4f24b-72db-4677-b790-c7537d907711","Type":"ContainerDied","Data":"8e1c907aafd6521250417ecc9e62dadd1aa0dc3a4b8a0b04692352eb26bd2908"} Oct 08 21:08:20 crc kubenswrapper[4669]: I1008 21:08:20.110055 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9ffl4" event={"ID":"20e4f24b-72db-4677-b790-c7537d907711","Type":"ContainerStarted","Data":"0e6235b2a12733e3676129a4db3edf77484b71d08864edc92e31a78d2264ed83"} Oct 08 21:08:23 crc kubenswrapper[4669]: I1008 21:08:23.145118 4669 generic.go:334] "Generic (PLEG): container finished" podID="20e4f24b-72db-4677-b790-c7537d907711" containerID="a15c0c621d1097c729a17bcfd9185a9fbffaab3fa6c8462115f2ba51826afdd6" exitCode=0 Oct 08 21:08:23 crc kubenswrapper[4669]: I1008 21:08:23.145175 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9ffl4" event={"ID":"20e4f24b-72db-4677-b790-c7537d907711","Type":"ContainerDied","Data":"a15c0c621d1097c729a17bcfd9185a9fbffaab3fa6c8462115f2ba51826afdd6"} Oct 08 21:08:25 crc kubenswrapper[4669]: I1008 21:08:25.166188 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9ffl4" 
event={"ID":"20e4f24b-72db-4677-b790-c7537d907711","Type":"ContainerStarted","Data":"987c801fc861e3db1e23f77cb27a51946ef1d6d3c14c0eff0dc1e93fdcb8135a"} Oct 08 21:08:25 crc kubenswrapper[4669]: I1008 21:08:25.186767 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9ffl4" podStartSLOduration=3.251587669 podStartE2EDuration="7.18674943s" podCreationTimestamp="2025-10-08 21:08:18 +0000 UTC" firstStartedPulling="2025-10-08 21:08:20.113623653 +0000 UTC m=+1419.806434336" lastFinishedPulling="2025-10-08 21:08:24.048785424 +0000 UTC m=+1423.741596097" observedRunningTime="2025-10-08 21:08:25.183666935 +0000 UTC m=+1424.876477618" watchObservedRunningTime="2025-10-08 21:08:25.18674943 +0000 UTC m=+1424.879560103" Oct 08 21:08:28 crc kubenswrapper[4669]: I1008 21:08:28.946809 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9ffl4" Oct 08 21:08:28 crc kubenswrapper[4669]: I1008 21:08:28.947369 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9ffl4" Oct 08 21:08:28 crc kubenswrapper[4669]: I1008 21:08:28.997256 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9ffl4" Oct 08 21:08:29 crc kubenswrapper[4669]: I1008 21:08:29.275419 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9ffl4" Oct 08 21:08:29 crc kubenswrapper[4669]: I1008 21:08:29.330045 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9ffl4"] Oct 08 21:08:31 crc kubenswrapper[4669]: I1008 21:08:31.237649 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9ffl4" podUID="20e4f24b-72db-4677-b790-c7537d907711" containerName="registry-server" 
containerID="cri-o://987c801fc861e3db1e23f77cb27a51946ef1d6d3c14c0eff0dc1e93fdcb8135a" gracePeriod=2 Oct 08 21:08:31 crc kubenswrapper[4669]: I1008 21:08:31.697704 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9ffl4" Oct 08 21:08:31 crc kubenswrapper[4669]: I1008 21:08:31.868735 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20e4f24b-72db-4677-b790-c7537d907711-utilities\") pod \"20e4f24b-72db-4677-b790-c7537d907711\" (UID: \"20e4f24b-72db-4677-b790-c7537d907711\") " Oct 08 21:08:31 crc kubenswrapper[4669]: I1008 21:08:31.868811 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmqll\" (UniqueName: \"kubernetes.io/projected/20e4f24b-72db-4677-b790-c7537d907711-kube-api-access-wmqll\") pod \"20e4f24b-72db-4677-b790-c7537d907711\" (UID: \"20e4f24b-72db-4677-b790-c7537d907711\") " Oct 08 21:08:31 crc kubenswrapper[4669]: I1008 21:08:31.868942 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20e4f24b-72db-4677-b790-c7537d907711-catalog-content\") pod \"20e4f24b-72db-4677-b790-c7537d907711\" (UID: \"20e4f24b-72db-4677-b790-c7537d907711\") " Oct 08 21:08:31 crc kubenswrapper[4669]: I1008 21:08:31.869790 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20e4f24b-72db-4677-b790-c7537d907711-utilities" (OuterVolumeSpecName: "utilities") pod "20e4f24b-72db-4677-b790-c7537d907711" (UID: "20e4f24b-72db-4677-b790-c7537d907711"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:08:31 crc kubenswrapper[4669]: I1008 21:08:31.877858 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20e4f24b-72db-4677-b790-c7537d907711-kube-api-access-wmqll" (OuterVolumeSpecName: "kube-api-access-wmqll") pod "20e4f24b-72db-4677-b790-c7537d907711" (UID: "20e4f24b-72db-4677-b790-c7537d907711"). InnerVolumeSpecName "kube-api-access-wmqll". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:08:31 crc kubenswrapper[4669]: I1008 21:08:31.933996 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20e4f24b-72db-4677-b790-c7537d907711-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "20e4f24b-72db-4677-b790-c7537d907711" (UID: "20e4f24b-72db-4677-b790-c7537d907711"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:08:31 crc kubenswrapper[4669]: I1008 21:08:31.970987 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20e4f24b-72db-4677-b790-c7537d907711-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 21:08:31 crc kubenswrapper[4669]: I1008 21:08:31.971039 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmqll\" (UniqueName: \"kubernetes.io/projected/20e4f24b-72db-4677-b790-c7537d907711-kube-api-access-wmqll\") on node \"crc\" DevicePath \"\"" Oct 08 21:08:31 crc kubenswrapper[4669]: I1008 21:08:31.971056 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20e4f24b-72db-4677-b790-c7537d907711-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 21:08:32 crc kubenswrapper[4669]: I1008 21:08:32.247178 4669 generic.go:334] "Generic (PLEG): container finished" podID="20e4f24b-72db-4677-b790-c7537d907711" 
containerID="987c801fc861e3db1e23f77cb27a51946ef1d6d3c14c0eff0dc1e93fdcb8135a" exitCode=0 Oct 08 21:08:32 crc kubenswrapper[4669]: I1008 21:08:32.247209 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9ffl4" event={"ID":"20e4f24b-72db-4677-b790-c7537d907711","Type":"ContainerDied","Data":"987c801fc861e3db1e23f77cb27a51946ef1d6d3c14c0eff0dc1e93fdcb8135a"} Oct 08 21:08:32 crc kubenswrapper[4669]: I1008 21:08:32.247253 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9ffl4" event={"ID":"20e4f24b-72db-4677-b790-c7537d907711","Type":"ContainerDied","Data":"0e6235b2a12733e3676129a4db3edf77484b71d08864edc92e31a78d2264ed83"} Oct 08 21:08:32 crc kubenswrapper[4669]: I1008 21:08:32.247271 4669 scope.go:117] "RemoveContainer" containerID="987c801fc861e3db1e23f77cb27a51946ef1d6d3c14c0eff0dc1e93fdcb8135a" Oct 08 21:08:32 crc kubenswrapper[4669]: I1008 21:08:32.247313 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9ffl4" Oct 08 21:08:32 crc kubenswrapper[4669]: I1008 21:08:32.280435 4669 scope.go:117] "RemoveContainer" containerID="a15c0c621d1097c729a17bcfd9185a9fbffaab3fa6c8462115f2ba51826afdd6" Oct 08 21:08:32 crc kubenswrapper[4669]: I1008 21:08:32.282235 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9ffl4"] Oct 08 21:08:32 crc kubenswrapper[4669]: I1008 21:08:32.291812 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9ffl4"] Oct 08 21:08:32 crc kubenswrapper[4669]: I1008 21:08:32.304995 4669 scope.go:117] "RemoveContainer" containerID="8e1c907aafd6521250417ecc9e62dadd1aa0dc3a4b8a0b04692352eb26bd2908" Oct 08 21:08:32 crc kubenswrapper[4669]: I1008 21:08:32.361742 4669 scope.go:117] "RemoveContainer" containerID="987c801fc861e3db1e23f77cb27a51946ef1d6d3c14c0eff0dc1e93fdcb8135a" Oct 08 21:08:32 crc kubenswrapper[4669]: E1008 21:08:32.362269 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"987c801fc861e3db1e23f77cb27a51946ef1d6d3c14c0eff0dc1e93fdcb8135a\": container with ID starting with 987c801fc861e3db1e23f77cb27a51946ef1d6d3c14c0eff0dc1e93fdcb8135a not found: ID does not exist" containerID="987c801fc861e3db1e23f77cb27a51946ef1d6d3c14c0eff0dc1e93fdcb8135a" Oct 08 21:08:32 crc kubenswrapper[4669]: I1008 21:08:32.362305 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"987c801fc861e3db1e23f77cb27a51946ef1d6d3c14c0eff0dc1e93fdcb8135a"} err="failed to get container status \"987c801fc861e3db1e23f77cb27a51946ef1d6d3c14c0eff0dc1e93fdcb8135a\": rpc error: code = NotFound desc = could not find container \"987c801fc861e3db1e23f77cb27a51946ef1d6d3c14c0eff0dc1e93fdcb8135a\": container with ID starting with 987c801fc861e3db1e23f77cb27a51946ef1d6d3c14c0eff0dc1e93fdcb8135a not 
found: ID does not exist" Oct 08 21:08:32 crc kubenswrapper[4669]: I1008 21:08:32.362332 4669 scope.go:117] "RemoveContainer" containerID="a15c0c621d1097c729a17bcfd9185a9fbffaab3fa6c8462115f2ba51826afdd6" Oct 08 21:08:32 crc kubenswrapper[4669]: E1008 21:08:32.362944 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a15c0c621d1097c729a17bcfd9185a9fbffaab3fa6c8462115f2ba51826afdd6\": container with ID starting with a15c0c621d1097c729a17bcfd9185a9fbffaab3fa6c8462115f2ba51826afdd6 not found: ID does not exist" containerID="a15c0c621d1097c729a17bcfd9185a9fbffaab3fa6c8462115f2ba51826afdd6" Oct 08 21:08:32 crc kubenswrapper[4669]: I1008 21:08:32.362971 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a15c0c621d1097c729a17bcfd9185a9fbffaab3fa6c8462115f2ba51826afdd6"} err="failed to get container status \"a15c0c621d1097c729a17bcfd9185a9fbffaab3fa6c8462115f2ba51826afdd6\": rpc error: code = NotFound desc = could not find container \"a15c0c621d1097c729a17bcfd9185a9fbffaab3fa6c8462115f2ba51826afdd6\": container with ID starting with a15c0c621d1097c729a17bcfd9185a9fbffaab3fa6c8462115f2ba51826afdd6 not found: ID does not exist" Oct 08 21:08:32 crc kubenswrapper[4669]: I1008 21:08:32.362988 4669 scope.go:117] "RemoveContainer" containerID="8e1c907aafd6521250417ecc9e62dadd1aa0dc3a4b8a0b04692352eb26bd2908" Oct 08 21:08:32 crc kubenswrapper[4669]: E1008 21:08:32.363306 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e1c907aafd6521250417ecc9e62dadd1aa0dc3a4b8a0b04692352eb26bd2908\": container with ID starting with 8e1c907aafd6521250417ecc9e62dadd1aa0dc3a4b8a0b04692352eb26bd2908 not found: ID does not exist" containerID="8e1c907aafd6521250417ecc9e62dadd1aa0dc3a4b8a0b04692352eb26bd2908" Oct 08 21:08:32 crc kubenswrapper[4669]: I1008 21:08:32.363330 4669 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e1c907aafd6521250417ecc9e62dadd1aa0dc3a4b8a0b04692352eb26bd2908"} err="failed to get container status \"8e1c907aafd6521250417ecc9e62dadd1aa0dc3a4b8a0b04692352eb26bd2908\": rpc error: code = NotFound desc = could not find container \"8e1c907aafd6521250417ecc9e62dadd1aa0dc3a4b8a0b04692352eb26bd2908\": container with ID starting with 8e1c907aafd6521250417ecc9e62dadd1aa0dc3a4b8a0b04692352eb26bd2908 not found: ID does not exist" Oct 08 21:08:33 crc kubenswrapper[4669]: I1008 21:08:33.348016 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20e4f24b-72db-4677-b790-c7537d907711" path="/var/lib/kubelet/pods/20e4f24b-72db-4677-b790-c7537d907711/volumes" Oct 08 21:08:34 crc kubenswrapper[4669]: I1008 21:08:34.917191 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2pfw6"] Oct 08 21:08:34 crc kubenswrapper[4669]: E1008 21:08:34.917950 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20e4f24b-72db-4677-b790-c7537d907711" containerName="extract-utilities" Oct 08 21:08:34 crc kubenswrapper[4669]: I1008 21:08:34.917963 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="20e4f24b-72db-4677-b790-c7537d907711" containerName="extract-utilities" Oct 08 21:08:34 crc kubenswrapper[4669]: E1008 21:08:34.917979 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20e4f24b-72db-4677-b790-c7537d907711" containerName="registry-server" Oct 08 21:08:34 crc kubenswrapper[4669]: I1008 21:08:34.917985 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="20e4f24b-72db-4677-b790-c7537d907711" containerName="registry-server" Oct 08 21:08:34 crc kubenswrapper[4669]: E1008 21:08:34.918001 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20e4f24b-72db-4677-b790-c7537d907711" containerName="extract-content" Oct 08 21:08:34 crc kubenswrapper[4669]: I1008 
21:08:34.918008 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="20e4f24b-72db-4677-b790-c7537d907711" containerName="extract-content" Oct 08 21:08:34 crc kubenswrapper[4669]: I1008 21:08:34.918179 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="20e4f24b-72db-4677-b790-c7537d907711" containerName="registry-server" Oct 08 21:08:34 crc kubenswrapper[4669]: I1008 21:08:34.919494 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2pfw6" Oct 08 21:08:34 crc kubenswrapper[4669]: I1008 21:08:34.925749 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2pfw6"] Oct 08 21:08:34 crc kubenswrapper[4669]: I1008 21:08:34.931541 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d3acdee-2992-4fa2-bcb3-e1ae588325d1-catalog-content\") pod \"certified-operators-2pfw6\" (UID: \"2d3acdee-2992-4fa2-bcb3-e1ae588325d1\") " pod="openshift-marketplace/certified-operators-2pfw6" Oct 08 21:08:34 crc kubenswrapper[4669]: I1008 21:08:34.931630 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9r22\" (UniqueName: \"kubernetes.io/projected/2d3acdee-2992-4fa2-bcb3-e1ae588325d1-kube-api-access-q9r22\") pod \"certified-operators-2pfw6\" (UID: \"2d3acdee-2992-4fa2-bcb3-e1ae588325d1\") " pod="openshift-marketplace/certified-operators-2pfw6" Oct 08 21:08:34 crc kubenswrapper[4669]: I1008 21:08:34.931693 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d3acdee-2992-4fa2-bcb3-e1ae588325d1-utilities\") pod \"certified-operators-2pfw6\" (UID: \"2d3acdee-2992-4fa2-bcb3-e1ae588325d1\") " pod="openshift-marketplace/certified-operators-2pfw6" Oct 08 21:08:35 crc 
kubenswrapper[4669]: I1008 21:08:35.033628 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d3acdee-2992-4fa2-bcb3-e1ae588325d1-catalog-content\") pod \"certified-operators-2pfw6\" (UID: \"2d3acdee-2992-4fa2-bcb3-e1ae588325d1\") " pod="openshift-marketplace/certified-operators-2pfw6" Oct 08 21:08:35 crc kubenswrapper[4669]: I1008 21:08:35.033732 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9r22\" (UniqueName: \"kubernetes.io/projected/2d3acdee-2992-4fa2-bcb3-e1ae588325d1-kube-api-access-q9r22\") pod \"certified-operators-2pfw6\" (UID: \"2d3acdee-2992-4fa2-bcb3-e1ae588325d1\") " pod="openshift-marketplace/certified-operators-2pfw6" Oct 08 21:08:35 crc kubenswrapper[4669]: I1008 21:08:35.033799 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d3acdee-2992-4fa2-bcb3-e1ae588325d1-utilities\") pod \"certified-operators-2pfw6\" (UID: \"2d3acdee-2992-4fa2-bcb3-e1ae588325d1\") " pod="openshift-marketplace/certified-operators-2pfw6" Oct 08 21:08:35 crc kubenswrapper[4669]: I1008 21:08:35.034412 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d3acdee-2992-4fa2-bcb3-e1ae588325d1-utilities\") pod \"certified-operators-2pfw6\" (UID: \"2d3acdee-2992-4fa2-bcb3-e1ae588325d1\") " pod="openshift-marketplace/certified-operators-2pfw6" Oct 08 21:08:35 crc kubenswrapper[4669]: I1008 21:08:35.034584 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d3acdee-2992-4fa2-bcb3-e1ae588325d1-catalog-content\") pod \"certified-operators-2pfw6\" (UID: \"2d3acdee-2992-4fa2-bcb3-e1ae588325d1\") " pod="openshift-marketplace/certified-operators-2pfw6" Oct 08 21:08:35 crc kubenswrapper[4669]: I1008 21:08:35.058688 
4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9r22\" (UniqueName: \"kubernetes.io/projected/2d3acdee-2992-4fa2-bcb3-e1ae588325d1-kube-api-access-q9r22\") pod \"certified-operators-2pfw6\" (UID: \"2d3acdee-2992-4fa2-bcb3-e1ae588325d1\") " pod="openshift-marketplace/certified-operators-2pfw6" Oct 08 21:08:35 crc kubenswrapper[4669]: I1008 21:08:35.245486 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2pfw6" Oct 08 21:08:35 crc kubenswrapper[4669]: I1008 21:08:35.733191 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2pfw6"] Oct 08 21:08:36 crc kubenswrapper[4669]: I1008 21:08:36.281802 4669 generic.go:334] "Generic (PLEG): container finished" podID="2d3acdee-2992-4fa2-bcb3-e1ae588325d1" containerID="30c73d1cb3c0f1f48926c9f2743b86af72a019193f401ae6e1fed8401e8bb31b" exitCode=0 Oct 08 21:08:36 crc kubenswrapper[4669]: I1008 21:08:36.281964 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2pfw6" event={"ID":"2d3acdee-2992-4fa2-bcb3-e1ae588325d1","Type":"ContainerDied","Data":"30c73d1cb3c0f1f48926c9f2743b86af72a019193f401ae6e1fed8401e8bb31b"} Oct 08 21:08:36 crc kubenswrapper[4669]: I1008 21:08:36.282277 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2pfw6" event={"ID":"2d3acdee-2992-4fa2-bcb3-e1ae588325d1","Type":"ContainerStarted","Data":"26433be5441cf562b1b13cec1dfb312c85634dd5dd11a3ab35e62d8a678661a1"} Oct 08 21:08:38 crc kubenswrapper[4669]: I1008 21:08:38.305209 4669 generic.go:334] "Generic (PLEG): container finished" podID="2d3acdee-2992-4fa2-bcb3-e1ae588325d1" containerID="a34f0c6c68e1f7a072a6c4a6c3afec22fcc7dca9faee582ff12525724c169fe5" exitCode=0 Oct 08 21:08:38 crc kubenswrapper[4669]: I1008 21:08:38.305282 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-2pfw6" event={"ID":"2d3acdee-2992-4fa2-bcb3-e1ae588325d1","Type":"ContainerDied","Data":"a34f0c6c68e1f7a072a6c4a6c3afec22fcc7dca9faee582ff12525724c169fe5"} Oct 08 21:08:39 crc kubenswrapper[4669]: I1008 21:08:39.318335 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2pfw6" event={"ID":"2d3acdee-2992-4fa2-bcb3-e1ae588325d1","Type":"ContainerStarted","Data":"a4dae5239a042912763dec079f39c5ee7b336e9240b261878fd7e01756f38efe"} Oct 08 21:08:39 crc kubenswrapper[4669]: I1008 21:08:39.351678 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2pfw6" podStartSLOduration=2.832515158 podStartE2EDuration="5.351658114s" podCreationTimestamp="2025-10-08 21:08:34 +0000 UTC" firstStartedPulling="2025-10-08 21:08:36.283621031 +0000 UTC m=+1435.976431704" lastFinishedPulling="2025-10-08 21:08:38.802763977 +0000 UTC m=+1438.495574660" observedRunningTime="2025-10-08 21:08:39.34384964 +0000 UTC m=+1439.036660323" watchObservedRunningTime="2025-10-08 21:08:39.351658114 +0000 UTC m=+1439.044468787" Oct 08 21:08:43 crc kubenswrapper[4669]: I1008 21:08:43.185392 4669 patch_prober.go:28] interesting pod/machine-config-daemon-hw2kf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 21:08:43 crc kubenswrapper[4669]: I1008 21:08:43.186030 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 21:08:45 crc kubenswrapper[4669]: I1008 21:08:45.246238 4669 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2pfw6" Oct 08 21:08:45 crc kubenswrapper[4669]: I1008 21:08:45.246520 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2pfw6" Oct 08 21:08:45 crc kubenswrapper[4669]: I1008 21:08:45.290847 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2pfw6" Oct 08 21:08:45 crc kubenswrapper[4669]: I1008 21:08:45.453346 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2pfw6" Oct 08 21:08:45 crc kubenswrapper[4669]: I1008 21:08:45.527199 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2pfw6"] Oct 08 21:08:47 crc kubenswrapper[4669]: I1008 21:08:47.404110 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2pfw6" podUID="2d3acdee-2992-4fa2-bcb3-e1ae588325d1" containerName="registry-server" containerID="cri-o://a4dae5239a042912763dec079f39c5ee7b336e9240b261878fd7e01756f38efe" gracePeriod=2 Oct 08 21:08:47 crc kubenswrapper[4669]: I1008 21:08:47.876185 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2pfw6" Oct 08 21:08:47 crc kubenswrapper[4669]: I1008 21:08:47.992214 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d3acdee-2992-4fa2-bcb3-e1ae588325d1-catalog-content\") pod \"2d3acdee-2992-4fa2-bcb3-e1ae588325d1\" (UID: \"2d3acdee-2992-4fa2-bcb3-e1ae588325d1\") " Oct 08 21:08:47 crc kubenswrapper[4669]: I1008 21:08:47.992347 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d3acdee-2992-4fa2-bcb3-e1ae588325d1-utilities\") pod \"2d3acdee-2992-4fa2-bcb3-e1ae588325d1\" (UID: \"2d3acdee-2992-4fa2-bcb3-e1ae588325d1\") " Oct 08 21:08:47 crc kubenswrapper[4669]: I1008 21:08:47.992618 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9r22\" (UniqueName: \"kubernetes.io/projected/2d3acdee-2992-4fa2-bcb3-e1ae588325d1-kube-api-access-q9r22\") pod \"2d3acdee-2992-4fa2-bcb3-e1ae588325d1\" (UID: \"2d3acdee-2992-4fa2-bcb3-e1ae588325d1\") " Oct 08 21:08:47 crc kubenswrapper[4669]: I1008 21:08:47.993516 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d3acdee-2992-4fa2-bcb3-e1ae588325d1-utilities" (OuterVolumeSpecName: "utilities") pod "2d3acdee-2992-4fa2-bcb3-e1ae588325d1" (UID: "2d3acdee-2992-4fa2-bcb3-e1ae588325d1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:08:48 crc kubenswrapper[4669]: I1008 21:08:48.000261 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d3acdee-2992-4fa2-bcb3-e1ae588325d1-kube-api-access-q9r22" (OuterVolumeSpecName: "kube-api-access-q9r22") pod "2d3acdee-2992-4fa2-bcb3-e1ae588325d1" (UID: "2d3acdee-2992-4fa2-bcb3-e1ae588325d1"). InnerVolumeSpecName "kube-api-access-q9r22". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:08:48 crc kubenswrapper[4669]: I1008 21:08:48.094808 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9r22\" (UniqueName: \"kubernetes.io/projected/2d3acdee-2992-4fa2-bcb3-e1ae588325d1-kube-api-access-q9r22\") on node \"crc\" DevicePath \"\"" Oct 08 21:08:48 crc kubenswrapper[4669]: I1008 21:08:48.094847 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d3acdee-2992-4fa2-bcb3-e1ae588325d1-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 21:08:48 crc kubenswrapper[4669]: I1008 21:08:48.272645 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d3acdee-2992-4fa2-bcb3-e1ae588325d1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2d3acdee-2992-4fa2-bcb3-e1ae588325d1" (UID: "2d3acdee-2992-4fa2-bcb3-e1ae588325d1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:08:48 crc kubenswrapper[4669]: I1008 21:08:48.299631 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d3acdee-2992-4fa2-bcb3-e1ae588325d1-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 21:08:48 crc kubenswrapper[4669]: I1008 21:08:48.418908 4669 generic.go:334] "Generic (PLEG): container finished" podID="2d3acdee-2992-4fa2-bcb3-e1ae588325d1" containerID="a4dae5239a042912763dec079f39c5ee7b336e9240b261878fd7e01756f38efe" exitCode=0 Oct 08 21:08:48 crc kubenswrapper[4669]: I1008 21:08:48.418948 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2pfw6" event={"ID":"2d3acdee-2992-4fa2-bcb3-e1ae588325d1","Type":"ContainerDied","Data":"a4dae5239a042912763dec079f39c5ee7b336e9240b261878fd7e01756f38efe"} Oct 08 21:08:48 crc kubenswrapper[4669]: I1008 21:08:48.418976 4669 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-2pfw6" event={"ID":"2d3acdee-2992-4fa2-bcb3-e1ae588325d1","Type":"ContainerDied","Data":"26433be5441cf562b1b13cec1dfb312c85634dd5dd11a3ab35e62d8a678661a1"} Oct 08 21:08:48 crc kubenswrapper[4669]: I1008 21:08:48.418997 4669 scope.go:117] "RemoveContainer" containerID="a4dae5239a042912763dec079f39c5ee7b336e9240b261878fd7e01756f38efe" Oct 08 21:08:48 crc kubenswrapper[4669]: I1008 21:08:48.419074 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2pfw6" Oct 08 21:08:48 crc kubenswrapper[4669]: I1008 21:08:48.453280 4669 scope.go:117] "RemoveContainer" containerID="a34f0c6c68e1f7a072a6c4a6c3afec22fcc7dca9faee582ff12525724c169fe5" Oct 08 21:08:48 crc kubenswrapper[4669]: I1008 21:08:48.463852 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2pfw6"] Oct 08 21:08:48 crc kubenswrapper[4669]: I1008 21:08:48.473862 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2pfw6"] Oct 08 21:08:48 crc kubenswrapper[4669]: I1008 21:08:48.477331 4669 scope.go:117] "RemoveContainer" containerID="30c73d1cb3c0f1f48926c9f2743b86af72a019193f401ae6e1fed8401e8bb31b" Oct 08 21:08:48 crc kubenswrapper[4669]: I1008 21:08:48.534583 4669 scope.go:117] "RemoveContainer" containerID="a4dae5239a042912763dec079f39c5ee7b336e9240b261878fd7e01756f38efe" Oct 08 21:08:48 crc kubenswrapper[4669]: E1008 21:08:48.535063 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4dae5239a042912763dec079f39c5ee7b336e9240b261878fd7e01756f38efe\": container with ID starting with a4dae5239a042912763dec079f39c5ee7b336e9240b261878fd7e01756f38efe not found: ID does not exist" containerID="a4dae5239a042912763dec079f39c5ee7b336e9240b261878fd7e01756f38efe" Oct 08 21:08:48 crc kubenswrapper[4669]: I1008 
21:08:48.535124 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4dae5239a042912763dec079f39c5ee7b336e9240b261878fd7e01756f38efe"} err="failed to get container status \"a4dae5239a042912763dec079f39c5ee7b336e9240b261878fd7e01756f38efe\": rpc error: code = NotFound desc = could not find container \"a4dae5239a042912763dec079f39c5ee7b336e9240b261878fd7e01756f38efe\": container with ID starting with a4dae5239a042912763dec079f39c5ee7b336e9240b261878fd7e01756f38efe not found: ID does not exist" Oct 08 21:08:48 crc kubenswrapper[4669]: I1008 21:08:48.535158 4669 scope.go:117] "RemoveContainer" containerID="a34f0c6c68e1f7a072a6c4a6c3afec22fcc7dca9faee582ff12525724c169fe5" Oct 08 21:08:48 crc kubenswrapper[4669]: E1008 21:08:48.535507 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a34f0c6c68e1f7a072a6c4a6c3afec22fcc7dca9faee582ff12525724c169fe5\": container with ID starting with a34f0c6c68e1f7a072a6c4a6c3afec22fcc7dca9faee582ff12525724c169fe5 not found: ID does not exist" containerID="a34f0c6c68e1f7a072a6c4a6c3afec22fcc7dca9faee582ff12525724c169fe5" Oct 08 21:08:48 crc kubenswrapper[4669]: I1008 21:08:48.535560 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a34f0c6c68e1f7a072a6c4a6c3afec22fcc7dca9faee582ff12525724c169fe5"} err="failed to get container status \"a34f0c6c68e1f7a072a6c4a6c3afec22fcc7dca9faee582ff12525724c169fe5\": rpc error: code = NotFound desc = could not find container \"a34f0c6c68e1f7a072a6c4a6c3afec22fcc7dca9faee582ff12525724c169fe5\": container with ID starting with a34f0c6c68e1f7a072a6c4a6c3afec22fcc7dca9faee582ff12525724c169fe5 not found: ID does not exist" Oct 08 21:08:48 crc kubenswrapper[4669]: I1008 21:08:48.535589 4669 scope.go:117] "RemoveContainer" containerID="30c73d1cb3c0f1f48926c9f2743b86af72a019193f401ae6e1fed8401e8bb31b" Oct 08 21:08:48 crc 
kubenswrapper[4669]: E1008 21:08:48.535864 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30c73d1cb3c0f1f48926c9f2743b86af72a019193f401ae6e1fed8401e8bb31b\": container with ID starting with 30c73d1cb3c0f1f48926c9f2743b86af72a019193f401ae6e1fed8401e8bb31b not found: ID does not exist" containerID="30c73d1cb3c0f1f48926c9f2743b86af72a019193f401ae6e1fed8401e8bb31b" Oct 08 21:08:48 crc kubenswrapper[4669]: I1008 21:08:48.535891 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30c73d1cb3c0f1f48926c9f2743b86af72a019193f401ae6e1fed8401e8bb31b"} err="failed to get container status \"30c73d1cb3c0f1f48926c9f2743b86af72a019193f401ae6e1fed8401e8bb31b\": rpc error: code = NotFound desc = could not find container \"30c73d1cb3c0f1f48926c9f2743b86af72a019193f401ae6e1fed8401e8bb31b\": container with ID starting with 30c73d1cb3c0f1f48926c9f2743b86af72a019193f401ae6e1fed8401e8bb31b not found: ID does not exist" Oct 08 21:08:49 crc kubenswrapper[4669]: I1008 21:08:49.360306 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d3acdee-2992-4fa2-bcb3-e1ae588325d1" path="/var/lib/kubelet/pods/2d3acdee-2992-4fa2-bcb3-e1ae588325d1/volumes" Oct 08 21:08:53 crc kubenswrapper[4669]: I1008 21:08:53.475883 4669 generic.go:334] "Generic (PLEG): container finished" podID="d54b9af7-032e-4b63-ada5-0cebab9e052d" containerID="39b28d758ce579aa8a57a854cfd1862dc80b74574bd60841917cb1659a7a1831" exitCode=0 Oct 08 21:08:53 crc kubenswrapper[4669]: I1008 21:08:53.475967 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s69cr" event={"ID":"d54b9af7-032e-4b63-ada5-0cebab9e052d","Type":"ContainerDied","Data":"39b28d758ce579aa8a57a854cfd1862dc80b74574bd60841917cb1659a7a1831"} Oct 08 21:08:54 crc kubenswrapper[4669]: I1008 21:08:54.903733 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s69cr" Oct 08 21:08:55 crc kubenswrapper[4669]: I1008 21:08:55.032122 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d54b9af7-032e-4b63-ada5-0cebab9e052d-inventory\") pod \"d54b9af7-032e-4b63-ada5-0cebab9e052d\" (UID: \"d54b9af7-032e-4b63-ada5-0cebab9e052d\") " Oct 08 21:08:55 crc kubenswrapper[4669]: I1008 21:08:55.032274 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llclc\" (UniqueName: \"kubernetes.io/projected/d54b9af7-032e-4b63-ada5-0cebab9e052d-kube-api-access-llclc\") pod \"d54b9af7-032e-4b63-ada5-0cebab9e052d\" (UID: \"d54b9af7-032e-4b63-ada5-0cebab9e052d\") " Oct 08 21:08:55 crc kubenswrapper[4669]: I1008 21:08:55.032296 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d54b9af7-032e-4b63-ada5-0cebab9e052d-ssh-key\") pod \"d54b9af7-032e-4b63-ada5-0cebab9e052d\" (UID: \"d54b9af7-032e-4b63-ada5-0cebab9e052d\") " Oct 08 21:08:55 crc kubenswrapper[4669]: I1008 21:08:55.032383 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d54b9af7-032e-4b63-ada5-0cebab9e052d-bootstrap-combined-ca-bundle\") pod \"d54b9af7-032e-4b63-ada5-0cebab9e052d\" (UID: \"d54b9af7-032e-4b63-ada5-0cebab9e052d\") " Oct 08 21:08:55 crc kubenswrapper[4669]: I1008 21:08:55.042145 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d54b9af7-032e-4b63-ada5-0cebab9e052d-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "d54b9af7-032e-4b63-ada5-0cebab9e052d" (UID: "d54b9af7-032e-4b63-ada5-0cebab9e052d"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:08:55 crc kubenswrapper[4669]: I1008 21:08:55.042704 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d54b9af7-032e-4b63-ada5-0cebab9e052d-kube-api-access-llclc" (OuterVolumeSpecName: "kube-api-access-llclc") pod "d54b9af7-032e-4b63-ada5-0cebab9e052d" (UID: "d54b9af7-032e-4b63-ada5-0cebab9e052d"). InnerVolumeSpecName "kube-api-access-llclc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:08:55 crc kubenswrapper[4669]: I1008 21:08:55.085215 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d54b9af7-032e-4b63-ada5-0cebab9e052d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d54b9af7-032e-4b63-ada5-0cebab9e052d" (UID: "d54b9af7-032e-4b63-ada5-0cebab9e052d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:08:55 crc kubenswrapper[4669]: I1008 21:08:55.090805 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d54b9af7-032e-4b63-ada5-0cebab9e052d-inventory" (OuterVolumeSpecName: "inventory") pod "d54b9af7-032e-4b63-ada5-0cebab9e052d" (UID: "d54b9af7-032e-4b63-ada5-0cebab9e052d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:08:55 crc kubenswrapper[4669]: I1008 21:08:55.134845 4669 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d54b9af7-032e-4b63-ada5-0cebab9e052d-inventory\") on node \"crc\" DevicePath \"\"" Oct 08 21:08:55 crc kubenswrapper[4669]: I1008 21:08:55.134902 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llclc\" (UniqueName: \"kubernetes.io/projected/d54b9af7-032e-4b63-ada5-0cebab9e052d-kube-api-access-llclc\") on node \"crc\" DevicePath \"\"" Oct 08 21:08:55 crc kubenswrapper[4669]: I1008 21:08:55.134924 4669 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d54b9af7-032e-4b63-ada5-0cebab9e052d-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 08 21:08:55 crc kubenswrapper[4669]: I1008 21:08:55.134943 4669 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d54b9af7-032e-4b63-ada5-0cebab9e052d-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:08:55 crc kubenswrapper[4669]: I1008 21:08:55.503075 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s69cr" event={"ID":"d54b9af7-032e-4b63-ada5-0cebab9e052d","Type":"ContainerDied","Data":"3d95453f94050bba7257bce9f1de064b04bb110f613399483e72503e21161462"} Oct 08 21:08:55 crc kubenswrapper[4669]: I1008 21:08:55.503435 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d95453f94050bba7257bce9f1de064b04bb110f613399483e72503e21161462" Oct 08 21:08:55 crc kubenswrapper[4669]: I1008 21:08:55.503129 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s69cr" Oct 08 21:08:55 crc kubenswrapper[4669]: I1008 21:08:55.589563 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2d5kw"] Oct 08 21:08:55 crc kubenswrapper[4669]: E1008 21:08:55.589967 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d54b9af7-032e-4b63-ada5-0cebab9e052d" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 08 21:08:55 crc kubenswrapper[4669]: I1008 21:08:55.589989 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="d54b9af7-032e-4b63-ada5-0cebab9e052d" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 08 21:08:55 crc kubenswrapper[4669]: E1008 21:08:55.589999 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d3acdee-2992-4fa2-bcb3-e1ae588325d1" containerName="registry-server" Oct 08 21:08:55 crc kubenswrapper[4669]: I1008 21:08:55.590005 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d3acdee-2992-4fa2-bcb3-e1ae588325d1" containerName="registry-server" Oct 08 21:08:55 crc kubenswrapper[4669]: E1008 21:08:55.590018 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d3acdee-2992-4fa2-bcb3-e1ae588325d1" containerName="extract-utilities" Oct 08 21:08:55 crc kubenswrapper[4669]: I1008 21:08:55.590024 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d3acdee-2992-4fa2-bcb3-e1ae588325d1" containerName="extract-utilities" Oct 08 21:08:55 crc kubenswrapper[4669]: E1008 21:08:55.590044 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d3acdee-2992-4fa2-bcb3-e1ae588325d1" containerName="extract-content" Oct 08 21:08:55 crc kubenswrapper[4669]: I1008 21:08:55.590050 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d3acdee-2992-4fa2-bcb3-e1ae588325d1" containerName="extract-content" Oct 08 21:08:55 crc kubenswrapper[4669]: I1008 21:08:55.590223 
4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="d54b9af7-032e-4b63-ada5-0cebab9e052d" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 08 21:08:55 crc kubenswrapper[4669]: I1008 21:08:55.590245 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d3acdee-2992-4fa2-bcb3-e1ae588325d1" containerName="registry-server" Oct 08 21:08:55 crc kubenswrapper[4669]: I1008 21:08:55.590901 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2d5kw" Oct 08 21:08:55 crc kubenswrapper[4669]: I1008 21:08:55.593002 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 08 21:08:55 crc kubenswrapper[4669]: I1008 21:08:55.597122 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9d8p9" Oct 08 21:08:55 crc kubenswrapper[4669]: I1008 21:08:55.597122 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 08 21:08:55 crc kubenswrapper[4669]: I1008 21:08:55.597500 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 08 21:08:55 crc kubenswrapper[4669]: I1008 21:08:55.601647 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2d5kw"] Oct 08 21:08:55 crc kubenswrapper[4669]: I1008 21:08:55.644351 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8f08a947-ff60-4018-80f6-0098a257eddf-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2d5kw\" (UID: \"8f08a947-ff60-4018-80f6-0098a257eddf\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2d5kw" Oct 08 21:08:55 crc kubenswrapper[4669]: I1008 
21:08:55.644399 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f08a947-ff60-4018-80f6-0098a257eddf-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2d5kw\" (UID: \"8f08a947-ff60-4018-80f6-0098a257eddf\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2d5kw" Oct 08 21:08:55 crc kubenswrapper[4669]: I1008 21:08:55.644569 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p58gt\" (UniqueName: \"kubernetes.io/projected/8f08a947-ff60-4018-80f6-0098a257eddf-kube-api-access-p58gt\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2d5kw\" (UID: \"8f08a947-ff60-4018-80f6-0098a257eddf\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2d5kw" Oct 08 21:08:55 crc kubenswrapper[4669]: I1008 21:08:55.745956 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p58gt\" (UniqueName: \"kubernetes.io/projected/8f08a947-ff60-4018-80f6-0098a257eddf-kube-api-access-p58gt\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2d5kw\" (UID: \"8f08a947-ff60-4018-80f6-0098a257eddf\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2d5kw" Oct 08 21:08:55 crc kubenswrapper[4669]: I1008 21:08:55.746144 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8f08a947-ff60-4018-80f6-0098a257eddf-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2d5kw\" (UID: \"8f08a947-ff60-4018-80f6-0098a257eddf\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2d5kw" Oct 08 21:08:55 crc kubenswrapper[4669]: I1008 21:08:55.746196 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/8f08a947-ff60-4018-80f6-0098a257eddf-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2d5kw\" (UID: \"8f08a947-ff60-4018-80f6-0098a257eddf\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2d5kw" Oct 08 21:08:55 crc kubenswrapper[4669]: I1008 21:08:55.750070 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f08a947-ff60-4018-80f6-0098a257eddf-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2d5kw\" (UID: \"8f08a947-ff60-4018-80f6-0098a257eddf\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2d5kw" Oct 08 21:08:55 crc kubenswrapper[4669]: I1008 21:08:55.754895 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8f08a947-ff60-4018-80f6-0098a257eddf-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2d5kw\" (UID: \"8f08a947-ff60-4018-80f6-0098a257eddf\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2d5kw" Oct 08 21:08:55 crc kubenswrapper[4669]: I1008 21:08:55.769461 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p58gt\" (UniqueName: \"kubernetes.io/projected/8f08a947-ff60-4018-80f6-0098a257eddf-kube-api-access-p58gt\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2d5kw\" (UID: \"8f08a947-ff60-4018-80f6-0098a257eddf\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2d5kw" Oct 08 21:08:55 crc kubenswrapper[4669]: I1008 21:08:55.910878 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2d5kw" Oct 08 21:08:56 crc kubenswrapper[4669]: I1008 21:08:56.281810 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2d5kw"] Oct 08 21:08:56 crc kubenswrapper[4669]: I1008 21:08:56.515159 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2d5kw" event={"ID":"8f08a947-ff60-4018-80f6-0098a257eddf","Type":"ContainerStarted","Data":"eab2623b73fa2172c48b98af8ef7ad91c53457ce769e90a52f9d06b7f42b6a9d"} Oct 08 21:08:56 crc kubenswrapper[4669]: I1008 21:08:56.848214 4669 scope.go:117] "RemoveContainer" containerID="a611dbe99b2b9a1ab05c3f8dbc11695bec31bb872eee5c9ac7e0e03f2a9bf968" Oct 08 21:08:56 crc kubenswrapper[4669]: I1008 21:08:56.982930 4669 scope.go:117] "RemoveContainer" containerID="68200dd7ed67f9c07e099797a8e52944bd4a53f72bd1d77f3a7f8f8d4006599c" Oct 08 21:08:57 crc kubenswrapper[4669]: I1008 21:08:57.003773 4669 scope.go:117] "RemoveContainer" containerID="ef3ebfe38f71ff1958823c5c287522a17052c7de0c0dbb96a55de18b106d969a" Oct 08 21:08:57 crc kubenswrapper[4669]: I1008 21:08:57.023674 4669 scope.go:117] "RemoveContainer" containerID="c3a915140c1cf43a9b2bb277e52f405b7075617df2c45c7ecc4292749e86a822" Oct 08 21:08:57 crc kubenswrapper[4669]: I1008 21:08:57.525197 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2d5kw" event={"ID":"8f08a947-ff60-4018-80f6-0098a257eddf","Type":"ContainerStarted","Data":"e3cb1189db4b95f82129a28ec73d23ef675bf7ecf09cf36ec4a51bcee0e3f833"} Oct 08 21:08:57 crc kubenswrapper[4669]: I1008 21:08:57.542155 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2d5kw" podStartSLOduration=1.9882109140000002 podStartE2EDuration="2.542135577s" 
podCreationTimestamp="2025-10-08 21:08:55 +0000 UTC" firstStartedPulling="2025-10-08 21:08:56.287263187 +0000 UTC m=+1455.980073880" lastFinishedPulling="2025-10-08 21:08:56.84118786 +0000 UTC m=+1456.533998543" observedRunningTime="2025-10-08 21:08:57.539832584 +0000 UTC m=+1457.232643257" watchObservedRunningTime="2025-10-08 21:08:57.542135577 +0000 UTC m=+1457.234946250" Oct 08 21:08:59 crc kubenswrapper[4669]: I1008 21:08:59.382101 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nvpnw"] Oct 08 21:08:59 crc kubenswrapper[4669]: I1008 21:08:59.386035 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nvpnw" Oct 08 21:08:59 crc kubenswrapper[4669]: I1008 21:08:59.416631 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nvpnw"] Oct 08 21:08:59 crc kubenswrapper[4669]: I1008 21:08:59.536674 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3-utilities\") pod \"redhat-marketplace-nvpnw\" (UID: \"63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3\") " pod="openshift-marketplace/redhat-marketplace-nvpnw" Oct 08 21:08:59 crc kubenswrapper[4669]: I1008 21:08:59.536993 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbttg\" (UniqueName: \"kubernetes.io/projected/63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3-kube-api-access-qbttg\") pod \"redhat-marketplace-nvpnw\" (UID: \"63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3\") " pod="openshift-marketplace/redhat-marketplace-nvpnw" Oct 08 21:08:59 crc kubenswrapper[4669]: I1008 21:08:59.537402 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3-catalog-content\") pod \"redhat-marketplace-nvpnw\" (UID: \"63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3\") " pod="openshift-marketplace/redhat-marketplace-nvpnw" Oct 08 21:08:59 crc kubenswrapper[4669]: I1008 21:08:59.639926 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3-utilities\") pod \"redhat-marketplace-nvpnw\" (UID: \"63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3\") " pod="openshift-marketplace/redhat-marketplace-nvpnw" Oct 08 21:08:59 crc kubenswrapper[4669]: I1008 21:08:59.640032 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbttg\" (UniqueName: \"kubernetes.io/projected/63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3-kube-api-access-qbttg\") pod \"redhat-marketplace-nvpnw\" (UID: \"63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3\") " pod="openshift-marketplace/redhat-marketplace-nvpnw" Oct 08 21:08:59 crc kubenswrapper[4669]: I1008 21:08:59.640097 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3-catalog-content\") pod \"redhat-marketplace-nvpnw\" (UID: \"63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3\") " pod="openshift-marketplace/redhat-marketplace-nvpnw" Oct 08 21:08:59 crc kubenswrapper[4669]: I1008 21:08:59.640450 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3-utilities\") pod \"redhat-marketplace-nvpnw\" (UID: \"63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3\") " pod="openshift-marketplace/redhat-marketplace-nvpnw" Oct 08 21:08:59 crc kubenswrapper[4669]: I1008 21:08:59.640467 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3-catalog-content\") pod \"redhat-marketplace-nvpnw\" (UID: \"63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3\") " pod="openshift-marketplace/redhat-marketplace-nvpnw" Oct 08 21:08:59 crc kubenswrapper[4669]: I1008 21:08:59.669335 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbttg\" (UniqueName: \"kubernetes.io/projected/63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3-kube-api-access-qbttg\") pod \"redhat-marketplace-nvpnw\" (UID: \"63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3\") " pod="openshift-marketplace/redhat-marketplace-nvpnw" Oct 08 21:08:59 crc kubenswrapper[4669]: I1008 21:08:59.712772 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nvpnw" Oct 08 21:09:00 crc kubenswrapper[4669]: I1008 21:09:00.240324 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nvpnw"] Oct 08 21:09:00 crc kubenswrapper[4669]: W1008 21:09:00.242564 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63e49ec1_ac26_4a6c_95bd_f4a1aa29bdd3.slice/crio-68c73e7f1225face38e0a86e8ea993c77b521601419c8645694eb8072bb19374 WatchSource:0}: Error finding container 68c73e7f1225face38e0a86e8ea993c77b521601419c8645694eb8072bb19374: Status 404 returned error can't find the container with id 68c73e7f1225face38e0a86e8ea993c77b521601419c8645694eb8072bb19374 Oct 08 21:09:00 crc kubenswrapper[4669]: I1008 21:09:00.559746 4669 generic.go:334] "Generic (PLEG): container finished" podID="63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3" containerID="e4f74b9bac1ac7c8059239c5cc97844d97a0982db076aa989d1d8521df022fb6" exitCode=0 Oct 08 21:09:00 crc kubenswrapper[4669]: I1008 21:09:00.559826 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nvpnw" 
event={"ID":"63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3","Type":"ContainerDied","Data":"e4f74b9bac1ac7c8059239c5cc97844d97a0982db076aa989d1d8521df022fb6"} Oct 08 21:09:00 crc kubenswrapper[4669]: I1008 21:09:00.560173 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nvpnw" event={"ID":"63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3","Type":"ContainerStarted","Data":"68c73e7f1225face38e0a86e8ea993c77b521601419c8645694eb8072bb19374"} Oct 08 21:09:01 crc kubenswrapper[4669]: I1008 21:09:01.573420 4669 generic.go:334] "Generic (PLEG): container finished" podID="63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3" containerID="d3b40e08472ce8c822d29df7658a79ca8e75a4a47e014406a867307ce6aa92a4" exitCode=0 Oct 08 21:09:01 crc kubenswrapper[4669]: I1008 21:09:01.573509 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nvpnw" event={"ID":"63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3","Type":"ContainerDied","Data":"d3b40e08472ce8c822d29df7658a79ca8e75a4a47e014406a867307ce6aa92a4"} Oct 08 21:09:02 crc kubenswrapper[4669]: I1008 21:09:02.587739 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nvpnw" event={"ID":"63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3","Type":"ContainerStarted","Data":"c3cc24a8331288861003a117692482f36653ad9fa6f172d19332a283e53238ef"} Oct 08 21:09:02 crc kubenswrapper[4669]: I1008 21:09:02.613790 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nvpnw" podStartSLOduration=2.115941935 podStartE2EDuration="3.613753992s" podCreationTimestamp="2025-10-08 21:08:59 +0000 UTC" firstStartedPulling="2025-10-08 21:09:00.56303868 +0000 UTC m=+1460.255849383" lastFinishedPulling="2025-10-08 21:09:02.060850727 +0000 UTC m=+1461.753661440" observedRunningTime="2025-10-08 21:09:02.604107929 +0000 UTC m=+1462.296918602" watchObservedRunningTime="2025-10-08 21:09:02.613753992 +0000 UTC 
m=+1462.306564665" Oct 08 21:09:09 crc kubenswrapper[4669]: I1008 21:09:09.713399 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nvpnw" Oct 08 21:09:09 crc kubenswrapper[4669]: I1008 21:09:09.714080 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nvpnw" Oct 08 21:09:09 crc kubenswrapper[4669]: I1008 21:09:09.807176 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nvpnw" Oct 08 21:09:10 crc kubenswrapper[4669]: I1008 21:09:10.728390 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nvpnw" Oct 08 21:09:10 crc kubenswrapper[4669]: I1008 21:09:10.777853 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nvpnw"] Oct 08 21:09:12 crc kubenswrapper[4669]: I1008 21:09:12.699858 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nvpnw" podUID="63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3" containerName="registry-server" containerID="cri-o://c3cc24a8331288861003a117692482f36653ad9fa6f172d19332a283e53238ef" gracePeriod=2 Oct 08 21:09:13 crc kubenswrapper[4669]: I1008 21:09:13.162088 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nvpnw" Oct 08 21:09:13 crc kubenswrapper[4669]: I1008 21:09:13.185379 4669 patch_prober.go:28] interesting pod/machine-config-daemon-hw2kf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 21:09:13 crc kubenswrapper[4669]: I1008 21:09:13.185429 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 21:09:13 crc kubenswrapper[4669]: I1008 21:09:13.185537 4669 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" Oct 08 21:09:13 crc kubenswrapper[4669]: I1008 21:09:13.186297 4669 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"92ec41f270c02e5372a81a61c641b036c617c32af9093736d30bfad2ba880074"} pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 21:09:13 crc kubenswrapper[4669]: I1008 21:09:13.186349 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" containerID="cri-o://92ec41f270c02e5372a81a61c641b036c617c32af9093736d30bfad2ba880074" gracePeriod=600 Oct 08 21:09:13 crc kubenswrapper[4669]: E1008 21:09:13.314241 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:09:13 crc kubenswrapper[4669]: I1008 21:09:13.349403 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3-catalog-content\") pod \"63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3\" (UID: \"63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3\") " Oct 08 21:09:13 crc kubenswrapper[4669]: I1008 21:09:13.349486 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbttg\" (UniqueName: \"kubernetes.io/projected/63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3-kube-api-access-qbttg\") pod \"63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3\" (UID: \"63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3\") " Oct 08 21:09:13 crc kubenswrapper[4669]: I1008 21:09:13.349760 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3-utilities\") pod \"63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3\" (UID: \"63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3\") " Oct 08 21:09:13 crc kubenswrapper[4669]: I1008 21:09:13.351169 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3-utilities" (OuterVolumeSpecName: "utilities") pod "63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3" (UID: "63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:09:13 crc kubenswrapper[4669]: I1008 21:09:13.356250 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3-kube-api-access-qbttg" (OuterVolumeSpecName: "kube-api-access-qbttg") pod "63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3" (UID: "63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3"). InnerVolumeSpecName "kube-api-access-qbttg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:09:13 crc kubenswrapper[4669]: I1008 21:09:13.362164 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3" (UID: "63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:09:13 crc kubenswrapper[4669]: I1008 21:09:13.453014 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 21:09:13 crc kubenswrapper[4669]: I1008 21:09:13.453050 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 21:09:13 crc kubenswrapper[4669]: I1008 21:09:13.453066 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbttg\" (UniqueName: \"kubernetes.io/projected/63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3-kube-api-access-qbttg\") on node \"crc\" DevicePath \"\"" Oct 08 21:09:13 crc kubenswrapper[4669]: I1008 21:09:13.713079 4669 generic.go:334] "Generic (PLEG): container finished" podID="39c9bcf2-9580-4534-8c7e-886bd4aff469" 
containerID="92ec41f270c02e5372a81a61c641b036c617c32af9093736d30bfad2ba880074" exitCode=0 Oct 08 21:09:13 crc kubenswrapper[4669]: I1008 21:09:13.713291 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" event={"ID":"39c9bcf2-9580-4534-8c7e-886bd4aff469","Type":"ContainerDied","Data":"92ec41f270c02e5372a81a61c641b036c617c32af9093736d30bfad2ba880074"} Oct 08 21:09:13 crc kubenswrapper[4669]: I1008 21:09:13.713436 4669 scope.go:117] "RemoveContainer" containerID="7d9c1843fe5022c993d21347e4a9434b43afdaabe05d722d7ab0e85541c9821e" Oct 08 21:09:13 crc kubenswrapper[4669]: I1008 21:09:13.714143 4669 scope.go:117] "RemoveContainer" containerID="92ec41f270c02e5372a81a61c641b036c617c32af9093736d30bfad2ba880074" Oct 08 21:09:13 crc kubenswrapper[4669]: E1008 21:09:13.714500 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:09:13 crc kubenswrapper[4669]: I1008 21:09:13.718324 4669 generic.go:334] "Generic (PLEG): container finished" podID="63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3" containerID="c3cc24a8331288861003a117692482f36653ad9fa6f172d19332a283e53238ef" exitCode=0 Oct 08 21:09:13 crc kubenswrapper[4669]: I1008 21:09:13.718367 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nvpnw" event={"ID":"63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3","Type":"ContainerDied","Data":"c3cc24a8331288861003a117692482f36653ad9fa6f172d19332a283e53238ef"} Oct 08 21:09:13 crc kubenswrapper[4669]: I1008 21:09:13.718395 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-nvpnw" event={"ID":"63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3","Type":"ContainerDied","Data":"68c73e7f1225face38e0a86e8ea993c77b521601419c8645694eb8072bb19374"} Oct 08 21:09:13 crc kubenswrapper[4669]: I1008 21:09:13.718399 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nvpnw" Oct 08 21:09:13 crc kubenswrapper[4669]: I1008 21:09:13.760849 4669 scope.go:117] "RemoveContainer" containerID="c3cc24a8331288861003a117692482f36653ad9fa6f172d19332a283e53238ef" Oct 08 21:09:13 crc kubenswrapper[4669]: I1008 21:09:13.791353 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nvpnw"] Oct 08 21:09:13 crc kubenswrapper[4669]: I1008 21:09:13.791592 4669 scope.go:117] "RemoveContainer" containerID="d3b40e08472ce8c822d29df7658a79ca8e75a4a47e014406a867307ce6aa92a4" Oct 08 21:09:13 crc kubenswrapper[4669]: I1008 21:09:13.804411 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nvpnw"] Oct 08 21:09:13 crc kubenswrapper[4669]: I1008 21:09:13.828004 4669 scope.go:117] "RemoveContainer" containerID="e4f74b9bac1ac7c8059239c5cc97844d97a0982db076aa989d1d8521df022fb6" Oct 08 21:09:13 crc kubenswrapper[4669]: I1008 21:09:13.884064 4669 scope.go:117] "RemoveContainer" containerID="c3cc24a8331288861003a117692482f36653ad9fa6f172d19332a283e53238ef" Oct 08 21:09:13 crc kubenswrapper[4669]: E1008 21:09:13.884494 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3cc24a8331288861003a117692482f36653ad9fa6f172d19332a283e53238ef\": container with ID starting with c3cc24a8331288861003a117692482f36653ad9fa6f172d19332a283e53238ef not found: ID does not exist" containerID="c3cc24a8331288861003a117692482f36653ad9fa6f172d19332a283e53238ef" Oct 08 21:09:13 crc kubenswrapper[4669]: I1008 21:09:13.884583 4669 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3cc24a8331288861003a117692482f36653ad9fa6f172d19332a283e53238ef"} err="failed to get container status \"c3cc24a8331288861003a117692482f36653ad9fa6f172d19332a283e53238ef\": rpc error: code = NotFound desc = could not find container \"c3cc24a8331288861003a117692482f36653ad9fa6f172d19332a283e53238ef\": container with ID starting with c3cc24a8331288861003a117692482f36653ad9fa6f172d19332a283e53238ef not found: ID does not exist" Oct 08 21:09:13 crc kubenswrapper[4669]: I1008 21:09:13.884607 4669 scope.go:117] "RemoveContainer" containerID="d3b40e08472ce8c822d29df7658a79ca8e75a4a47e014406a867307ce6aa92a4" Oct 08 21:09:13 crc kubenswrapper[4669]: E1008 21:09:13.884822 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3b40e08472ce8c822d29df7658a79ca8e75a4a47e014406a867307ce6aa92a4\": container with ID starting with d3b40e08472ce8c822d29df7658a79ca8e75a4a47e014406a867307ce6aa92a4 not found: ID does not exist" containerID="d3b40e08472ce8c822d29df7658a79ca8e75a4a47e014406a867307ce6aa92a4" Oct 08 21:09:13 crc kubenswrapper[4669]: I1008 21:09:13.884845 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3b40e08472ce8c822d29df7658a79ca8e75a4a47e014406a867307ce6aa92a4"} err="failed to get container status \"d3b40e08472ce8c822d29df7658a79ca8e75a4a47e014406a867307ce6aa92a4\": rpc error: code = NotFound desc = could not find container \"d3b40e08472ce8c822d29df7658a79ca8e75a4a47e014406a867307ce6aa92a4\": container with ID starting with d3b40e08472ce8c822d29df7658a79ca8e75a4a47e014406a867307ce6aa92a4 not found: ID does not exist" Oct 08 21:09:13 crc kubenswrapper[4669]: I1008 21:09:13.884858 4669 scope.go:117] "RemoveContainer" containerID="e4f74b9bac1ac7c8059239c5cc97844d97a0982db076aa989d1d8521df022fb6" Oct 08 21:09:13 crc kubenswrapper[4669]: E1008 
21:09:13.885213 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4f74b9bac1ac7c8059239c5cc97844d97a0982db076aa989d1d8521df022fb6\": container with ID starting with e4f74b9bac1ac7c8059239c5cc97844d97a0982db076aa989d1d8521df022fb6 not found: ID does not exist" containerID="e4f74b9bac1ac7c8059239c5cc97844d97a0982db076aa989d1d8521df022fb6" Oct 08 21:09:13 crc kubenswrapper[4669]: I1008 21:09:13.885237 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4f74b9bac1ac7c8059239c5cc97844d97a0982db076aa989d1d8521df022fb6"} err="failed to get container status \"e4f74b9bac1ac7c8059239c5cc97844d97a0982db076aa989d1d8521df022fb6\": rpc error: code = NotFound desc = could not find container \"e4f74b9bac1ac7c8059239c5cc97844d97a0982db076aa989d1d8521df022fb6\": container with ID starting with e4f74b9bac1ac7c8059239c5cc97844d97a0982db076aa989d1d8521df022fb6 not found: ID does not exist" Oct 08 21:09:15 crc kubenswrapper[4669]: I1008 21:09:15.345693 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3" path="/var/lib/kubelet/pods/63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3/volumes" Oct 08 21:09:28 crc kubenswrapper[4669]: I1008 21:09:28.331306 4669 scope.go:117] "RemoveContainer" containerID="92ec41f270c02e5372a81a61c641b036c617c32af9093736d30bfad2ba880074" Oct 08 21:09:28 crc kubenswrapper[4669]: E1008 21:09:28.332101 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:09:39 crc kubenswrapper[4669]: I1008 21:09:39.330791 
4669 scope.go:117] "RemoveContainer" containerID="92ec41f270c02e5372a81a61c641b036c617c32af9093736d30bfad2ba880074" Oct 08 21:09:39 crc kubenswrapper[4669]: E1008 21:09:39.332442 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:09:52 crc kubenswrapper[4669]: I1008 21:09:52.330932 4669 scope.go:117] "RemoveContainer" containerID="92ec41f270c02e5372a81a61c641b036c617c32af9093736d30bfad2ba880074" Oct 08 21:09:52 crc kubenswrapper[4669]: E1008 21:09:52.331891 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:09:56 crc kubenswrapper[4669]: I1008 21:09:56.054428 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-wl4fg"] Oct 08 21:09:56 crc kubenswrapper[4669]: I1008 21:09:56.068914 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-t2lt9"] Oct 08 21:09:56 crc kubenswrapper[4669]: I1008 21:09:56.082568 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-zgpxd"] Oct 08 21:09:56 crc kubenswrapper[4669]: I1008 21:09:56.090752 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-wl4fg"] Oct 08 21:09:56 crc kubenswrapper[4669]: I1008 21:09:56.098326 4669 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-t2lt9"] Oct 08 21:09:56 crc kubenswrapper[4669]: I1008 21:09:56.104887 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-zgpxd"] Oct 08 21:09:57 crc kubenswrapper[4669]: I1008 21:09:57.343376 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a191427-574e-4eb1-bb13-c6e494b4ca5b" path="/var/lib/kubelet/pods/0a191427-574e-4eb1-bb13-c6e494b4ca5b/volumes" Oct 08 21:09:57 crc kubenswrapper[4669]: I1008 21:09:57.344350 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39e5e134-19bb-471d-b4d2-c3344d70fdd1" path="/var/lib/kubelet/pods/39e5e134-19bb-471d-b4d2-c3344d70fdd1/volumes" Oct 08 21:09:57 crc kubenswrapper[4669]: I1008 21:09:57.345036 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed80c2fd-2488-49d3-a30a-137a44370e04" path="/var/lib/kubelet/pods/ed80c2fd-2488-49d3-a30a-137a44370e04/volumes" Oct 08 21:10:05 crc kubenswrapper[4669]: I1008 21:10:05.041142 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-0587-account-create-qbln8"] Oct 08 21:10:05 crc kubenswrapper[4669]: I1008 21:10:05.050741 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-0587-account-create-qbln8"] Oct 08 21:10:05 crc kubenswrapper[4669]: I1008 21:10:05.341181 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52156717-724a-4a76-aee8-73e4029ea3a4" path="/var/lib/kubelet/pods/52156717-724a-4a76-aee8-73e4029ea3a4/volumes" Oct 08 21:10:06 crc kubenswrapper[4669]: I1008 21:10:06.027767 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-4aa4-account-create-g5gbd"] Oct 08 21:10:06 crc kubenswrapper[4669]: I1008 21:10:06.035704 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-efd5-account-create-vtsr8"] Oct 08 21:10:06 crc kubenswrapper[4669]: I1008 21:10:06.043869 4669 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-efd5-account-create-vtsr8"] Oct 08 21:10:06 crc kubenswrapper[4669]: I1008 21:10:06.051775 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-4aa4-account-create-g5gbd"] Oct 08 21:10:06 crc kubenswrapper[4669]: I1008 21:10:06.331349 4669 scope.go:117] "RemoveContainer" containerID="92ec41f270c02e5372a81a61c641b036c617c32af9093736d30bfad2ba880074" Oct 08 21:10:06 crc kubenswrapper[4669]: E1008 21:10:06.331622 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:10:07 crc kubenswrapper[4669]: I1008 21:10:07.349606 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b9f456f-98d4-4073-a9e5-78c4426cdc20" path="/var/lib/kubelet/pods/8b9f456f-98d4-4073-a9e5-78c4426cdc20/volumes" Oct 08 21:10:07 crc kubenswrapper[4669]: I1008 21:10:07.350246 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecfd360c-b8d6-4e66-bc00-96b2dd0252f5" path="/var/lib/kubelet/pods/ecfd360c-b8d6-4e66-bc00-96b2dd0252f5/volumes" Oct 08 21:10:17 crc kubenswrapper[4669]: I1008 21:10:17.331509 4669 scope.go:117] "RemoveContainer" containerID="92ec41f270c02e5372a81a61c641b036c617c32af9093736d30bfad2ba880074" Oct 08 21:10:17 crc kubenswrapper[4669]: E1008 21:10:17.332618 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:10:27 crc kubenswrapper[4669]: I1008 21:10:27.043715 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-gt2lp"] Oct 08 21:10:27 crc kubenswrapper[4669]: I1008 21:10:27.055162 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-lkvvk"] Oct 08 21:10:27 crc kubenswrapper[4669]: I1008 21:10:27.066286 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-r5bs4"] Oct 08 21:10:27 crc kubenswrapper[4669]: I1008 21:10:27.075286 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-gt2lp"] Oct 08 21:10:27 crc kubenswrapper[4669]: I1008 21:10:27.084030 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-r5bs4"] Oct 08 21:10:27 crc kubenswrapper[4669]: I1008 21:10:27.093179 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-lkvvk"] Oct 08 21:10:27 crc kubenswrapper[4669]: I1008 21:10:27.343625 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3fe45b8-92b8-47ff-9f71-c908c64e2866" path="/var/lib/kubelet/pods/a3fe45b8-92b8-47ff-9f71-c908c64e2866/volumes" Oct 08 21:10:27 crc kubenswrapper[4669]: I1008 21:10:27.344250 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2134180-0294-4e53-bc38-5d062b5585ff" path="/var/lib/kubelet/pods/f2134180-0294-4e53-bc38-5d062b5585ff/volumes" Oct 08 21:10:27 crc kubenswrapper[4669]: I1008 21:10:27.345125 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffe90a4e-1ec4-4628-bada-7e2d0eedb4d7" path="/var/lib/kubelet/pods/ffe90a4e-1ec4-4628-bada-7e2d0eedb4d7/volumes" Oct 08 21:10:28 crc kubenswrapper[4669]: I1008 21:10:28.496591 4669 
generic.go:334] "Generic (PLEG): container finished" podID="8f08a947-ff60-4018-80f6-0098a257eddf" containerID="e3cb1189db4b95f82129a28ec73d23ef675bf7ecf09cf36ec4a51bcee0e3f833" exitCode=0 Oct 08 21:10:28 crc kubenswrapper[4669]: I1008 21:10:28.497027 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2d5kw" event={"ID":"8f08a947-ff60-4018-80f6-0098a257eddf","Type":"ContainerDied","Data":"e3cb1189db4b95f82129a28ec73d23ef675bf7ecf09cf36ec4a51bcee0e3f833"} Oct 08 21:10:30 crc kubenswrapper[4669]: I1008 21:10:30.011771 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2d5kw" Oct 08 21:10:30 crc kubenswrapper[4669]: I1008 21:10:30.027256 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-8b8fw"] Oct 08 21:10:30 crc kubenswrapper[4669]: I1008 21:10:30.044078 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-8b8fw"] Oct 08 21:10:30 crc kubenswrapper[4669]: I1008 21:10:30.113752 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f08a947-ff60-4018-80f6-0098a257eddf-inventory\") pod \"8f08a947-ff60-4018-80f6-0098a257eddf\" (UID: \"8f08a947-ff60-4018-80f6-0098a257eddf\") " Oct 08 21:10:30 crc kubenswrapper[4669]: I1008 21:10:30.113979 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p58gt\" (UniqueName: \"kubernetes.io/projected/8f08a947-ff60-4018-80f6-0098a257eddf-kube-api-access-p58gt\") pod \"8f08a947-ff60-4018-80f6-0098a257eddf\" (UID: \"8f08a947-ff60-4018-80f6-0098a257eddf\") " Oct 08 21:10:30 crc kubenswrapper[4669]: I1008 21:10:30.114621 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/8f08a947-ff60-4018-80f6-0098a257eddf-ssh-key\") pod \"8f08a947-ff60-4018-80f6-0098a257eddf\" (UID: \"8f08a947-ff60-4018-80f6-0098a257eddf\") " Oct 08 21:10:30 crc kubenswrapper[4669]: I1008 21:10:30.123723 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f08a947-ff60-4018-80f6-0098a257eddf-kube-api-access-p58gt" (OuterVolumeSpecName: "kube-api-access-p58gt") pod "8f08a947-ff60-4018-80f6-0098a257eddf" (UID: "8f08a947-ff60-4018-80f6-0098a257eddf"). InnerVolumeSpecName "kube-api-access-p58gt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:10:30 crc kubenswrapper[4669]: I1008 21:10:30.140246 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f08a947-ff60-4018-80f6-0098a257eddf-inventory" (OuterVolumeSpecName: "inventory") pod "8f08a947-ff60-4018-80f6-0098a257eddf" (UID: "8f08a947-ff60-4018-80f6-0098a257eddf"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:10:30 crc kubenswrapper[4669]: I1008 21:10:30.142852 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f08a947-ff60-4018-80f6-0098a257eddf-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8f08a947-ff60-4018-80f6-0098a257eddf" (UID: "8f08a947-ff60-4018-80f6-0098a257eddf"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:10:30 crc kubenswrapper[4669]: I1008 21:10:30.217519 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p58gt\" (UniqueName: \"kubernetes.io/projected/8f08a947-ff60-4018-80f6-0098a257eddf-kube-api-access-p58gt\") on node \"crc\" DevicePath \"\"" Oct 08 21:10:30 crc kubenswrapper[4669]: I1008 21:10:30.217650 4669 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8f08a947-ff60-4018-80f6-0098a257eddf-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 08 21:10:30 crc kubenswrapper[4669]: I1008 21:10:30.217672 4669 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f08a947-ff60-4018-80f6-0098a257eddf-inventory\") on node \"crc\" DevicePath \"\"" Oct 08 21:10:30 crc kubenswrapper[4669]: I1008 21:10:30.523864 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2d5kw" event={"ID":"8f08a947-ff60-4018-80f6-0098a257eddf","Type":"ContainerDied","Data":"eab2623b73fa2172c48b98af8ef7ad91c53457ce769e90a52f9d06b7f42b6a9d"} Oct 08 21:10:30 crc kubenswrapper[4669]: I1008 21:10:30.523933 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eab2623b73fa2172c48b98af8ef7ad91c53457ce769e90a52f9d06b7f42b6a9d" Oct 08 21:10:30 crc kubenswrapper[4669]: I1008 21:10:30.523897 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2d5kw" Oct 08 21:10:30 crc kubenswrapper[4669]: I1008 21:10:30.631986 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6gzjx"] Oct 08 21:10:30 crc kubenswrapper[4669]: E1008 21:10:30.632741 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3" containerName="registry-server" Oct 08 21:10:30 crc kubenswrapper[4669]: I1008 21:10:30.632760 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3" containerName="registry-server" Oct 08 21:10:30 crc kubenswrapper[4669]: E1008 21:10:30.632801 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f08a947-ff60-4018-80f6-0098a257eddf" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 08 21:10:30 crc kubenswrapper[4669]: I1008 21:10:30.632813 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f08a947-ff60-4018-80f6-0098a257eddf" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 08 21:10:30 crc kubenswrapper[4669]: E1008 21:10:30.632831 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3" containerName="extract-content" Oct 08 21:10:30 crc kubenswrapper[4669]: I1008 21:10:30.632839 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3" containerName="extract-content" Oct 08 21:10:30 crc kubenswrapper[4669]: E1008 21:10:30.632856 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3" containerName="extract-utilities" Oct 08 21:10:30 crc kubenswrapper[4669]: I1008 21:10:30.632864 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3" containerName="extract-utilities" Oct 08 21:10:30 crc kubenswrapper[4669]: I1008 
21:10:30.633072 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f08a947-ff60-4018-80f6-0098a257eddf" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 08 21:10:30 crc kubenswrapper[4669]: I1008 21:10:30.633094 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="63e49ec1-ac26-4a6c-95bd-f4a1aa29bdd3" containerName="registry-server" Oct 08 21:10:30 crc kubenswrapper[4669]: I1008 21:10:30.633902 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6gzjx" Oct 08 21:10:30 crc kubenswrapper[4669]: I1008 21:10:30.636137 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 08 21:10:30 crc kubenswrapper[4669]: I1008 21:10:30.636739 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 08 21:10:30 crc kubenswrapper[4669]: I1008 21:10:30.637914 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9d8p9" Oct 08 21:10:30 crc kubenswrapper[4669]: I1008 21:10:30.639642 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 08 21:10:30 crc kubenswrapper[4669]: I1008 21:10:30.651478 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6gzjx"] Oct 08 21:10:30 crc kubenswrapper[4669]: I1008 21:10:30.728475 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e84616ce-4d73-4f8f-85b3-cca04e509792-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6gzjx\" (UID: \"e84616ce-4d73-4f8f-85b3-cca04e509792\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6gzjx" Oct 08 21:10:30 crc 
kubenswrapper[4669]: I1008 21:10:30.728617 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e84616ce-4d73-4f8f-85b3-cca04e509792-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6gzjx\" (UID: \"e84616ce-4d73-4f8f-85b3-cca04e509792\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6gzjx" Oct 08 21:10:30 crc kubenswrapper[4669]: I1008 21:10:30.728645 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24cm6\" (UniqueName: \"kubernetes.io/projected/e84616ce-4d73-4f8f-85b3-cca04e509792-kube-api-access-24cm6\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6gzjx\" (UID: \"e84616ce-4d73-4f8f-85b3-cca04e509792\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6gzjx" Oct 08 21:10:30 crc kubenswrapper[4669]: I1008 21:10:30.830370 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e84616ce-4d73-4f8f-85b3-cca04e509792-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6gzjx\" (UID: \"e84616ce-4d73-4f8f-85b3-cca04e509792\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6gzjx" Oct 08 21:10:30 crc kubenswrapper[4669]: I1008 21:10:30.830515 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e84616ce-4d73-4f8f-85b3-cca04e509792-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6gzjx\" (UID: \"e84616ce-4d73-4f8f-85b3-cca04e509792\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6gzjx" Oct 08 21:10:30 crc kubenswrapper[4669]: I1008 21:10:30.830564 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24cm6\" (UniqueName: 
\"kubernetes.io/projected/e84616ce-4d73-4f8f-85b3-cca04e509792-kube-api-access-24cm6\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6gzjx\" (UID: \"e84616ce-4d73-4f8f-85b3-cca04e509792\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6gzjx" Oct 08 21:10:30 crc kubenswrapper[4669]: I1008 21:10:30.837087 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e84616ce-4d73-4f8f-85b3-cca04e509792-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6gzjx\" (UID: \"e84616ce-4d73-4f8f-85b3-cca04e509792\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6gzjx" Oct 08 21:10:30 crc kubenswrapper[4669]: I1008 21:10:30.838211 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e84616ce-4d73-4f8f-85b3-cca04e509792-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6gzjx\" (UID: \"e84616ce-4d73-4f8f-85b3-cca04e509792\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6gzjx" Oct 08 21:10:30 crc kubenswrapper[4669]: I1008 21:10:30.852058 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24cm6\" (UniqueName: \"kubernetes.io/projected/e84616ce-4d73-4f8f-85b3-cca04e509792-kube-api-access-24cm6\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6gzjx\" (UID: \"e84616ce-4d73-4f8f-85b3-cca04e509792\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6gzjx" Oct 08 21:10:30 crc kubenswrapper[4669]: I1008 21:10:30.955813 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6gzjx" Oct 08 21:10:31 crc kubenswrapper[4669]: I1008 21:10:31.052828 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-z2w59"] Oct 08 21:10:31 crc kubenswrapper[4669]: I1008 21:10:31.060819 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-z2w59"] Oct 08 21:10:31 crc kubenswrapper[4669]: I1008 21:10:31.346903 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="057267cc-ede4-488e-94a2-37caa8cb9557" path="/var/lib/kubelet/pods/057267cc-ede4-488e-94a2-37caa8cb9557/volumes" Oct 08 21:10:31 crc kubenswrapper[4669]: I1008 21:10:31.348802 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cbf7e56-6c38-4ee3-8096-875162b3576f" path="/var/lib/kubelet/pods/9cbf7e56-6c38-4ee3-8096-875162b3576f/volumes" Oct 08 21:10:31 crc kubenswrapper[4669]: I1008 21:10:31.575073 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6gzjx"] Oct 08 21:10:31 crc kubenswrapper[4669]: I1008 21:10:31.577266 4669 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 21:10:32 crc kubenswrapper[4669]: I1008 21:10:32.331418 4669 scope.go:117] "RemoveContainer" containerID="92ec41f270c02e5372a81a61c641b036c617c32af9093736d30bfad2ba880074" Oct 08 21:10:32 crc kubenswrapper[4669]: E1008 21:10:32.331672 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:10:32 crc kubenswrapper[4669]: I1008 
21:10:32.546775 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6gzjx" event={"ID":"e84616ce-4d73-4f8f-85b3-cca04e509792","Type":"ContainerStarted","Data":"94a92af6ea51115ea913ae5195223818e128544d601fa32e4dd340fde53c549b"} Oct 08 21:10:32 crc kubenswrapper[4669]: I1008 21:10:32.547458 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6gzjx" event={"ID":"e84616ce-4d73-4f8f-85b3-cca04e509792","Type":"ContainerStarted","Data":"72eb1728b8edb198f90fc65efba697ad9f5eea9d1b9f2207fcd14a472989e855"} Oct 08 21:10:32 crc kubenswrapper[4669]: I1008 21:10:32.578042 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6gzjx" podStartSLOduration=2.104028641 podStartE2EDuration="2.578020018s" podCreationTimestamp="2025-10-08 21:10:30 +0000 UTC" firstStartedPulling="2025-10-08 21:10:31.577077105 +0000 UTC m=+1551.269887778" lastFinishedPulling="2025-10-08 21:10:32.051068472 +0000 UTC m=+1551.743879155" observedRunningTime="2025-10-08 21:10:32.568466437 +0000 UTC m=+1552.261277120" watchObservedRunningTime="2025-10-08 21:10:32.578020018 +0000 UTC m=+1552.270830701" Oct 08 21:10:45 crc kubenswrapper[4669]: I1008 21:10:45.332066 4669 scope.go:117] "RemoveContainer" containerID="92ec41f270c02e5372a81a61c641b036c617c32af9093736d30bfad2ba880074" Oct 08 21:10:45 crc kubenswrapper[4669]: E1008 21:10:45.332830 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:10:52 crc 
kubenswrapper[4669]: I1008 21:10:52.041073 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-3920-account-create-sw6j7"] Oct 08 21:10:52 crc kubenswrapper[4669]: I1008 21:10:52.053450 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-fb17-account-create-rc2jm"] Oct 08 21:10:52 crc kubenswrapper[4669]: I1008 21:10:52.066006 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-fb17-account-create-rc2jm"] Oct 08 21:10:52 crc kubenswrapper[4669]: I1008 21:10:52.073390 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-3920-account-create-sw6j7"] Oct 08 21:10:53 crc kubenswrapper[4669]: I1008 21:10:53.348673 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dfca695-7bff-4cb6-abdf-6e20eb14485e" path="/var/lib/kubelet/pods/3dfca695-7bff-4cb6-abdf-6e20eb14485e/volumes" Oct 08 21:10:53 crc kubenswrapper[4669]: I1008 21:10:53.349462 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="daca45a8-2ae8-4b87-9fc4-348099b37165" path="/var/lib/kubelet/pods/daca45a8-2ae8-4b87-9fc4-348099b37165/volumes" Oct 08 21:10:54 crc kubenswrapper[4669]: I1008 21:10:54.027213 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-82f4-account-create-z8lzz"] Oct 08 21:10:54 crc kubenswrapper[4669]: I1008 21:10:54.035177 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-82f4-account-create-z8lzz"] Oct 08 21:10:55 crc kubenswrapper[4669]: I1008 21:10:55.346478 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af60e370-8287-408b-af8c-fc0d5c19e37d" path="/var/lib/kubelet/pods/af60e370-8287-408b-af8c-fc0d5c19e37d/volumes" Oct 08 21:10:57 crc kubenswrapper[4669]: I1008 21:10:57.153157 4669 scope.go:117] "RemoveContainer" containerID="77ea3d9d9939f0e2434182cfc13c1f6fa00dc35709c6c5503958896ff06a05c5" Oct 08 21:10:57 crc kubenswrapper[4669]: I1008 21:10:57.195759 4669 
scope.go:117] "RemoveContainer" containerID="1c760e09b40b028156b7d84f66b605d10c2b25150422362d3cca468eb4d61bc0" Oct 08 21:10:57 crc kubenswrapper[4669]: I1008 21:10:57.266913 4669 scope.go:117] "RemoveContainer" containerID="33f0e82a3a750fcc4619d98960ce857a29fb7e4077c703ea4c10af050bef930f" Oct 08 21:10:57 crc kubenswrapper[4669]: I1008 21:10:57.303791 4669 scope.go:117] "RemoveContainer" containerID="651fe547f86a83c9a670d0ee6fa2f51d1abe5840616e9b900217192bffcb0a94" Oct 08 21:10:57 crc kubenswrapper[4669]: I1008 21:10:57.353700 4669 scope.go:117] "RemoveContainer" containerID="329440f2339893b6701e370e47f52b16dbbf6b0bccaa1c6b33e1e2c63745799c" Oct 08 21:10:57 crc kubenswrapper[4669]: I1008 21:10:57.407368 4669 scope.go:117] "RemoveContainer" containerID="f045916f1a7ed045a3ae239aaf04b76d8a44aa518da56654afc20b311f704b4e" Oct 08 21:10:57 crc kubenswrapper[4669]: I1008 21:10:57.458694 4669 scope.go:117] "RemoveContainer" containerID="6b78ce690d29ad48209f564e7a610ebed2a44fa6659b303692d0d9baf9cca697" Oct 08 21:10:57 crc kubenswrapper[4669]: I1008 21:10:57.475079 4669 scope.go:117] "RemoveContainer" containerID="30a3abfd5c38b5dd4c54dcb1e009858c35074d0e4919ba69b86e4f1215737f9b" Oct 08 21:10:57 crc kubenswrapper[4669]: I1008 21:10:57.493111 4669 scope.go:117] "RemoveContainer" containerID="6320feee73a5c5a8642b1c4b37fe47aaa8a70d18b00db9af83d8fc321a039cc8" Oct 08 21:10:57 crc kubenswrapper[4669]: I1008 21:10:57.509516 4669 scope.go:117] "RemoveContainer" containerID="1a76b531b8d72c5bae6441b61d77740d192e448d51b9c3ba40a96dbec233e0ed" Oct 08 21:10:57 crc kubenswrapper[4669]: I1008 21:10:57.540082 4669 scope.go:117] "RemoveContainer" containerID="6dd241c048d987bcfc3be789765a5bdfd8191a5c58c43fc41847a94e0f72a11c" Oct 08 21:10:57 crc kubenswrapper[4669]: I1008 21:10:57.573757 4669 scope.go:117] "RemoveContainer" containerID="9e6c17bf14bc007fd2987de2fdc1e6186e2260d168e71924c73c17f12f32d544" Oct 08 21:10:57 crc kubenswrapper[4669]: I1008 21:10:57.591958 4669 scope.go:117] 
"RemoveContainer" containerID="ec1aa2fc9ef7f9a3d5ec48f37c550f4cb060218d8d5d3d957a51d9e554fa68c0" Oct 08 21:10:57 crc kubenswrapper[4669]: I1008 21:10:57.614164 4669 scope.go:117] "RemoveContainer" containerID="8c6c92a3641e309128c39a1aac34c91c687acb49b952df26c073d7102d6ba9bd" Oct 08 21:10:59 crc kubenswrapper[4669]: I1008 21:10:59.330876 4669 scope.go:117] "RemoveContainer" containerID="92ec41f270c02e5372a81a61c641b036c617c32af9093736d30bfad2ba880074" Oct 08 21:10:59 crc kubenswrapper[4669]: E1008 21:10:59.331411 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:11:01 crc kubenswrapper[4669]: I1008 21:11:01.045436 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-q5btg"] Oct 08 21:11:01 crc kubenswrapper[4669]: I1008 21:11:01.052650 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-q5btg"] Oct 08 21:11:01 crc kubenswrapper[4669]: I1008 21:11:01.342445 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42ef52de-f02e-447a-8713-16ce12443117" path="/var/lib/kubelet/pods/42ef52de-f02e-447a-8713-16ce12443117/volumes" Oct 08 21:11:13 crc kubenswrapper[4669]: I1008 21:11:13.331594 4669 scope.go:117] "RemoveContainer" containerID="92ec41f270c02e5372a81a61c641b036c617c32af9093736d30bfad2ba880074" Oct 08 21:11:13 crc kubenswrapper[4669]: E1008 21:11:13.332740 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:11:21 crc kubenswrapper[4669]: I1008 21:11:21.034208 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-vcvj2"] Oct 08 21:11:21 crc kubenswrapper[4669]: I1008 21:11:21.044660 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-vcvj2"] Oct 08 21:11:21 crc kubenswrapper[4669]: I1008 21:11:21.371377 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a9dde7a-05ec-4edb-b865-8e16680527a5" path="/var/lib/kubelet/pods/8a9dde7a-05ec-4edb-b865-8e16680527a5/volumes" Oct 08 21:11:24 crc kubenswrapper[4669]: I1008 21:11:24.027256 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-p2bl6"] Oct 08 21:11:24 crc kubenswrapper[4669]: I1008 21:11:24.035652 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-p2bl6"] Oct 08 21:11:24 crc kubenswrapper[4669]: I1008 21:11:24.332296 4669 scope.go:117] "RemoveContainer" containerID="92ec41f270c02e5372a81a61c641b036c617c32af9093736d30bfad2ba880074" Oct 08 21:11:24 crc kubenswrapper[4669]: E1008 21:11:24.333111 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:11:25 crc kubenswrapper[4669]: I1008 21:11:25.026289 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-rflnq"] Oct 08 21:11:25 crc kubenswrapper[4669]: I1008 21:11:25.033035 
4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-rflnq"] Oct 08 21:11:25 crc kubenswrapper[4669]: I1008 21:11:25.344905 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="469707ec-e817-4d42-b406-6595799f6036" path="/var/lib/kubelet/pods/469707ec-e817-4d42-b406-6595799f6036/volumes" Oct 08 21:11:25 crc kubenswrapper[4669]: I1008 21:11:25.345676 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbdb37fe-4acd-400c-aff5-db2a90f07a32" path="/var/lib/kubelet/pods/fbdb37fe-4acd-400c-aff5-db2a90f07a32/volumes" Oct 08 21:11:37 crc kubenswrapper[4669]: I1008 21:11:37.332178 4669 scope.go:117] "RemoveContainer" containerID="92ec41f270c02e5372a81a61c641b036c617c32af9093736d30bfad2ba880074" Oct 08 21:11:37 crc kubenswrapper[4669]: E1008 21:11:37.333560 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:11:41 crc kubenswrapper[4669]: I1008 21:11:41.067191 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-j5jf6"] Oct 08 21:11:41 crc kubenswrapper[4669]: I1008 21:11:41.076260 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-j5jf6"] Oct 08 21:11:41 crc kubenswrapper[4669]: I1008 21:11:41.342055 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14b56cd8-5692-4a65-b8ad-1e39bf253846" path="/var/lib/kubelet/pods/14b56cd8-5692-4a65-b8ad-1e39bf253846/volumes" Oct 08 21:11:50 crc kubenswrapper[4669]: I1008 21:11:50.331441 4669 scope.go:117] "RemoveContainer" 
containerID="92ec41f270c02e5372a81a61c641b036c617c32af9093736d30bfad2ba880074" Oct 08 21:11:50 crc kubenswrapper[4669]: E1008 21:11:50.332638 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:11:50 crc kubenswrapper[4669]: I1008 21:11:50.385200 4669 generic.go:334] "Generic (PLEG): container finished" podID="e84616ce-4d73-4f8f-85b3-cca04e509792" containerID="94a92af6ea51115ea913ae5195223818e128544d601fa32e4dd340fde53c549b" exitCode=0 Oct 08 21:11:50 crc kubenswrapper[4669]: I1008 21:11:50.385255 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6gzjx" event={"ID":"e84616ce-4d73-4f8f-85b3-cca04e509792","Type":"ContainerDied","Data":"94a92af6ea51115ea913ae5195223818e128544d601fa32e4dd340fde53c549b"} Oct 08 21:11:51 crc kubenswrapper[4669]: I1008 21:11:51.860921 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6gzjx" Oct 08 21:11:52 crc kubenswrapper[4669]: I1008 21:11:52.004343 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24cm6\" (UniqueName: \"kubernetes.io/projected/e84616ce-4d73-4f8f-85b3-cca04e509792-kube-api-access-24cm6\") pod \"e84616ce-4d73-4f8f-85b3-cca04e509792\" (UID: \"e84616ce-4d73-4f8f-85b3-cca04e509792\") " Oct 08 21:11:52 crc kubenswrapper[4669]: I1008 21:11:52.004456 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e84616ce-4d73-4f8f-85b3-cca04e509792-ssh-key\") pod \"e84616ce-4d73-4f8f-85b3-cca04e509792\" (UID: \"e84616ce-4d73-4f8f-85b3-cca04e509792\") " Oct 08 21:11:52 crc kubenswrapper[4669]: I1008 21:11:52.004593 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e84616ce-4d73-4f8f-85b3-cca04e509792-inventory\") pod \"e84616ce-4d73-4f8f-85b3-cca04e509792\" (UID: \"e84616ce-4d73-4f8f-85b3-cca04e509792\") " Oct 08 21:11:52 crc kubenswrapper[4669]: I1008 21:11:52.010891 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e84616ce-4d73-4f8f-85b3-cca04e509792-kube-api-access-24cm6" (OuterVolumeSpecName: "kube-api-access-24cm6") pod "e84616ce-4d73-4f8f-85b3-cca04e509792" (UID: "e84616ce-4d73-4f8f-85b3-cca04e509792"). InnerVolumeSpecName "kube-api-access-24cm6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:11:52 crc kubenswrapper[4669]: I1008 21:11:52.035803 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e84616ce-4d73-4f8f-85b3-cca04e509792-inventory" (OuterVolumeSpecName: "inventory") pod "e84616ce-4d73-4f8f-85b3-cca04e509792" (UID: "e84616ce-4d73-4f8f-85b3-cca04e509792"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:11:52 crc kubenswrapper[4669]: I1008 21:11:52.039751 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e84616ce-4d73-4f8f-85b3-cca04e509792-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e84616ce-4d73-4f8f-85b3-cca04e509792" (UID: "e84616ce-4d73-4f8f-85b3-cca04e509792"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:11:52 crc kubenswrapper[4669]: I1008 21:11:52.106942 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24cm6\" (UniqueName: \"kubernetes.io/projected/e84616ce-4d73-4f8f-85b3-cca04e509792-kube-api-access-24cm6\") on node \"crc\" DevicePath \"\"" Oct 08 21:11:52 crc kubenswrapper[4669]: I1008 21:11:52.106971 4669 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e84616ce-4d73-4f8f-85b3-cca04e509792-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 08 21:11:52 crc kubenswrapper[4669]: I1008 21:11:52.106980 4669 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e84616ce-4d73-4f8f-85b3-cca04e509792-inventory\") on node \"crc\" DevicePath \"\"" Oct 08 21:11:52 crc kubenswrapper[4669]: I1008 21:11:52.410907 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6gzjx" event={"ID":"e84616ce-4d73-4f8f-85b3-cca04e509792","Type":"ContainerDied","Data":"72eb1728b8edb198f90fc65efba697ad9f5eea9d1b9f2207fcd14a472989e855"} Oct 08 21:11:52 crc kubenswrapper[4669]: I1008 21:11:52.410957 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72eb1728b8edb198f90fc65efba697ad9f5eea9d1b9f2207fcd14a472989e855" Oct 08 21:11:52 crc kubenswrapper[4669]: I1008 21:11:52.411020 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6gzjx" Oct 08 21:11:52 crc kubenswrapper[4669]: I1008 21:11:52.519524 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6xtnr"] Oct 08 21:11:52 crc kubenswrapper[4669]: E1008 21:11:52.520110 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e84616ce-4d73-4f8f-85b3-cca04e509792" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 08 21:11:52 crc kubenswrapper[4669]: I1008 21:11:52.520139 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="e84616ce-4d73-4f8f-85b3-cca04e509792" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 08 21:11:52 crc kubenswrapper[4669]: I1008 21:11:52.520484 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="e84616ce-4d73-4f8f-85b3-cca04e509792" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 08 21:11:52 crc kubenswrapper[4669]: I1008 21:11:52.521278 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6xtnr" Oct 08 21:11:52 crc kubenswrapper[4669]: I1008 21:11:52.523783 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 08 21:11:52 crc kubenswrapper[4669]: I1008 21:11:52.523848 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 08 21:11:52 crc kubenswrapper[4669]: I1008 21:11:52.523923 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9d8p9" Oct 08 21:11:52 crc kubenswrapper[4669]: I1008 21:11:52.524652 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 08 21:11:52 crc kubenswrapper[4669]: I1008 21:11:52.529608 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6xtnr"] Oct 08 21:11:52 crc kubenswrapper[4669]: I1008 21:11:52.640977 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7d06c6d-e606-429a-b7e6-e6e2609b3b4e-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6xtnr\" (UID: \"d7d06c6d-e606-429a-b7e6-e6e2609b3b4e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6xtnr" Oct 08 21:11:52 crc kubenswrapper[4669]: I1008 21:11:52.641297 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d7d06c6d-e606-429a-b7e6-e6e2609b3b4e-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6xtnr\" (UID: \"d7d06c6d-e606-429a-b7e6-e6e2609b3b4e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6xtnr" Oct 08 21:11:52 crc kubenswrapper[4669]: I1008 21:11:52.641388 4669 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mv4h\" (UniqueName: \"kubernetes.io/projected/d7d06c6d-e606-429a-b7e6-e6e2609b3b4e-kube-api-access-2mv4h\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6xtnr\" (UID: \"d7d06c6d-e606-429a-b7e6-e6e2609b3b4e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6xtnr" Oct 08 21:11:52 crc kubenswrapper[4669]: I1008 21:11:52.744485 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d7d06c6d-e606-429a-b7e6-e6e2609b3b4e-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6xtnr\" (UID: \"d7d06c6d-e606-429a-b7e6-e6e2609b3b4e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6xtnr" Oct 08 21:11:52 crc kubenswrapper[4669]: I1008 21:11:52.744682 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mv4h\" (UniqueName: \"kubernetes.io/projected/d7d06c6d-e606-429a-b7e6-e6e2609b3b4e-kube-api-access-2mv4h\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6xtnr\" (UID: \"d7d06c6d-e606-429a-b7e6-e6e2609b3b4e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6xtnr" Oct 08 21:11:52 crc kubenswrapper[4669]: I1008 21:11:52.744867 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7d06c6d-e606-429a-b7e6-e6e2609b3b4e-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6xtnr\" (UID: \"d7d06c6d-e606-429a-b7e6-e6e2609b3b4e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6xtnr" Oct 08 21:11:52 crc kubenswrapper[4669]: I1008 21:11:52.751101 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d7d06c6d-e606-429a-b7e6-e6e2609b3b4e-ssh-key\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-6xtnr\" (UID: \"d7d06c6d-e606-429a-b7e6-e6e2609b3b4e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6xtnr" Oct 08 21:11:52 crc kubenswrapper[4669]: I1008 21:11:52.752886 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7d06c6d-e606-429a-b7e6-e6e2609b3b4e-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6xtnr\" (UID: \"d7d06c6d-e606-429a-b7e6-e6e2609b3b4e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6xtnr" Oct 08 21:11:52 crc kubenswrapper[4669]: I1008 21:11:52.775489 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mv4h\" (UniqueName: \"kubernetes.io/projected/d7d06c6d-e606-429a-b7e6-e6e2609b3b4e-kube-api-access-2mv4h\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6xtnr\" (UID: \"d7d06c6d-e606-429a-b7e6-e6e2609b3b4e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6xtnr" Oct 08 21:11:52 crc kubenswrapper[4669]: I1008 21:11:52.839285 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6xtnr" Oct 08 21:11:53 crc kubenswrapper[4669]: I1008 21:11:53.399386 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6xtnr"] Oct 08 21:11:53 crc kubenswrapper[4669]: I1008 21:11:53.427060 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6xtnr" event={"ID":"d7d06c6d-e606-429a-b7e6-e6e2609b3b4e","Type":"ContainerStarted","Data":"1e89ee9c53c28d30c2f51d632a1383d0fa94002cfc865c447a1802140da6f7fe"} Oct 08 21:11:54 crc kubenswrapper[4669]: I1008 21:11:54.442601 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6xtnr" event={"ID":"d7d06c6d-e606-429a-b7e6-e6e2609b3b4e","Type":"ContainerStarted","Data":"1e915a1c77aa5110e570a512c81727d92811c1624bb4340969d3646946d51ca3"} Oct 08 21:11:54 crc kubenswrapper[4669]: I1008 21:11:54.461155 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6xtnr" podStartSLOduration=1.8407745709999999 podStartE2EDuration="2.461133998s" podCreationTimestamp="2025-10-08 21:11:52 +0000 UTC" firstStartedPulling="2025-10-08 21:11:53.41482186 +0000 UTC m=+1633.107632553" lastFinishedPulling="2025-10-08 21:11:54.035181297 +0000 UTC m=+1633.727991980" observedRunningTime="2025-10-08 21:11:54.460968864 +0000 UTC m=+1634.153779537" watchObservedRunningTime="2025-10-08 21:11:54.461133998 +0000 UTC m=+1634.153944681" Oct 08 21:11:57 crc kubenswrapper[4669]: I1008 21:11:57.876319 4669 scope.go:117] "RemoveContainer" containerID="e51526a2c5f0cb4ae683f0b94b5da21317009d601e2a29c50f9241e6aa8afb0a" Oct 08 21:11:57 crc kubenswrapper[4669]: I1008 21:11:57.914446 4669 scope.go:117] "RemoveContainer" 
containerID="e5232846acfe858005c26feae8a9f6c2a5d6c16e613e0f54525b1bb9b6aa170f" Oct 08 21:11:57 crc kubenswrapper[4669]: I1008 21:11:57.966243 4669 scope.go:117] "RemoveContainer" containerID="8da2c877511a8683c6848823d437a4855d3eaa59ff371446d7dcbe07bb26d5cd" Oct 08 21:11:57 crc kubenswrapper[4669]: I1008 21:11:57.996451 4669 scope.go:117] "RemoveContainer" containerID="eb1864d51f9a83b3a0594f07487e8a54b8d75be746e794b1b6931c856df5fbb7" Oct 08 21:11:58 crc kubenswrapper[4669]: I1008 21:11:58.064840 4669 scope.go:117] "RemoveContainer" containerID="ce92f63cb28d86f30c343c6e01a5e1156e3b99401b7c98c55a397f77bf1ee8be" Oct 08 21:11:59 crc kubenswrapper[4669]: I1008 21:11:59.039530 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-blx6m"] Oct 08 21:11:59 crc kubenswrapper[4669]: I1008 21:11:59.048841 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-lg6tp"] Oct 08 21:11:59 crc kubenswrapper[4669]: I1008 21:11:59.057668 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-tjtxx"] Oct 08 21:11:59 crc kubenswrapper[4669]: I1008 21:11:59.063591 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-blx6m"] Oct 08 21:11:59 crc kubenswrapper[4669]: I1008 21:11:59.069212 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-lg6tp"] Oct 08 21:11:59 crc kubenswrapper[4669]: I1008 21:11:59.074704 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-tjtxx"] Oct 08 21:11:59 crc kubenswrapper[4669]: I1008 21:11:59.347682 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ecc6d74-0b2e-4265-80ab-1229ce7427d2" path="/var/lib/kubelet/pods/7ecc6d74-0b2e-4265-80ab-1229ce7427d2/volumes" Oct 08 21:11:59 crc kubenswrapper[4669]: I1008 21:11:59.348775 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="9f5f87b9-2c24-4a27-9a32-1258486274ac" path="/var/lib/kubelet/pods/9f5f87b9-2c24-4a27-9a32-1258486274ac/volumes" Oct 08 21:11:59 crc kubenswrapper[4669]: I1008 21:11:59.349605 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbed2eec-1d0d-49aa-89b2-69b961ee76b8" path="/var/lib/kubelet/pods/fbed2eec-1d0d-49aa-89b2-69b961ee76b8/volumes" Oct 08 21:11:59 crc kubenswrapper[4669]: I1008 21:11:59.499749 4669 generic.go:334] "Generic (PLEG): container finished" podID="d7d06c6d-e606-429a-b7e6-e6e2609b3b4e" containerID="1e915a1c77aa5110e570a512c81727d92811c1624bb4340969d3646946d51ca3" exitCode=0 Oct 08 21:11:59 crc kubenswrapper[4669]: I1008 21:11:59.499804 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6xtnr" event={"ID":"d7d06c6d-e606-429a-b7e6-e6e2609b3b4e","Type":"ContainerDied","Data":"1e915a1c77aa5110e570a512c81727d92811c1624bb4340969d3646946d51ca3"} Oct 08 21:12:00 crc kubenswrapper[4669]: I1008 21:12:00.925564 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6xtnr" Oct 08 21:12:01 crc kubenswrapper[4669]: I1008 21:12:01.124157 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7d06c6d-e606-429a-b7e6-e6e2609b3b4e-inventory\") pod \"d7d06c6d-e606-429a-b7e6-e6e2609b3b4e\" (UID: \"d7d06c6d-e606-429a-b7e6-e6e2609b3b4e\") " Oct 08 21:12:01 crc kubenswrapper[4669]: I1008 21:12:01.124222 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mv4h\" (UniqueName: \"kubernetes.io/projected/d7d06c6d-e606-429a-b7e6-e6e2609b3b4e-kube-api-access-2mv4h\") pod \"d7d06c6d-e606-429a-b7e6-e6e2609b3b4e\" (UID: \"d7d06c6d-e606-429a-b7e6-e6e2609b3b4e\") " Oct 08 21:12:01 crc kubenswrapper[4669]: I1008 21:12:01.124309 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d7d06c6d-e606-429a-b7e6-e6e2609b3b4e-ssh-key\") pod \"d7d06c6d-e606-429a-b7e6-e6e2609b3b4e\" (UID: \"d7d06c6d-e606-429a-b7e6-e6e2609b3b4e\") " Oct 08 21:12:01 crc kubenswrapper[4669]: I1008 21:12:01.130604 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7d06c6d-e606-429a-b7e6-e6e2609b3b4e-kube-api-access-2mv4h" (OuterVolumeSpecName: "kube-api-access-2mv4h") pod "d7d06c6d-e606-429a-b7e6-e6e2609b3b4e" (UID: "d7d06c6d-e606-429a-b7e6-e6e2609b3b4e"). InnerVolumeSpecName "kube-api-access-2mv4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:12:01 crc kubenswrapper[4669]: I1008 21:12:01.160733 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7d06c6d-e606-429a-b7e6-e6e2609b3b4e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d7d06c6d-e606-429a-b7e6-e6e2609b3b4e" (UID: "d7d06c6d-e606-429a-b7e6-e6e2609b3b4e"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:12:01 crc kubenswrapper[4669]: I1008 21:12:01.166788 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7d06c6d-e606-429a-b7e6-e6e2609b3b4e-inventory" (OuterVolumeSpecName: "inventory") pod "d7d06c6d-e606-429a-b7e6-e6e2609b3b4e" (UID: "d7d06c6d-e606-429a-b7e6-e6e2609b3b4e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:12:01 crc kubenswrapper[4669]: I1008 21:12:01.226992 4669 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d7d06c6d-e606-429a-b7e6-e6e2609b3b4e-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 08 21:12:01 crc kubenswrapper[4669]: I1008 21:12:01.227254 4669 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7d06c6d-e606-429a-b7e6-e6e2609b3b4e-inventory\") on node \"crc\" DevicePath \"\"" Oct 08 21:12:01 crc kubenswrapper[4669]: I1008 21:12:01.227521 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mv4h\" (UniqueName: \"kubernetes.io/projected/d7d06c6d-e606-429a-b7e6-e6e2609b3b4e-kube-api-access-2mv4h\") on node \"crc\" DevicePath \"\"" Oct 08 21:12:01 crc kubenswrapper[4669]: I1008 21:12:01.518585 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6xtnr" event={"ID":"d7d06c6d-e606-429a-b7e6-e6e2609b3b4e","Type":"ContainerDied","Data":"1e89ee9c53c28d30c2f51d632a1383d0fa94002cfc865c447a1802140da6f7fe"} Oct 08 21:12:01 crc kubenswrapper[4669]: I1008 21:12:01.518622 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e89ee9c53c28d30c2f51d632a1383d0fa94002cfc865c447a1802140da6f7fe" Oct 08 21:12:01 crc kubenswrapper[4669]: I1008 21:12:01.519055 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6xtnr" Oct 08 21:12:01 crc kubenswrapper[4669]: I1008 21:12:01.601256 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-hk4fz"] Oct 08 21:12:01 crc kubenswrapper[4669]: E1008 21:12:01.601688 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7d06c6d-e606-429a-b7e6-e6e2609b3b4e" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 08 21:12:01 crc kubenswrapper[4669]: I1008 21:12:01.601708 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7d06c6d-e606-429a-b7e6-e6e2609b3b4e" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 08 21:12:01 crc kubenswrapper[4669]: I1008 21:12:01.601873 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7d06c6d-e606-429a-b7e6-e6e2609b3b4e" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 08 21:12:01 crc kubenswrapper[4669]: I1008 21:12:01.603363 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hk4fz" Oct 08 21:12:01 crc kubenswrapper[4669]: I1008 21:12:01.612927 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 08 21:12:01 crc kubenswrapper[4669]: I1008 21:12:01.613028 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 08 21:12:01 crc kubenswrapper[4669]: I1008 21:12:01.613189 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 08 21:12:01 crc kubenswrapper[4669]: I1008 21:12:01.622827 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9d8p9" Oct 08 21:12:01 crc kubenswrapper[4669]: I1008 21:12:01.648106 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-hk4fz"] Oct 08 21:12:01 crc kubenswrapper[4669]: I1008 21:12:01.738028 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/30bc939f-3290-4c46-8d00-120c0bf33951-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hk4fz\" (UID: \"30bc939f-3290-4c46-8d00-120c0bf33951\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hk4fz" Oct 08 21:12:01 crc kubenswrapper[4669]: I1008 21:12:01.738365 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30bc939f-3290-4c46-8d00-120c0bf33951-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hk4fz\" (UID: \"30bc939f-3290-4c46-8d00-120c0bf33951\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hk4fz" Oct 08 21:12:01 crc kubenswrapper[4669]: I1008 21:12:01.738460 4669 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqz2h\" (UniqueName: \"kubernetes.io/projected/30bc939f-3290-4c46-8d00-120c0bf33951-kube-api-access-nqz2h\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hk4fz\" (UID: \"30bc939f-3290-4c46-8d00-120c0bf33951\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hk4fz" Oct 08 21:12:01 crc kubenswrapper[4669]: I1008 21:12:01.840788 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/30bc939f-3290-4c46-8d00-120c0bf33951-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hk4fz\" (UID: \"30bc939f-3290-4c46-8d00-120c0bf33951\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hk4fz" Oct 08 21:12:01 crc kubenswrapper[4669]: I1008 21:12:01.840900 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30bc939f-3290-4c46-8d00-120c0bf33951-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hk4fz\" (UID: \"30bc939f-3290-4c46-8d00-120c0bf33951\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hk4fz" Oct 08 21:12:01 crc kubenswrapper[4669]: I1008 21:12:01.840922 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqz2h\" (UniqueName: \"kubernetes.io/projected/30bc939f-3290-4c46-8d00-120c0bf33951-kube-api-access-nqz2h\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hk4fz\" (UID: \"30bc939f-3290-4c46-8d00-120c0bf33951\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hk4fz" Oct 08 21:12:01 crc kubenswrapper[4669]: I1008 21:12:01.844481 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30bc939f-3290-4c46-8d00-120c0bf33951-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hk4fz\" (UID: 
\"30bc939f-3290-4c46-8d00-120c0bf33951\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hk4fz" Oct 08 21:12:01 crc kubenswrapper[4669]: I1008 21:12:01.849347 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/30bc939f-3290-4c46-8d00-120c0bf33951-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hk4fz\" (UID: \"30bc939f-3290-4c46-8d00-120c0bf33951\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hk4fz" Oct 08 21:12:01 crc kubenswrapper[4669]: I1008 21:12:01.857521 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqz2h\" (UniqueName: \"kubernetes.io/projected/30bc939f-3290-4c46-8d00-120c0bf33951-kube-api-access-nqz2h\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hk4fz\" (UID: \"30bc939f-3290-4c46-8d00-120c0bf33951\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hk4fz" Oct 08 21:12:01 crc kubenswrapper[4669]: I1008 21:12:01.939011 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hk4fz" Oct 08 21:12:02 crc kubenswrapper[4669]: I1008 21:12:02.254576 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-hk4fz"] Oct 08 21:12:02 crc kubenswrapper[4669]: I1008 21:12:02.527683 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hk4fz" event={"ID":"30bc939f-3290-4c46-8d00-120c0bf33951","Type":"ContainerStarted","Data":"24b7a05038503d3c36640afbb8343d616a53a5599a2e6d7e25a0ef459f337264"} Oct 08 21:12:03 crc kubenswrapper[4669]: I1008 21:12:03.535694 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hk4fz" event={"ID":"30bc939f-3290-4c46-8d00-120c0bf33951","Type":"ContainerStarted","Data":"ceb24fb6a70f68ee58725ac57baa74d9eee93167cbadf067c3fe3c5d0ad55230"} Oct 08 21:12:03 crc kubenswrapper[4669]: I1008 21:12:03.551405 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hk4fz" podStartSLOduration=1.9697289260000002 podStartE2EDuration="2.551386107s" podCreationTimestamp="2025-10-08 21:12:01 +0000 UTC" firstStartedPulling="2025-10-08 21:12:02.271096176 +0000 UTC m=+1641.963906849" lastFinishedPulling="2025-10-08 21:12:02.852753347 +0000 UTC m=+1642.545564030" observedRunningTime="2025-10-08 21:12:03.550548843 +0000 UTC m=+1643.243359526" watchObservedRunningTime="2025-10-08 21:12:03.551386107 +0000 UTC m=+1643.244196790" Oct 08 21:12:04 crc kubenswrapper[4669]: I1008 21:12:04.332100 4669 scope.go:117] "RemoveContainer" containerID="92ec41f270c02e5372a81a61c641b036c617c32af9093736d30bfad2ba880074" Oct 08 21:12:04 crc kubenswrapper[4669]: E1008 21:12:04.333180 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:12:18 crc kubenswrapper[4669]: I1008 21:12:18.038566 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-9048-account-create-lpfml"] Oct 08 21:12:18 crc kubenswrapper[4669]: I1008 21:12:18.047615 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-8c82-account-create-l8gj7"] Oct 08 21:12:18 crc kubenswrapper[4669]: I1008 21:12:18.054073 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-eed3-account-create-cxmm8"] Oct 08 21:12:18 crc kubenswrapper[4669]: I1008 21:12:18.062522 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-9048-account-create-lpfml"] Oct 08 21:12:18 crc kubenswrapper[4669]: I1008 21:12:18.068903 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-8c82-account-create-l8gj7"] Oct 08 21:12:18 crc kubenswrapper[4669]: I1008 21:12:18.074669 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-eed3-account-create-cxmm8"] Oct 08 21:12:18 crc kubenswrapper[4669]: I1008 21:12:18.331261 4669 scope.go:117] "RemoveContainer" containerID="92ec41f270c02e5372a81a61c641b036c617c32af9093736d30bfad2ba880074" Oct 08 21:12:18 crc kubenswrapper[4669]: E1008 21:12:18.331483 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" 
podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:12:19 crc kubenswrapper[4669]: I1008 21:12:19.348171 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3975f8d5-05d5-43bf-a697-7592cae00f76" path="/var/lib/kubelet/pods/3975f8d5-05d5-43bf-a697-7592cae00f76/volumes" Oct 08 21:12:19 crc kubenswrapper[4669]: I1008 21:12:19.349580 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="600b70ca-92ae-481c-96a7-e1ad051b1a1a" path="/var/lib/kubelet/pods/600b70ca-92ae-481c-96a7-e1ad051b1a1a/volumes" Oct 08 21:12:19 crc kubenswrapper[4669]: I1008 21:12:19.350802 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2e0af6a-a86e-4f7f-b00a-74f84b2eabae" path="/var/lib/kubelet/pods/d2e0af6a-a86e-4f7f-b00a-74f84b2eabae/volumes" Oct 08 21:12:19 crc kubenswrapper[4669]: I1008 21:12:19.729359 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wt2j8"] Oct 08 21:12:19 crc kubenswrapper[4669]: I1008 21:12:19.731100 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wt2j8" Oct 08 21:12:19 crc kubenswrapper[4669]: I1008 21:12:19.752200 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wt2j8"] Oct 08 21:12:19 crc kubenswrapper[4669]: I1008 21:12:19.907644 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3d11ce7-a33c-492a-a0a5-987a94cb9346-catalog-content\") pod \"redhat-operators-wt2j8\" (UID: \"b3d11ce7-a33c-492a-a0a5-987a94cb9346\") " pod="openshift-marketplace/redhat-operators-wt2j8" Oct 08 21:12:19 crc kubenswrapper[4669]: I1008 21:12:19.907876 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgkr2\" (UniqueName: \"kubernetes.io/projected/b3d11ce7-a33c-492a-a0a5-987a94cb9346-kube-api-access-cgkr2\") pod \"redhat-operators-wt2j8\" (UID: \"b3d11ce7-a33c-492a-a0a5-987a94cb9346\") " pod="openshift-marketplace/redhat-operators-wt2j8" Oct 08 21:12:19 crc kubenswrapper[4669]: I1008 21:12:19.908038 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3d11ce7-a33c-492a-a0a5-987a94cb9346-utilities\") pod \"redhat-operators-wt2j8\" (UID: \"b3d11ce7-a33c-492a-a0a5-987a94cb9346\") " pod="openshift-marketplace/redhat-operators-wt2j8" Oct 08 21:12:20 crc kubenswrapper[4669]: I1008 21:12:20.010252 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgkr2\" (UniqueName: \"kubernetes.io/projected/b3d11ce7-a33c-492a-a0a5-987a94cb9346-kube-api-access-cgkr2\") pod \"redhat-operators-wt2j8\" (UID: \"b3d11ce7-a33c-492a-a0a5-987a94cb9346\") " pod="openshift-marketplace/redhat-operators-wt2j8" Oct 08 21:12:20 crc kubenswrapper[4669]: I1008 21:12:20.010343 4669 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3d11ce7-a33c-492a-a0a5-987a94cb9346-utilities\") pod \"redhat-operators-wt2j8\" (UID: \"b3d11ce7-a33c-492a-a0a5-987a94cb9346\") " pod="openshift-marketplace/redhat-operators-wt2j8" Oct 08 21:12:20 crc kubenswrapper[4669]: I1008 21:12:20.010398 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3d11ce7-a33c-492a-a0a5-987a94cb9346-catalog-content\") pod \"redhat-operators-wt2j8\" (UID: \"b3d11ce7-a33c-492a-a0a5-987a94cb9346\") " pod="openshift-marketplace/redhat-operators-wt2j8" Oct 08 21:12:20 crc kubenswrapper[4669]: I1008 21:12:20.011049 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3d11ce7-a33c-492a-a0a5-987a94cb9346-catalog-content\") pod \"redhat-operators-wt2j8\" (UID: \"b3d11ce7-a33c-492a-a0a5-987a94cb9346\") " pod="openshift-marketplace/redhat-operators-wt2j8" Oct 08 21:12:20 crc kubenswrapper[4669]: I1008 21:12:20.011151 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3d11ce7-a33c-492a-a0a5-987a94cb9346-utilities\") pod \"redhat-operators-wt2j8\" (UID: \"b3d11ce7-a33c-492a-a0a5-987a94cb9346\") " pod="openshift-marketplace/redhat-operators-wt2j8" Oct 08 21:12:20 crc kubenswrapper[4669]: I1008 21:12:20.048894 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgkr2\" (UniqueName: \"kubernetes.io/projected/b3d11ce7-a33c-492a-a0a5-987a94cb9346-kube-api-access-cgkr2\") pod \"redhat-operators-wt2j8\" (UID: \"b3d11ce7-a33c-492a-a0a5-987a94cb9346\") " pod="openshift-marketplace/redhat-operators-wt2j8" Oct 08 21:12:20 crc kubenswrapper[4669]: I1008 21:12:20.058646 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wt2j8" Oct 08 21:12:20 crc kubenswrapper[4669]: I1008 21:12:20.512968 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wt2j8"] Oct 08 21:12:20 crc kubenswrapper[4669]: I1008 21:12:20.700752 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wt2j8" event={"ID":"b3d11ce7-a33c-492a-a0a5-987a94cb9346","Type":"ContainerStarted","Data":"4a27282b3e2edfcb55473fa9a75ab93126b26a3abe6d9dda04bd9caf0ae9010d"} Oct 08 21:12:21 crc kubenswrapper[4669]: I1008 21:12:21.713917 4669 generic.go:334] "Generic (PLEG): container finished" podID="b3d11ce7-a33c-492a-a0a5-987a94cb9346" containerID="aac4f1bc7e03a70b35de2053116a5f013f6abef0a4bb7a0e9a9f08cec44f336e" exitCode=0 Oct 08 21:12:21 crc kubenswrapper[4669]: I1008 21:12:21.713964 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wt2j8" event={"ID":"b3d11ce7-a33c-492a-a0a5-987a94cb9346","Type":"ContainerDied","Data":"aac4f1bc7e03a70b35de2053116a5f013f6abef0a4bb7a0e9a9f08cec44f336e"} Oct 08 21:12:22 crc kubenswrapper[4669]: I1008 21:12:22.726816 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wt2j8" event={"ID":"b3d11ce7-a33c-492a-a0a5-987a94cb9346","Type":"ContainerStarted","Data":"c798568be0a6ff681f2a6c3984566b7dc5d0b9824a6e1cd85aef1ef3dac3030d"} Oct 08 21:12:23 crc kubenswrapper[4669]: I1008 21:12:23.741163 4669 generic.go:334] "Generic (PLEG): container finished" podID="b3d11ce7-a33c-492a-a0a5-987a94cb9346" containerID="c798568be0a6ff681f2a6c3984566b7dc5d0b9824a6e1cd85aef1ef3dac3030d" exitCode=0 Oct 08 21:12:23 crc kubenswrapper[4669]: I1008 21:12:23.741300 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wt2j8" 
event={"ID":"b3d11ce7-a33c-492a-a0a5-987a94cb9346","Type":"ContainerDied","Data":"c798568be0a6ff681f2a6c3984566b7dc5d0b9824a6e1cd85aef1ef3dac3030d"} Oct 08 21:12:24 crc kubenswrapper[4669]: I1008 21:12:24.753662 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wt2j8" event={"ID":"b3d11ce7-a33c-492a-a0a5-987a94cb9346","Type":"ContainerStarted","Data":"7bc1480eedf117080d489fb8e7086f105fed0f77655d694bc69a427e6eed893e"} Oct 08 21:12:24 crc kubenswrapper[4669]: I1008 21:12:24.778867 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wt2j8" podStartSLOduration=2.965374273 podStartE2EDuration="5.778845644s" podCreationTimestamp="2025-10-08 21:12:19 +0000 UTC" firstStartedPulling="2025-10-08 21:12:21.715575991 +0000 UTC m=+1661.408386664" lastFinishedPulling="2025-10-08 21:12:24.529047342 +0000 UTC m=+1664.221858035" observedRunningTime="2025-10-08 21:12:24.771903652 +0000 UTC m=+1664.464714325" watchObservedRunningTime="2025-10-08 21:12:24.778845644 +0000 UTC m=+1664.471656317" Oct 08 21:12:30 crc kubenswrapper[4669]: I1008 21:12:30.060202 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wt2j8" Oct 08 21:12:30 crc kubenswrapper[4669]: I1008 21:12:30.060913 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wt2j8" Oct 08 21:12:30 crc kubenswrapper[4669]: I1008 21:12:30.112471 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wt2j8" Oct 08 21:12:30 crc kubenswrapper[4669]: I1008 21:12:30.857688 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wt2j8" Oct 08 21:12:31 crc kubenswrapper[4669]: I1008 21:12:31.341955 4669 scope.go:117] "RemoveContainer" 
containerID="92ec41f270c02e5372a81a61c641b036c617c32af9093736d30bfad2ba880074" Oct 08 21:12:31 crc kubenswrapper[4669]: E1008 21:12:31.342245 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:12:33 crc kubenswrapper[4669]: I1008 21:12:33.121339 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wt2j8"] Oct 08 21:12:33 crc kubenswrapper[4669]: I1008 21:12:33.123339 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wt2j8" podUID="b3d11ce7-a33c-492a-a0a5-987a94cb9346" containerName="registry-server" containerID="cri-o://7bc1480eedf117080d489fb8e7086f105fed0f77655d694bc69a427e6eed893e" gracePeriod=2 Oct 08 21:12:33 crc kubenswrapper[4669]: I1008 21:12:33.564350 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wt2j8" Oct 08 21:12:33 crc kubenswrapper[4669]: I1008 21:12:33.662820 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3d11ce7-a33c-492a-a0a5-987a94cb9346-utilities\") pod \"b3d11ce7-a33c-492a-a0a5-987a94cb9346\" (UID: \"b3d11ce7-a33c-492a-a0a5-987a94cb9346\") " Oct 08 21:12:33 crc kubenswrapper[4669]: I1008 21:12:33.662971 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgkr2\" (UniqueName: \"kubernetes.io/projected/b3d11ce7-a33c-492a-a0a5-987a94cb9346-kube-api-access-cgkr2\") pod \"b3d11ce7-a33c-492a-a0a5-987a94cb9346\" (UID: \"b3d11ce7-a33c-492a-a0a5-987a94cb9346\") " Oct 08 21:12:33 crc kubenswrapper[4669]: I1008 21:12:33.663096 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3d11ce7-a33c-492a-a0a5-987a94cb9346-catalog-content\") pod \"b3d11ce7-a33c-492a-a0a5-987a94cb9346\" (UID: \"b3d11ce7-a33c-492a-a0a5-987a94cb9346\") " Oct 08 21:12:33 crc kubenswrapper[4669]: I1008 21:12:33.663886 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3d11ce7-a33c-492a-a0a5-987a94cb9346-utilities" (OuterVolumeSpecName: "utilities") pod "b3d11ce7-a33c-492a-a0a5-987a94cb9346" (UID: "b3d11ce7-a33c-492a-a0a5-987a94cb9346"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:12:33 crc kubenswrapper[4669]: I1008 21:12:33.669617 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3d11ce7-a33c-492a-a0a5-987a94cb9346-kube-api-access-cgkr2" (OuterVolumeSpecName: "kube-api-access-cgkr2") pod "b3d11ce7-a33c-492a-a0a5-987a94cb9346" (UID: "b3d11ce7-a33c-492a-a0a5-987a94cb9346"). InnerVolumeSpecName "kube-api-access-cgkr2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:12:33 crc kubenswrapper[4669]: I1008 21:12:33.765650 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3d11ce7-a33c-492a-a0a5-987a94cb9346-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 21:12:33 crc kubenswrapper[4669]: I1008 21:12:33.765683 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgkr2\" (UniqueName: \"kubernetes.io/projected/b3d11ce7-a33c-492a-a0a5-987a94cb9346-kube-api-access-cgkr2\") on node \"crc\" DevicePath \"\"" Oct 08 21:12:33 crc kubenswrapper[4669]: I1008 21:12:33.780234 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3d11ce7-a33c-492a-a0a5-987a94cb9346-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b3d11ce7-a33c-492a-a0a5-987a94cb9346" (UID: "b3d11ce7-a33c-492a-a0a5-987a94cb9346"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:12:33 crc kubenswrapper[4669]: I1008 21:12:33.848709 4669 generic.go:334] "Generic (PLEG): container finished" podID="b3d11ce7-a33c-492a-a0a5-987a94cb9346" containerID="7bc1480eedf117080d489fb8e7086f105fed0f77655d694bc69a427e6eed893e" exitCode=0 Oct 08 21:12:33 crc kubenswrapper[4669]: I1008 21:12:33.848756 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wt2j8" event={"ID":"b3d11ce7-a33c-492a-a0a5-987a94cb9346","Type":"ContainerDied","Data":"7bc1480eedf117080d489fb8e7086f105fed0f77655d694bc69a427e6eed893e"} Oct 08 21:12:33 crc kubenswrapper[4669]: I1008 21:12:33.848772 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wt2j8" Oct 08 21:12:33 crc kubenswrapper[4669]: I1008 21:12:33.848799 4669 scope.go:117] "RemoveContainer" containerID="7bc1480eedf117080d489fb8e7086f105fed0f77655d694bc69a427e6eed893e" Oct 08 21:12:33 crc kubenswrapper[4669]: I1008 21:12:33.848782 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wt2j8" event={"ID":"b3d11ce7-a33c-492a-a0a5-987a94cb9346","Type":"ContainerDied","Data":"4a27282b3e2edfcb55473fa9a75ab93126b26a3abe6d9dda04bd9caf0ae9010d"} Oct 08 21:12:33 crc kubenswrapper[4669]: I1008 21:12:33.867168 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3d11ce7-a33c-492a-a0a5-987a94cb9346-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 21:12:33 crc kubenswrapper[4669]: I1008 21:12:33.882750 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wt2j8"] Oct 08 21:12:33 crc kubenswrapper[4669]: I1008 21:12:33.889833 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wt2j8"] Oct 08 21:12:33 crc kubenswrapper[4669]: I1008 21:12:33.891445 4669 scope.go:117] "RemoveContainer" containerID="c798568be0a6ff681f2a6c3984566b7dc5d0b9824a6e1cd85aef1ef3dac3030d" Oct 08 21:12:33 crc kubenswrapper[4669]: I1008 21:12:33.909876 4669 scope.go:117] "RemoveContainer" containerID="aac4f1bc7e03a70b35de2053116a5f013f6abef0a4bb7a0e9a9f08cec44f336e" Oct 08 21:12:33 crc kubenswrapper[4669]: I1008 21:12:33.952246 4669 scope.go:117] "RemoveContainer" containerID="7bc1480eedf117080d489fb8e7086f105fed0f77655d694bc69a427e6eed893e" Oct 08 21:12:33 crc kubenswrapper[4669]: E1008 21:12:33.952727 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bc1480eedf117080d489fb8e7086f105fed0f77655d694bc69a427e6eed893e\": container with ID 
starting with 7bc1480eedf117080d489fb8e7086f105fed0f77655d694bc69a427e6eed893e not found: ID does not exist" containerID="7bc1480eedf117080d489fb8e7086f105fed0f77655d694bc69a427e6eed893e" Oct 08 21:12:33 crc kubenswrapper[4669]: I1008 21:12:33.952784 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bc1480eedf117080d489fb8e7086f105fed0f77655d694bc69a427e6eed893e"} err="failed to get container status \"7bc1480eedf117080d489fb8e7086f105fed0f77655d694bc69a427e6eed893e\": rpc error: code = NotFound desc = could not find container \"7bc1480eedf117080d489fb8e7086f105fed0f77655d694bc69a427e6eed893e\": container with ID starting with 7bc1480eedf117080d489fb8e7086f105fed0f77655d694bc69a427e6eed893e not found: ID does not exist" Oct 08 21:12:33 crc kubenswrapper[4669]: I1008 21:12:33.952823 4669 scope.go:117] "RemoveContainer" containerID="c798568be0a6ff681f2a6c3984566b7dc5d0b9824a6e1cd85aef1ef3dac3030d" Oct 08 21:12:33 crc kubenswrapper[4669]: E1008 21:12:33.953190 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c798568be0a6ff681f2a6c3984566b7dc5d0b9824a6e1cd85aef1ef3dac3030d\": container with ID starting with c798568be0a6ff681f2a6c3984566b7dc5d0b9824a6e1cd85aef1ef3dac3030d not found: ID does not exist" containerID="c798568be0a6ff681f2a6c3984566b7dc5d0b9824a6e1cd85aef1ef3dac3030d" Oct 08 21:12:33 crc kubenswrapper[4669]: I1008 21:12:33.953222 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c798568be0a6ff681f2a6c3984566b7dc5d0b9824a6e1cd85aef1ef3dac3030d"} err="failed to get container status \"c798568be0a6ff681f2a6c3984566b7dc5d0b9824a6e1cd85aef1ef3dac3030d\": rpc error: code = NotFound desc = could not find container \"c798568be0a6ff681f2a6c3984566b7dc5d0b9824a6e1cd85aef1ef3dac3030d\": container with ID starting with c798568be0a6ff681f2a6c3984566b7dc5d0b9824a6e1cd85aef1ef3dac3030d not found: 
ID does not exist" Oct 08 21:12:33 crc kubenswrapper[4669]: I1008 21:12:33.953241 4669 scope.go:117] "RemoveContainer" containerID="aac4f1bc7e03a70b35de2053116a5f013f6abef0a4bb7a0e9a9f08cec44f336e" Oct 08 21:12:33 crc kubenswrapper[4669]: E1008 21:12:33.953520 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aac4f1bc7e03a70b35de2053116a5f013f6abef0a4bb7a0e9a9f08cec44f336e\": container with ID starting with aac4f1bc7e03a70b35de2053116a5f013f6abef0a4bb7a0e9a9f08cec44f336e not found: ID does not exist" containerID="aac4f1bc7e03a70b35de2053116a5f013f6abef0a4bb7a0e9a9f08cec44f336e" Oct 08 21:12:33 crc kubenswrapper[4669]: I1008 21:12:33.953566 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aac4f1bc7e03a70b35de2053116a5f013f6abef0a4bb7a0e9a9f08cec44f336e"} err="failed to get container status \"aac4f1bc7e03a70b35de2053116a5f013f6abef0a4bb7a0e9a9f08cec44f336e\": rpc error: code = NotFound desc = could not find container \"aac4f1bc7e03a70b35de2053116a5f013f6abef0a4bb7a0e9a9f08cec44f336e\": container with ID starting with aac4f1bc7e03a70b35de2053116a5f013f6abef0a4bb7a0e9a9f08cec44f336e not found: ID does not exist" Oct 08 21:12:35 crc kubenswrapper[4669]: I1008 21:12:35.343368 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3d11ce7-a33c-492a-a0a5-987a94cb9346" path="/var/lib/kubelet/pods/b3d11ce7-a33c-492a-a0a5-987a94cb9346/volumes" Oct 08 21:12:41 crc kubenswrapper[4669]: I1008 21:12:41.065636 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-28wfx"] Oct 08 21:12:41 crc kubenswrapper[4669]: I1008 21:12:41.078964 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-28wfx"] Oct 08 21:12:41 crc kubenswrapper[4669]: I1008 21:12:41.365542 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1fe4f9a2-f7a7-433e-a6e9-4811d2424259" path="/var/lib/kubelet/pods/1fe4f9a2-f7a7-433e-a6e9-4811d2424259/volumes" Oct 08 21:12:41 crc kubenswrapper[4669]: I1008 21:12:41.931094 4669 generic.go:334] "Generic (PLEG): container finished" podID="30bc939f-3290-4c46-8d00-120c0bf33951" containerID="ceb24fb6a70f68ee58725ac57baa74d9eee93167cbadf067c3fe3c5d0ad55230" exitCode=0 Oct 08 21:12:41 crc kubenswrapper[4669]: I1008 21:12:41.931196 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hk4fz" event={"ID":"30bc939f-3290-4c46-8d00-120c0bf33951","Type":"ContainerDied","Data":"ceb24fb6a70f68ee58725ac57baa74d9eee93167cbadf067c3fe3c5d0ad55230"} Oct 08 21:12:43 crc kubenswrapper[4669]: I1008 21:12:43.377492 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hk4fz" Oct 08 21:12:43 crc kubenswrapper[4669]: I1008 21:12:43.456627 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30bc939f-3290-4c46-8d00-120c0bf33951-inventory\") pod \"30bc939f-3290-4c46-8d00-120c0bf33951\" (UID: \"30bc939f-3290-4c46-8d00-120c0bf33951\") " Oct 08 21:12:43 crc kubenswrapper[4669]: I1008 21:12:43.456826 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/30bc939f-3290-4c46-8d00-120c0bf33951-ssh-key\") pod \"30bc939f-3290-4c46-8d00-120c0bf33951\" (UID: \"30bc939f-3290-4c46-8d00-120c0bf33951\") " Oct 08 21:12:43 crc kubenswrapper[4669]: I1008 21:12:43.456869 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqz2h\" (UniqueName: \"kubernetes.io/projected/30bc939f-3290-4c46-8d00-120c0bf33951-kube-api-access-nqz2h\") pod \"30bc939f-3290-4c46-8d00-120c0bf33951\" (UID: \"30bc939f-3290-4c46-8d00-120c0bf33951\") " Oct 08 21:12:43 crc 
kubenswrapper[4669]: I1008 21:12:43.461779 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30bc939f-3290-4c46-8d00-120c0bf33951-kube-api-access-nqz2h" (OuterVolumeSpecName: "kube-api-access-nqz2h") pod "30bc939f-3290-4c46-8d00-120c0bf33951" (UID: "30bc939f-3290-4c46-8d00-120c0bf33951"). InnerVolumeSpecName "kube-api-access-nqz2h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:12:43 crc kubenswrapper[4669]: I1008 21:12:43.482092 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30bc939f-3290-4c46-8d00-120c0bf33951-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "30bc939f-3290-4c46-8d00-120c0bf33951" (UID: "30bc939f-3290-4c46-8d00-120c0bf33951"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:12:43 crc kubenswrapper[4669]: I1008 21:12:43.493002 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30bc939f-3290-4c46-8d00-120c0bf33951-inventory" (OuterVolumeSpecName: "inventory") pod "30bc939f-3290-4c46-8d00-120c0bf33951" (UID: "30bc939f-3290-4c46-8d00-120c0bf33951"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:12:43 crc kubenswrapper[4669]: I1008 21:12:43.559212 4669 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30bc939f-3290-4c46-8d00-120c0bf33951-inventory\") on node \"crc\" DevicePath \"\"" Oct 08 21:12:43 crc kubenswrapper[4669]: I1008 21:12:43.559237 4669 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/30bc939f-3290-4c46-8d00-120c0bf33951-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 08 21:12:43 crc kubenswrapper[4669]: I1008 21:12:43.559248 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqz2h\" (UniqueName: \"kubernetes.io/projected/30bc939f-3290-4c46-8d00-120c0bf33951-kube-api-access-nqz2h\") on node \"crc\" DevicePath \"\"" Oct 08 21:12:43 crc kubenswrapper[4669]: I1008 21:12:43.958328 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hk4fz" event={"ID":"30bc939f-3290-4c46-8d00-120c0bf33951","Type":"ContainerDied","Data":"24b7a05038503d3c36640afbb8343d616a53a5599a2e6d7e25a0ef459f337264"} Oct 08 21:12:43 crc kubenswrapper[4669]: I1008 21:12:43.958763 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24b7a05038503d3c36640afbb8343d616a53a5599a2e6d7e25a0ef459f337264" Oct 08 21:12:43 crc kubenswrapper[4669]: I1008 21:12:43.958828 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hk4fz" Oct 08 21:12:44 crc kubenswrapper[4669]: I1008 21:12:44.067036 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6xbff"] Oct 08 21:12:44 crc kubenswrapper[4669]: E1008 21:12:44.067658 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3d11ce7-a33c-492a-a0a5-987a94cb9346" containerName="extract-content" Oct 08 21:12:44 crc kubenswrapper[4669]: I1008 21:12:44.067691 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d11ce7-a33c-492a-a0a5-987a94cb9346" containerName="extract-content" Oct 08 21:12:44 crc kubenswrapper[4669]: E1008 21:12:44.067730 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30bc939f-3290-4c46-8d00-120c0bf33951" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 08 21:12:44 crc kubenswrapper[4669]: I1008 21:12:44.067744 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="30bc939f-3290-4c46-8d00-120c0bf33951" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 08 21:12:44 crc kubenswrapper[4669]: E1008 21:12:44.067772 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3d11ce7-a33c-492a-a0a5-987a94cb9346" containerName="extract-utilities" Oct 08 21:12:44 crc kubenswrapper[4669]: I1008 21:12:44.067798 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d11ce7-a33c-492a-a0a5-987a94cb9346" containerName="extract-utilities" Oct 08 21:12:44 crc kubenswrapper[4669]: E1008 21:12:44.067832 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3d11ce7-a33c-492a-a0a5-987a94cb9346" containerName="registry-server" Oct 08 21:12:44 crc kubenswrapper[4669]: I1008 21:12:44.067847 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d11ce7-a33c-492a-a0a5-987a94cb9346" containerName="registry-server" Oct 08 21:12:44 crc kubenswrapper[4669]: I1008 21:12:44.068313 
4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="30bc939f-3290-4c46-8d00-120c0bf33951" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 08 21:12:44 crc kubenswrapper[4669]: I1008 21:12:44.068351 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3d11ce7-a33c-492a-a0a5-987a94cb9346" containerName="registry-server" Oct 08 21:12:44 crc kubenswrapper[4669]: I1008 21:12:44.069600 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6xbff" Oct 08 21:12:44 crc kubenswrapper[4669]: I1008 21:12:44.071340 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 08 21:12:44 crc kubenswrapper[4669]: I1008 21:12:44.072344 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9d8p9" Oct 08 21:12:44 crc kubenswrapper[4669]: I1008 21:12:44.072842 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 08 21:12:44 crc kubenswrapper[4669]: I1008 21:12:44.072902 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 08 21:12:44 crc kubenswrapper[4669]: I1008 21:12:44.079809 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6xbff"] Oct 08 21:12:44 crc kubenswrapper[4669]: I1008 21:12:44.177688 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qppx\" (UniqueName: \"kubernetes.io/projected/1d09eff3-6572-462c-91db-4a1f5f167eae-kube-api-access-7qppx\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6xbff\" (UID: \"1d09eff3-6572-462c-91db-4a1f5f167eae\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6xbff" Oct 08 21:12:44 crc 
kubenswrapper[4669]: I1008 21:12:44.177783 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d09eff3-6572-462c-91db-4a1f5f167eae-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6xbff\" (UID: \"1d09eff3-6572-462c-91db-4a1f5f167eae\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6xbff" Oct 08 21:12:44 crc kubenswrapper[4669]: I1008 21:12:44.177823 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1d09eff3-6572-462c-91db-4a1f5f167eae-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6xbff\" (UID: \"1d09eff3-6572-462c-91db-4a1f5f167eae\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6xbff" Oct 08 21:12:44 crc kubenswrapper[4669]: I1008 21:12:44.279828 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d09eff3-6572-462c-91db-4a1f5f167eae-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6xbff\" (UID: \"1d09eff3-6572-462c-91db-4a1f5f167eae\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6xbff" Oct 08 21:12:44 crc kubenswrapper[4669]: I1008 21:12:44.280090 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1d09eff3-6572-462c-91db-4a1f5f167eae-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6xbff\" (UID: \"1d09eff3-6572-462c-91db-4a1f5f167eae\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6xbff" Oct 08 21:12:44 crc kubenswrapper[4669]: I1008 21:12:44.280325 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qppx\" (UniqueName: \"kubernetes.io/projected/1d09eff3-6572-462c-91db-4a1f5f167eae-kube-api-access-7qppx\") pod 
\"configure-os-edpm-deployment-openstack-edpm-ipam-6xbff\" (UID: \"1d09eff3-6572-462c-91db-4a1f5f167eae\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6xbff" Oct 08 21:12:44 crc kubenswrapper[4669]: I1008 21:12:44.286055 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d09eff3-6572-462c-91db-4a1f5f167eae-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6xbff\" (UID: \"1d09eff3-6572-462c-91db-4a1f5f167eae\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6xbff" Oct 08 21:12:44 crc kubenswrapper[4669]: I1008 21:12:44.286738 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1d09eff3-6572-462c-91db-4a1f5f167eae-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6xbff\" (UID: \"1d09eff3-6572-462c-91db-4a1f5f167eae\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6xbff" Oct 08 21:12:44 crc kubenswrapper[4669]: I1008 21:12:44.298743 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qppx\" (UniqueName: \"kubernetes.io/projected/1d09eff3-6572-462c-91db-4a1f5f167eae-kube-api-access-7qppx\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6xbff\" (UID: \"1d09eff3-6572-462c-91db-4a1f5f167eae\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6xbff" Oct 08 21:12:44 crc kubenswrapper[4669]: I1008 21:12:44.400407 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6xbff" Oct 08 21:12:44 crc kubenswrapper[4669]: I1008 21:12:44.941068 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6xbff"] Oct 08 21:12:44 crc kubenswrapper[4669]: I1008 21:12:44.968197 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6xbff" event={"ID":"1d09eff3-6572-462c-91db-4a1f5f167eae","Type":"ContainerStarted","Data":"fbeea9fe99529da72e79e5522ec5a2134f739e9bfdb15078eea6fc4296f46057"} Oct 08 21:12:45 crc kubenswrapper[4669]: I1008 21:12:45.331161 4669 scope.go:117] "RemoveContainer" containerID="92ec41f270c02e5372a81a61c641b036c617c32af9093736d30bfad2ba880074" Oct 08 21:12:45 crc kubenswrapper[4669]: E1008 21:12:45.332731 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:12:45 crc kubenswrapper[4669]: I1008 21:12:45.983848 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6xbff" event={"ID":"1d09eff3-6572-462c-91db-4a1f5f167eae","Type":"ContainerStarted","Data":"bfe0c4049f43771afafad14774290f770d3f739676ac76e3a90f8000832dba98"} Oct 08 21:12:46 crc kubenswrapper[4669]: I1008 21:12:46.013611 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6xbff" podStartSLOduration=1.529294391 podStartE2EDuration="2.013592881s" podCreationTimestamp="2025-10-08 21:12:44 +0000 UTC" firstStartedPulling="2025-10-08 
21:12:44.937300362 +0000 UTC m=+1684.630111035" lastFinishedPulling="2025-10-08 21:12:45.421598852 +0000 UTC m=+1685.114409525" observedRunningTime="2025-10-08 21:12:46.012188213 +0000 UTC m=+1685.704998886" watchObservedRunningTime="2025-10-08 21:12:46.013592881 +0000 UTC m=+1685.706403554" Oct 08 21:12:56 crc kubenswrapper[4669]: I1008 21:12:56.331628 4669 scope.go:117] "RemoveContainer" containerID="92ec41f270c02e5372a81a61c641b036c617c32af9093736d30bfad2ba880074" Oct 08 21:12:56 crc kubenswrapper[4669]: E1008 21:12:56.334632 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:12:58 crc kubenswrapper[4669]: I1008 21:12:58.224652 4669 scope.go:117] "RemoveContainer" containerID="e912c43cb389f4c8db6b53379d9a67dec9c39a35ffaa395cd65d08f7a4316b7f" Oct 08 21:12:58 crc kubenswrapper[4669]: I1008 21:12:58.248087 4669 scope.go:117] "RemoveContainer" containerID="c5f8501003b90fb31b4d9162158d1b12d19c733da1a1439711057c3d576096f0" Oct 08 21:12:58 crc kubenswrapper[4669]: I1008 21:12:58.291102 4669 scope.go:117] "RemoveContainer" containerID="d1615ca45806c11e63a66a795bded109fa28ef3aae30af9a8081028f5fb5fe78" Oct 08 21:12:58 crc kubenswrapper[4669]: I1008 21:12:58.333786 4669 scope.go:117] "RemoveContainer" containerID="619fbe96a4ba98a3d3b55c3917a60b81c986aa0d85cbe55de0dcccf3ed1aafe6" Oct 08 21:12:58 crc kubenswrapper[4669]: I1008 21:12:58.374720 4669 scope.go:117] "RemoveContainer" containerID="0b3ddbeb7c3054f12a71af868748cddd1dc97afa361e7047de8b21e5eab29011" Oct 08 21:12:58 crc kubenswrapper[4669]: I1008 21:12:58.435381 4669 scope.go:117] "RemoveContainer" 
containerID="edca01ffd40f9f0e1d482317359c5dd3913d1fd06fe915116a3297756d7edac1" Oct 08 21:12:58 crc kubenswrapper[4669]: I1008 21:12:58.455454 4669 scope.go:117] "RemoveContainer" containerID="47c2d4a9b75cdf000f25a82826567661d74f9443d269eb51bfecf4852e9aeed0" Oct 08 21:13:04 crc kubenswrapper[4669]: I1008 21:13:04.049344 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-x6p9j"] Oct 08 21:13:04 crc kubenswrapper[4669]: I1008 21:13:04.059427 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-295kq"] Oct 08 21:13:04 crc kubenswrapper[4669]: I1008 21:13:04.076257 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-x6p9j"] Oct 08 21:13:04 crc kubenswrapper[4669]: I1008 21:13:04.088815 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-295kq"] Oct 08 21:13:05 crc kubenswrapper[4669]: I1008 21:13:05.351794 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38420235-f71d-4b0a-95ed-d4c86a23e44b" path="/var/lib/kubelet/pods/38420235-f71d-4b0a-95ed-d4c86a23e44b/volumes" Oct 08 21:13:05 crc kubenswrapper[4669]: I1008 21:13:05.354080 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f51ff62e-2d14-4205-acb5-1ae440525941" path="/var/lib/kubelet/pods/f51ff62e-2d14-4205-acb5-1ae440525941/volumes" Oct 08 21:13:11 crc kubenswrapper[4669]: I1008 21:13:11.336502 4669 scope.go:117] "RemoveContainer" containerID="92ec41f270c02e5372a81a61c641b036c617c32af9093736d30bfad2ba880074" Oct 08 21:13:11 crc kubenswrapper[4669]: E1008 21:13:11.337505 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:13:22 crc kubenswrapper[4669]: I1008 21:13:22.331069 4669 scope.go:117] "RemoveContainer" containerID="92ec41f270c02e5372a81a61c641b036c617c32af9093736d30bfad2ba880074" Oct 08 21:13:22 crc kubenswrapper[4669]: E1008 21:13:22.331934 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:13:33 crc kubenswrapper[4669]: I1008 21:13:33.331225 4669 scope.go:117] "RemoveContainer" containerID="92ec41f270c02e5372a81a61c641b036c617c32af9093736d30bfad2ba880074" Oct 08 21:13:33 crc kubenswrapper[4669]: E1008 21:13:33.332136 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:13:40 crc kubenswrapper[4669]: I1008 21:13:40.561051 4669 generic.go:334] "Generic (PLEG): container finished" podID="1d09eff3-6572-462c-91db-4a1f5f167eae" containerID="bfe0c4049f43771afafad14774290f770d3f739676ac76e3a90f8000832dba98" exitCode=2 Oct 08 21:13:40 crc kubenswrapper[4669]: I1008 21:13:40.561307 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6xbff" 
event={"ID":"1d09eff3-6572-462c-91db-4a1f5f167eae","Type":"ContainerDied","Data":"bfe0c4049f43771afafad14774290f770d3f739676ac76e3a90f8000832dba98"} Oct 08 21:13:42 crc kubenswrapper[4669]: I1008 21:13:42.008627 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6xbff" Oct 08 21:13:42 crc kubenswrapper[4669]: I1008 21:13:42.070771 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1d09eff3-6572-462c-91db-4a1f5f167eae-ssh-key\") pod \"1d09eff3-6572-462c-91db-4a1f5f167eae\" (UID: \"1d09eff3-6572-462c-91db-4a1f5f167eae\") " Oct 08 21:13:42 crc kubenswrapper[4669]: I1008 21:13:42.070926 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d09eff3-6572-462c-91db-4a1f5f167eae-inventory\") pod \"1d09eff3-6572-462c-91db-4a1f5f167eae\" (UID: \"1d09eff3-6572-462c-91db-4a1f5f167eae\") " Oct 08 21:13:42 crc kubenswrapper[4669]: I1008 21:13:42.071026 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qppx\" (UniqueName: \"kubernetes.io/projected/1d09eff3-6572-462c-91db-4a1f5f167eae-kube-api-access-7qppx\") pod \"1d09eff3-6572-462c-91db-4a1f5f167eae\" (UID: \"1d09eff3-6572-462c-91db-4a1f5f167eae\") " Oct 08 21:13:42 crc kubenswrapper[4669]: I1008 21:13:42.078221 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d09eff3-6572-462c-91db-4a1f5f167eae-kube-api-access-7qppx" (OuterVolumeSpecName: "kube-api-access-7qppx") pod "1d09eff3-6572-462c-91db-4a1f5f167eae" (UID: "1d09eff3-6572-462c-91db-4a1f5f167eae"). InnerVolumeSpecName "kube-api-access-7qppx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:13:42 crc kubenswrapper[4669]: I1008 21:13:42.099177 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d09eff3-6572-462c-91db-4a1f5f167eae-inventory" (OuterVolumeSpecName: "inventory") pod "1d09eff3-6572-462c-91db-4a1f5f167eae" (UID: "1d09eff3-6572-462c-91db-4a1f5f167eae"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:13:42 crc kubenswrapper[4669]: I1008 21:13:42.100566 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d09eff3-6572-462c-91db-4a1f5f167eae-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1d09eff3-6572-462c-91db-4a1f5f167eae" (UID: "1d09eff3-6572-462c-91db-4a1f5f167eae"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:13:42 crc kubenswrapper[4669]: I1008 21:13:42.172572 4669 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1d09eff3-6572-462c-91db-4a1f5f167eae-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 08 21:13:42 crc kubenswrapper[4669]: I1008 21:13:42.172599 4669 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d09eff3-6572-462c-91db-4a1f5f167eae-inventory\") on node \"crc\" DevicePath \"\"" Oct 08 21:13:42 crc kubenswrapper[4669]: I1008 21:13:42.172609 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qppx\" (UniqueName: \"kubernetes.io/projected/1d09eff3-6572-462c-91db-4a1f5f167eae-kube-api-access-7qppx\") on node \"crc\" DevicePath \"\"" Oct 08 21:13:42 crc kubenswrapper[4669]: I1008 21:13:42.584884 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6xbff" 
event={"ID":"1d09eff3-6572-462c-91db-4a1f5f167eae","Type":"ContainerDied","Data":"fbeea9fe99529da72e79e5522ec5a2134f739e9bfdb15078eea6fc4296f46057"} Oct 08 21:13:42 crc kubenswrapper[4669]: I1008 21:13:42.584917 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbeea9fe99529da72e79e5522ec5a2134f739e9bfdb15078eea6fc4296f46057" Oct 08 21:13:42 crc kubenswrapper[4669]: I1008 21:13:42.584976 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6xbff" Oct 08 21:13:46 crc kubenswrapper[4669]: I1008 21:13:46.331805 4669 scope.go:117] "RemoveContainer" containerID="92ec41f270c02e5372a81a61c641b036c617c32af9093736d30bfad2ba880074" Oct 08 21:13:46 crc kubenswrapper[4669]: E1008 21:13:46.332851 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:13:48 crc kubenswrapper[4669]: I1008 21:13:48.053243 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-8p7zt"] Oct 08 21:13:48 crc kubenswrapper[4669]: I1008 21:13:48.065660 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-8p7zt"] Oct 08 21:13:49 crc kubenswrapper[4669]: I1008 21:13:49.048153 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tgph4"] Oct 08 21:13:49 crc kubenswrapper[4669]: E1008 21:13:49.049044 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d09eff3-6572-462c-91db-4a1f5f167eae" 
containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 08 21:13:49 crc kubenswrapper[4669]: I1008 21:13:49.049075 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d09eff3-6572-462c-91db-4a1f5f167eae" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 08 21:13:49 crc kubenswrapper[4669]: I1008 21:13:49.049471 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d09eff3-6572-462c-91db-4a1f5f167eae" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 08 21:13:49 crc kubenswrapper[4669]: I1008 21:13:49.050655 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tgph4" Oct 08 21:13:49 crc kubenswrapper[4669]: I1008 21:13:49.055355 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 08 21:13:49 crc kubenswrapper[4669]: I1008 21:13:49.055957 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 08 21:13:49 crc kubenswrapper[4669]: I1008 21:13:49.060154 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9d8p9" Oct 08 21:13:49 crc kubenswrapper[4669]: I1008 21:13:49.068697 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 08 21:13:49 crc kubenswrapper[4669]: I1008 21:13:49.090123 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tgph4"] Oct 08 21:13:49 crc kubenswrapper[4669]: I1008 21:13:49.228131 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e78c98eb-ee71-4877-92f0-edfa9eaca8e5-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-tgph4\" (UID: 
\"e78c98eb-ee71-4877-92f0-edfa9eaca8e5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tgph4" Oct 08 21:13:49 crc kubenswrapper[4669]: I1008 21:13:49.228694 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e78c98eb-ee71-4877-92f0-edfa9eaca8e5-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-tgph4\" (UID: \"e78c98eb-ee71-4877-92f0-edfa9eaca8e5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tgph4" Oct 08 21:13:49 crc kubenswrapper[4669]: I1008 21:13:49.228847 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrqbk\" (UniqueName: \"kubernetes.io/projected/e78c98eb-ee71-4877-92f0-edfa9eaca8e5-kube-api-access-rrqbk\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-tgph4\" (UID: \"e78c98eb-ee71-4877-92f0-edfa9eaca8e5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tgph4" Oct 08 21:13:49 crc kubenswrapper[4669]: I1008 21:13:49.330926 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e78c98eb-ee71-4877-92f0-edfa9eaca8e5-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-tgph4\" (UID: \"e78c98eb-ee71-4877-92f0-edfa9eaca8e5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tgph4" Oct 08 21:13:49 crc kubenswrapper[4669]: I1008 21:13:49.331294 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e78c98eb-ee71-4877-92f0-edfa9eaca8e5-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-tgph4\" (UID: \"e78c98eb-ee71-4877-92f0-edfa9eaca8e5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tgph4" Oct 08 21:13:49 crc kubenswrapper[4669]: I1008 21:13:49.331408 4669 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rrqbk\" (UniqueName: \"kubernetes.io/projected/e78c98eb-ee71-4877-92f0-edfa9eaca8e5-kube-api-access-rrqbk\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-tgph4\" (UID: \"e78c98eb-ee71-4877-92f0-edfa9eaca8e5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tgph4" Oct 08 21:13:49 crc kubenswrapper[4669]: I1008 21:13:49.345137 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e78c98eb-ee71-4877-92f0-edfa9eaca8e5-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-tgph4\" (UID: \"e78c98eb-ee71-4877-92f0-edfa9eaca8e5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tgph4" Oct 08 21:13:49 crc kubenswrapper[4669]: I1008 21:13:49.345313 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e78c98eb-ee71-4877-92f0-edfa9eaca8e5-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-tgph4\" (UID: \"e78c98eb-ee71-4877-92f0-edfa9eaca8e5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tgph4" Oct 08 21:13:49 crc kubenswrapper[4669]: I1008 21:13:49.354866 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c" path="/var/lib/kubelet/pods/7f4865f8-c1fc-4a06-a7e1-9a40b90cca2c/volumes" Oct 08 21:13:49 crc kubenswrapper[4669]: I1008 21:13:49.362283 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrqbk\" (UniqueName: \"kubernetes.io/projected/e78c98eb-ee71-4877-92f0-edfa9eaca8e5-kube-api-access-rrqbk\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-tgph4\" (UID: \"e78c98eb-ee71-4877-92f0-edfa9eaca8e5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tgph4" Oct 08 21:13:49 crc kubenswrapper[4669]: I1008 21:13:49.382268 4669 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tgph4" Oct 08 21:13:49 crc kubenswrapper[4669]: I1008 21:13:49.974790 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tgph4"] Oct 08 21:13:50 crc kubenswrapper[4669]: I1008 21:13:50.672150 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tgph4" event={"ID":"e78c98eb-ee71-4877-92f0-edfa9eaca8e5","Type":"ContainerStarted","Data":"82c4e125539f9dfa9076d831fa853f4ff3ae410112f28ab7b7f4f587606bf3c9"} Oct 08 21:13:51 crc kubenswrapper[4669]: I1008 21:13:51.684797 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tgph4" event={"ID":"e78c98eb-ee71-4877-92f0-edfa9eaca8e5","Type":"ContainerStarted","Data":"028ac55b56ff6ff9863d90fdc7ded249c1a80abc30265eb5cf9c4cb02cdd2789"} Oct 08 21:13:51 crc kubenswrapper[4669]: I1008 21:13:51.723506 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tgph4" podStartSLOduration=2.30950072 podStartE2EDuration="2.72347447s" podCreationTimestamp="2025-10-08 21:13:49 +0000 UTC" firstStartedPulling="2025-10-08 21:13:49.985357933 +0000 UTC m=+1749.678168606" lastFinishedPulling="2025-10-08 21:13:50.399331683 +0000 UTC m=+1750.092142356" observedRunningTime="2025-10-08 21:13:51.712893607 +0000 UTC m=+1751.405704320" watchObservedRunningTime="2025-10-08 21:13:51.72347447 +0000 UTC m=+1751.416285183" Oct 08 21:13:58 crc kubenswrapper[4669]: I1008 21:13:58.625096 4669 scope.go:117] "RemoveContainer" containerID="abd9f109a524eacf76eb809d428126fda858bae591a1abadc18528b1791e5682" Oct 08 21:13:58 crc kubenswrapper[4669]: I1008 21:13:58.665311 4669 scope.go:117] "RemoveContainer" containerID="85829c24cdb7bf9fb518f37474e69d1fd3d00723a5bbeb1c7426c0e30629c33d" Oct 08 
21:13:58 crc kubenswrapper[4669]: I1008 21:13:58.723453 4669 scope.go:117] "RemoveContainer" containerID="83b49b92310d005ab3c938c263f2aa57b56a0ef10348756cd3d16e14bc67e1cc" Oct 08 21:14:01 crc kubenswrapper[4669]: I1008 21:14:01.337473 4669 scope.go:117] "RemoveContainer" containerID="92ec41f270c02e5372a81a61c641b036c617c32af9093736d30bfad2ba880074" Oct 08 21:14:01 crc kubenswrapper[4669]: E1008 21:14:01.337999 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:14:16 crc kubenswrapper[4669]: I1008 21:14:16.331572 4669 scope.go:117] "RemoveContainer" containerID="92ec41f270c02e5372a81a61c641b036c617c32af9093736d30bfad2ba880074" Oct 08 21:14:16 crc kubenswrapper[4669]: I1008 21:14:16.939924 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" event={"ID":"39c9bcf2-9580-4534-8c7e-886bd4aff469","Type":"ContainerStarted","Data":"6d65055a3906618fe836fadf5c57a56b63f93978b81babdb5d1fe796cee4cdf9"} Oct 08 21:14:38 crc kubenswrapper[4669]: I1008 21:14:38.118367 4669 generic.go:334] "Generic (PLEG): container finished" podID="e78c98eb-ee71-4877-92f0-edfa9eaca8e5" containerID="028ac55b56ff6ff9863d90fdc7ded249c1a80abc30265eb5cf9c4cb02cdd2789" exitCode=0 Oct 08 21:14:38 crc kubenswrapper[4669]: I1008 21:14:38.118428 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tgph4" event={"ID":"e78c98eb-ee71-4877-92f0-edfa9eaca8e5","Type":"ContainerDied","Data":"028ac55b56ff6ff9863d90fdc7ded249c1a80abc30265eb5cf9c4cb02cdd2789"} Oct 08 21:14:39 crc kubenswrapper[4669]: 
I1008 21:14:39.544280 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tgph4" Oct 08 21:14:39 crc kubenswrapper[4669]: I1008 21:14:39.638688 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrqbk\" (UniqueName: \"kubernetes.io/projected/e78c98eb-ee71-4877-92f0-edfa9eaca8e5-kube-api-access-rrqbk\") pod \"e78c98eb-ee71-4877-92f0-edfa9eaca8e5\" (UID: \"e78c98eb-ee71-4877-92f0-edfa9eaca8e5\") " Oct 08 21:14:39 crc kubenswrapper[4669]: I1008 21:14:39.638894 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e78c98eb-ee71-4877-92f0-edfa9eaca8e5-inventory\") pod \"e78c98eb-ee71-4877-92f0-edfa9eaca8e5\" (UID: \"e78c98eb-ee71-4877-92f0-edfa9eaca8e5\") " Oct 08 21:14:39 crc kubenswrapper[4669]: I1008 21:14:39.639024 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e78c98eb-ee71-4877-92f0-edfa9eaca8e5-ssh-key\") pod \"e78c98eb-ee71-4877-92f0-edfa9eaca8e5\" (UID: \"e78c98eb-ee71-4877-92f0-edfa9eaca8e5\") " Oct 08 21:14:39 crc kubenswrapper[4669]: I1008 21:14:39.646461 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e78c98eb-ee71-4877-92f0-edfa9eaca8e5-kube-api-access-rrqbk" (OuterVolumeSpecName: "kube-api-access-rrqbk") pod "e78c98eb-ee71-4877-92f0-edfa9eaca8e5" (UID: "e78c98eb-ee71-4877-92f0-edfa9eaca8e5"). InnerVolumeSpecName "kube-api-access-rrqbk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:14:39 crc kubenswrapper[4669]: I1008 21:14:39.676728 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e78c98eb-ee71-4877-92f0-edfa9eaca8e5-inventory" (OuterVolumeSpecName: "inventory") pod "e78c98eb-ee71-4877-92f0-edfa9eaca8e5" (UID: "e78c98eb-ee71-4877-92f0-edfa9eaca8e5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:14:39 crc kubenswrapper[4669]: I1008 21:14:39.690005 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e78c98eb-ee71-4877-92f0-edfa9eaca8e5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e78c98eb-ee71-4877-92f0-edfa9eaca8e5" (UID: "e78c98eb-ee71-4877-92f0-edfa9eaca8e5"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:14:39 crc kubenswrapper[4669]: I1008 21:14:39.742343 4669 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e78c98eb-ee71-4877-92f0-edfa9eaca8e5-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 08 21:14:39 crc kubenswrapper[4669]: I1008 21:14:39.742399 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrqbk\" (UniqueName: \"kubernetes.io/projected/e78c98eb-ee71-4877-92f0-edfa9eaca8e5-kube-api-access-rrqbk\") on node \"crc\" DevicePath \"\"" Oct 08 21:14:39 crc kubenswrapper[4669]: I1008 21:14:39.742426 4669 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e78c98eb-ee71-4877-92f0-edfa9eaca8e5-inventory\") on node \"crc\" DevicePath \"\"" Oct 08 21:14:40 crc kubenswrapper[4669]: I1008 21:14:40.141054 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tgph4" 
event={"ID":"e78c98eb-ee71-4877-92f0-edfa9eaca8e5","Type":"ContainerDied","Data":"82c4e125539f9dfa9076d831fa853f4ff3ae410112f28ab7b7f4f587606bf3c9"} Oct 08 21:14:40 crc kubenswrapper[4669]: I1008 21:14:40.141100 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82c4e125539f9dfa9076d831fa853f4ff3ae410112f28ab7b7f4f587606bf3c9" Oct 08 21:14:40 crc kubenswrapper[4669]: I1008 21:14:40.141482 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tgph4" Oct 08 21:14:40 crc kubenswrapper[4669]: I1008 21:14:40.234415 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-m6tgw"] Oct 08 21:14:40 crc kubenswrapper[4669]: E1008 21:14:40.234881 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e78c98eb-ee71-4877-92f0-edfa9eaca8e5" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 08 21:14:40 crc kubenswrapper[4669]: I1008 21:14:40.234909 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="e78c98eb-ee71-4877-92f0-edfa9eaca8e5" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 08 21:14:40 crc kubenswrapper[4669]: I1008 21:14:40.235100 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="e78c98eb-ee71-4877-92f0-edfa9eaca8e5" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 08 21:14:40 crc kubenswrapper[4669]: I1008 21:14:40.235775 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-m6tgw" Oct 08 21:14:40 crc kubenswrapper[4669]: I1008 21:14:40.238651 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 08 21:14:40 crc kubenswrapper[4669]: I1008 21:14:40.240238 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 08 21:14:40 crc kubenswrapper[4669]: I1008 21:14:40.247163 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 08 21:14:40 crc kubenswrapper[4669]: I1008 21:14:40.247482 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9d8p9" Oct 08 21:14:40 crc kubenswrapper[4669]: I1008 21:14:40.248822 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-m6tgw"] Oct 08 21:14:40 crc kubenswrapper[4669]: I1008 21:14:40.252805 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcn74\" (UniqueName: \"kubernetes.io/projected/f598594a-891b-4339-99d1-e10f7c3844af-kube-api-access-pcn74\") pod \"ssh-known-hosts-edpm-deployment-m6tgw\" (UID: \"f598594a-891b-4339-99d1-e10f7c3844af\") " pod="openstack/ssh-known-hosts-edpm-deployment-m6tgw" Oct 08 21:14:40 crc kubenswrapper[4669]: I1008 21:14:40.252931 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f598594a-891b-4339-99d1-e10f7c3844af-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-m6tgw\" (UID: \"f598594a-891b-4339-99d1-e10f7c3844af\") " pod="openstack/ssh-known-hosts-edpm-deployment-m6tgw" Oct 08 21:14:40 crc kubenswrapper[4669]: I1008 21:14:40.253187 4669 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f598594a-891b-4339-99d1-e10f7c3844af-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-m6tgw\" (UID: \"f598594a-891b-4339-99d1-e10f7c3844af\") " pod="openstack/ssh-known-hosts-edpm-deployment-m6tgw" Oct 08 21:14:40 crc kubenswrapper[4669]: I1008 21:14:40.354375 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcn74\" (UniqueName: \"kubernetes.io/projected/f598594a-891b-4339-99d1-e10f7c3844af-kube-api-access-pcn74\") pod \"ssh-known-hosts-edpm-deployment-m6tgw\" (UID: \"f598594a-891b-4339-99d1-e10f7c3844af\") " pod="openstack/ssh-known-hosts-edpm-deployment-m6tgw" Oct 08 21:14:40 crc kubenswrapper[4669]: I1008 21:14:40.354713 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f598594a-891b-4339-99d1-e10f7c3844af-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-m6tgw\" (UID: \"f598594a-891b-4339-99d1-e10f7c3844af\") " pod="openstack/ssh-known-hosts-edpm-deployment-m6tgw" Oct 08 21:14:40 crc kubenswrapper[4669]: I1008 21:14:40.354822 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f598594a-891b-4339-99d1-e10f7c3844af-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-m6tgw\" (UID: \"f598594a-891b-4339-99d1-e10f7c3844af\") " pod="openstack/ssh-known-hosts-edpm-deployment-m6tgw" Oct 08 21:14:40 crc kubenswrapper[4669]: I1008 21:14:40.358382 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f598594a-891b-4339-99d1-e10f7c3844af-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-m6tgw\" (UID: \"f598594a-891b-4339-99d1-e10f7c3844af\") " pod="openstack/ssh-known-hosts-edpm-deployment-m6tgw" Oct 
08 21:14:40 crc kubenswrapper[4669]: I1008 21:14:40.359455 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f598594a-891b-4339-99d1-e10f7c3844af-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-m6tgw\" (UID: \"f598594a-891b-4339-99d1-e10f7c3844af\") " pod="openstack/ssh-known-hosts-edpm-deployment-m6tgw" Oct 08 21:14:40 crc kubenswrapper[4669]: I1008 21:14:40.369348 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcn74\" (UniqueName: \"kubernetes.io/projected/f598594a-891b-4339-99d1-e10f7c3844af-kube-api-access-pcn74\") pod \"ssh-known-hosts-edpm-deployment-m6tgw\" (UID: \"f598594a-891b-4339-99d1-e10f7c3844af\") " pod="openstack/ssh-known-hosts-edpm-deployment-m6tgw" Oct 08 21:14:40 crc kubenswrapper[4669]: I1008 21:14:40.557987 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-m6tgw" Oct 08 21:14:41 crc kubenswrapper[4669]: I1008 21:14:41.140709 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-m6tgw"] Oct 08 21:14:41 crc kubenswrapper[4669]: I1008 21:14:41.153201 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-m6tgw" event={"ID":"f598594a-891b-4339-99d1-e10f7c3844af","Type":"ContainerStarted","Data":"63f4517afd68746b941925eea39245719a1414b96f5dba535836b09056f16484"} Oct 08 21:14:41 crc kubenswrapper[4669]: I1008 21:14:41.797136 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 08 21:14:42 crc kubenswrapper[4669]: I1008 21:14:42.161937 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-m6tgw" event={"ID":"f598594a-891b-4339-99d1-e10f7c3844af","Type":"ContainerStarted","Data":"914e8f43483d8f950b1d73a83e7e706d59f9ce56de39165316d68eb2bcba37ae"} Oct 08 
21:14:42 crc kubenswrapper[4669]: I1008 21:14:42.190127 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-m6tgw" podStartSLOduration=1.531970763 podStartE2EDuration="2.190106506s" podCreationTimestamp="2025-10-08 21:14:40 +0000 UTC" firstStartedPulling="2025-10-08 21:14:41.13624842 +0000 UTC m=+1800.829059133" lastFinishedPulling="2025-10-08 21:14:41.794384203 +0000 UTC m=+1801.487194876" observedRunningTime="2025-10-08 21:14:42.178870738 +0000 UTC m=+1801.871681421" watchObservedRunningTime="2025-10-08 21:14:42.190106506 +0000 UTC m=+1801.882917179" Oct 08 21:14:49 crc kubenswrapper[4669]: I1008 21:14:49.221957 4669 generic.go:334] "Generic (PLEG): container finished" podID="f598594a-891b-4339-99d1-e10f7c3844af" containerID="914e8f43483d8f950b1d73a83e7e706d59f9ce56de39165316d68eb2bcba37ae" exitCode=0 Oct 08 21:14:49 crc kubenswrapper[4669]: I1008 21:14:49.222000 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-m6tgw" event={"ID":"f598594a-891b-4339-99d1-e10f7c3844af","Type":"ContainerDied","Data":"914e8f43483d8f950b1d73a83e7e706d59f9ce56de39165316d68eb2bcba37ae"} Oct 08 21:14:50 crc kubenswrapper[4669]: I1008 21:14:50.642256 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-m6tgw" Oct 08 21:14:50 crc kubenswrapper[4669]: I1008 21:14:50.746817 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcn74\" (UniqueName: \"kubernetes.io/projected/f598594a-891b-4339-99d1-e10f7c3844af-kube-api-access-pcn74\") pod \"f598594a-891b-4339-99d1-e10f7c3844af\" (UID: \"f598594a-891b-4339-99d1-e10f7c3844af\") " Oct 08 21:14:50 crc kubenswrapper[4669]: I1008 21:14:50.746888 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f598594a-891b-4339-99d1-e10f7c3844af-inventory-0\") pod \"f598594a-891b-4339-99d1-e10f7c3844af\" (UID: \"f598594a-891b-4339-99d1-e10f7c3844af\") " Oct 08 21:14:50 crc kubenswrapper[4669]: I1008 21:14:50.746918 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f598594a-891b-4339-99d1-e10f7c3844af-ssh-key-openstack-edpm-ipam\") pod \"f598594a-891b-4339-99d1-e10f7c3844af\" (UID: \"f598594a-891b-4339-99d1-e10f7c3844af\") " Oct 08 21:14:50 crc kubenswrapper[4669]: I1008 21:14:50.756607 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f598594a-891b-4339-99d1-e10f7c3844af-kube-api-access-pcn74" (OuterVolumeSpecName: "kube-api-access-pcn74") pod "f598594a-891b-4339-99d1-e10f7c3844af" (UID: "f598594a-891b-4339-99d1-e10f7c3844af"). InnerVolumeSpecName "kube-api-access-pcn74". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:14:50 crc kubenswrapper[4669]: I1008 21:14:50.799041 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f598594a-891b-4339-99d1-e10f7c3844af-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f598594a-891b-4339-99d1-e10f7c3844af" (UID: "f598594a-891b-4339-99d1-e10f7c3844af"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:14:50 crc kubenswrapper[4669]: I1008 21:14:50.807767 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f598594a-891b-4339-99d1-e10f7c3844af-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "f598594a-891b-4339-99d1-e10f7c3844af" (UID: "f598594a-891b-4339-99d1-e10f7c3844af"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:14:50 crc kubenswrapper[4669]: I1008 21:14:50.849701 4669 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f598594a-891b-4339-99d1-e10f7c3844af-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 08 21:14:50 crc kubenswrapper[4669]: I1008 21:14:50.849738 4669 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f598594a-891b-4339-99d1-e10f7c3844af-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 08 21:14:50 crc kubenswrapper[4669]: I1008 21:14:50.849750 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcn74\" (UniqueName: \"kubernetes.io/projected/f598594a-891b-4339-99d1-e10f7c3844af-kube-api-access-pcn74\") on node \"crc\" DevicePath \"\"" Oct 08 21:14:51 crc kubenswrapper[4669]: I1008 21:14:51.243703 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-m6tgw" 
event={"ID":"f598594a-891b-4339-99d1-e10f7c3844af","Type":"ContainerDied","Data":"63f4517afd68746b941925eea39245719a1414b96f5dba535836b09056f16484"} Oct 08 21:14:51 crc kubenswrapper[4669]: I1008 21:14:51.243746 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63f4517afd68746b941925eea39245719a1414b96f5dba535836b09056f16484" Oct 08 21:14:51 crc kubenswrapper[4669]: I1008 21:14:51.243780 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-m6tgw" Oct 08 21:14:51 crc kubenswrapper[4669]: I1008 21:14:51.307763 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-cl595"] Oct 08 21:14:51 crc kubenswrapper[4669]: E1008 21:14:51.308125 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f598594a-891b-4339-99d1-e10f7c3844af" containerName="ssh-known-hosts-edpm-deployment" Oct 08 21:14:51 crc kubenswrapper[4669]: I1008 21:14:51.308141 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="f598594a-891b-4339-99d1-e10f7c3844af" containerName="ssh-known-hosts-edpm-deployment" Oct 08 21:14:51 crc kubenswrapper[4669]: I1008 21:14:51.308331 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="f598594a-891b-4339-99d1-e10f7c3844af" containerName="ssh-known-hosts-edpm-deployment" Oct 08 21:14:51 crc kubenswrapper[4669]: I1008 21:14:51.308939 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cl595" Oct 08 21:14:51 crc kubenswrapper[4669]: I1008 21:14:51.310605 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 08 21:14:51 crc kubenswrapper[4669]: I1008 21:14:51.310823 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 08 21:14:51 crc kubenswrapper[4669]: I1008 21:14:51.311034 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 08 21:14:51 crc kubenswrapper[4669]: I1008 21:14:51.311186 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9d8p9" Oct 08 21:14:51 crc kubenswrapper[4669]: I1008 21:14:51.321719 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-cl595"] Oct 08 21:14:51 crc kubenswrapper[4669]: I1008 21:14:51.461992 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw4hc\" (UniqueName: \"kubernetes.io/projected/2557fef7-913e-4188-93e7-4a60c4b4c918-kube-api-access-fw4hc\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-cl595\" (UID: \"2557fef7-913e-4188-93e7-4a60c4b4c918\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cl595" Oct 08 21:14:51 crc kubenswrapper[4669]: I1008 21:14:51.462148 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2557fef7-913e-4188-93e7-4a60c4b4c918-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-cl595\" (UID: \"2557fef7-913e-4188-93e7-4a60c4b4c918\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cl595" Oct 08 21:14:51 crc kubenswrapper[4669]: I1008 21:14:51.462341 4669 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2557fef7-913e-4188-93e7-4a60c4b4c918-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-cl595\" (UID: \"2557fef7-913e-4188-93e7-4a60c4b4c918\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cl595" Oct 08 21:14:51 crc kubenswrapper[4669]: I1008 21:14:51.563941 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw4hc\" (UniqueName: \"kubernetes.io/projected/2557fef7-913e-4188-93e7-4a60c4b4c918-kube-api-access-fw4hc\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-cl595\" (UID: \"2557fef7-913e-4188-93e7-4a60c4b4c918\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cl595" Oct 08 21:14:51 crc kubenswrapper[4669]: I1008 21:14:51.564035 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2557fef7-913e-4188-93e7-4a60c4b4c918-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-cl595\" (UID: \"2557fef7-913e-4188-93e7-4a60c4b4c918\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cl595" Oct 08 21:14:51 crc kubenswrapper[4669]: I1008 21:14:51.564092 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2557fef7-913e-4188-93e7-4a60c4b4c918-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-cl595\" (UID: \"2557fef7-913e-4188-93e7-4a60c4b4c918\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cl595" Oct 08 21:14:51 crc kubenswrapper[4669]: I1008 21:14:51.569412 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2557fef7-913e-4188-93e7-4a60c4b4c918-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-cl595\" (UID: \"2557fef7-913e-4188-93e7-4a60c4b4c918\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cl595" Oct 08 21:14:51 crc kubenswrapper[4669]: I1008 21:14:51.569437 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2557fef7-913e-4188-93e7-4a60c4b4c918-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-cl595\" (UID: \"2557fef7-913e-4188-93e7-4a60c4b4c918\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cl595" Oct 08 21:14:51 crc kubenswrapper[4669]: I1008 21:14:51.581426 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw4hc\" (UniqueName: \"kubernetes.io/projected/2557fef7-913e-4188-93e7-4a60c4b4c918-kube-api-access-fw4hc\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-cl595\" (UID: \"2557fef7-913e-4188-93e7-4a60c4b4c918\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cl595" Oct 08 21:14:51 crc kubenswrapper[4669]: I1008 21:14:51.629153 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cl595" Oct 08 21:14:52 crc kubenswrapper[4669]: I1008 21:14:52.250795 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-cl595"] Oct 08 21:14:53 crc kubenswrapper[4669]: I1008 21:14:53.263889 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cl595" event={"ID":"2557fef7-913e-4188-93e7-4a60c4b4c918","Type":"ContainerStarted","Data":"55f0593ce4922f58234f6b7187e3e65ddde1b39e6767fc6aa7ecde82e9b30f8a"} Oct 08 21:14:53 crc kubenswrapper[4669]: I1008 21:14:53.264320 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cl595" event={"ID":"2557fef7-913e-4188-93e7-4a60c4b4c918","Type":"ContainerStarted","Data":"e1b18b728e1153f44df1e4caba54a8bf0cb6edd6973a6ab77b3817e53bfafbf0"} Oct 08 21:14:53 crc kubenswrapper[4669]: I1008 21:14:53.284552 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cl595" podStartSLOduration=1.849343644 podStartE2EDuration="2.284513468s" podCreationTimestamp="2025-10-08 21:14:51 +0000 UTC" firstStartedPulling="2025-10-08 21:14:52.25615533 +0000 UTC m=+1811.948966013" lastFinishedPulling="2025-10-08 21:14:52.691325164 +0000 UTC m=+1812.384135837" observedRunningTime="2025-10-08 21:14:53.279045658 +0000 UTC m=+1812.971856331" watchObservedRunningTime="2025-10-08 21:14:53.284513468 +0000 UTC m=+1812.977324141" Oct 08 21:15:00 crc kubenswrapper[4669]: I1008 21:15:00.148138 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332635-49ptz"] Oct 08 21:15:00 crc kubenswrapper[4669]: I1008 21:15:00.149854 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332635-49ptz" Oct 08 21:15:00 crc kubenswrapper[4669]: I1008 21:15:00.151960 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 08 21:15:00 crc kubenswrapper[4669]: I1008 21:15:00.152037 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 08 21:15:00 crc kubenswrapper[4669]: I1008 21:15:00.179303 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332635-49ptz"] Oct 08 21:15:00 crc kubenswrapper[4669]: I1008 21:15:00.239331 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ceda1af6-74e5-4e5f-af7e-6bd111b26c5c-config-volume\") pod \"collect-profiles-29332635-49ptz\" (UID: \"ceda1af6-74e5-4e5f-af7e-6bd111b26c5c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332635-49ptz" Oct 08 21:15:00 crc kubenswrapper[4669]: I1008 21:15:00.239674 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrd78\" (UniqueName: \"kubernetes.io/projected/ceda1af6-74e5-4e5f-af7e-6bd111b26c5c-kube-api-access-rrd78\") pod \"collect-profiles-29332635-49ptz\" (UID: \"ceda1af6-74e5-4e5f-af7e-6bd111b26c5c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332635-49ptz" Oct 08 21:15:00 crc kubenswrapper[4669]: I1008 21:15:00.239789 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ceda1af6-74e5-4e5f-af7e-6bd111b26c5c-secret-volume\") pod \"collect-profiles-29332635-49ptz\" (UID: \"ceda1af6-74e5-4e5f-af7e-6bd111b26c5c\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29332635-49ptz" Oct 08 21:15:00 crc kubenswrapper[4669]: I1008 21:15:00.341431 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrd78\" (UniqueName: \"kubernetes.io/projected/ceda1af6-74e5-4e5f-af7e-6bd111b26c5c-kube-api-access-rrd78\") pod \"collect-profiles-29332635-49ptz\" (UID: \"ceda1af6-74e5-4e5f-af7e-6bd111b26c5c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332635-49ptz" Oct 08 21:15:00 crc kubenswrapper[4669]: I1008 21:15:00.341551 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ceda1af6-74e5-4e5f-af7e-6bd111b26c5c-secret-volume\") pod \"collect-profiles-29332635-49ptz\" (UID: \"ceda1af6-74e5-4e5f-af7e-6bd111b26c5c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332635-49ptz" Oct 08 21:15:00 crc kubenswrapper[4669]: I1008 21:15:00.341699 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ceda1af6-74e5-4e5f-af7e-6bd111b26c5c-config-volume\") pod \"collect-profiles-29332635-49ptz\" (UID: \"ceda1af6-74e5-4e5f-af7e-6bd111b26c5c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332635-49ptz" Oct 08 21:15:00 crc kubenswrapper[4669]: I1008 21:15:00.342805 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ceda1af6-74e5-4e5f-af7e-6bd111b26c5c-config-volume\") pod \"collect-profiles-29332635-49ptz\" (UID: \"ceda1af6-74e5-4e5f-af7e-6bd111b26c5c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332635-49ptz" Oct 08 21:15:00 crc kubenswrapper[4669]: I1008 21:15:00.347754 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/ceda1af6-74e5-4e5f-af7e-6bd111b26c5c-secret-volume\") pod \"collect-profiles-29332635-49ptz\" (UID: \"ceda1af6-74e5-4e5f-af7e-6bd111b26c5c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332635-49ptz" Oct 08 21:15:00 crc kubenswrapper[4669]: I1008 21:15:00.368194 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrd78\" (UniqueName: \"kubernetes.io/projected/ceda1af6-74e5-4e5f-af7e-6bd111b26c5c-kube-api-access-rrd78\") pod \"collect-profiles-29332635-49ptz\" (UID: \"ceda1af6-74e5-4e5f-af7e-6bd111b26c5c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332635-49ptz" Oct 08 21:15:00 crc kubenswrapper[4669]: I1008 21:15:00.470177 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332635-49ptz" Oct 08 21:15:00 crc kubenswrapper[4669]: I1008 21:15:00.954412 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332635-49ptz"] Oct 08 21:15:01 crc kubenswrapper[4669]: I1008 21:15:01.329240 4669 generic.go:334] "Generic (PLEG): container finished" podID="2557fef7-913e-4188-93e7-4a60c4b4c918" containerID="55f0593ce4922f58234f6b7187e3e65ddde1b39e6767fc6aa7ecde82e9b30f8a" exitCode=0 Oct 08 21:15:01 crc kubenswrapper[4669]: I1008 21:15:01.329301 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cl595" event={"ID":"2557fef7-913e-4188-93e7-4a60c4b4c918","Type":"ContainerDied","Data":"55f0593ce4922f58234f6b7187e3e65ddde1b39e6767fc6aa7ecde82e9b30f8a"} Oct 08 21:15:01 crc kubenswrapper[4669]: I1008 21:15:01.369911 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332635-49ptz" 
event={"ID":"ceda1af6-74e5-4e5f-af7e-6bd111b26c5c","Type":"ContainerStarted","Data":"a9ccf28d1ee09100013d5af37545789d46b08c3d8ba94f550d64ef05da8d3db3"} Oct 08 21:15:01 crc kubenswrapper[4669]: I1008 21:15:01.369960 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332635-49ptz" event={"ID":"ceda1af6-74e5-4e5f-af7e-6bd111b26c5c","Type":"ContainerStarted","Data":"0db1ce98b5f6bfbecd4f93fd1f8a969a778c58b2e0ff4d767ff9d8ce2cf99842"} Oct 08 21:15:01 crc kubenswrapper[4669]: I1008 21:15:01.380737 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29332635-49ptz" podStartSLOduration=1.380725228 podStartE2EDuration="1.380725228s" podCreationTimestamp="2025-10-08 21:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:15:01.377085767 +0000 UTC m=+1821.069896430" watchObservedRunningTime="2025-10-08 21:15:01.380725228 +0000 UTC m=+1821.073535901" Oct 08 21:15:02 crc kubenswrapper[4669]: I1008 21:15:02.352685 4669 generic.go:334] "Generic (PLEG): container finished" podID="ceda1af6-74e5-4e5f-af7e-6bd111b26c5c" containerID="a9ccf28d1ee09100013d5af37545789d46b08c3d8ba94f550d64ef05da8d3db3" exitCode=0 Oct 08 21:15:02 crc kubenswrapper[4669]: I1008 21:15:02.353419 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332635-49ptz" event={"ID":"ceda1af6-74e5-4e5f-af7e-6bd111b26c5c","Type":"ContainerDied","Data":"a9ccf28d1ee09100013d5af37545789d46b08c3d8ba94f550d64ef05da8d3db3"} Oct 08 21:15:02 crc kubenswrapper[4669]: I1008 21:15:02.766779 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cl595" Oct 08 21:15:02 crc kubenswrapper[4669]: I1008 21:15:02.788174 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2557fef7-913e-4188-93e7-4a60c4b4c918-inventory\") pod \"2557fef7-913e-4188-93e7-4a60c4b4c918\" (UID: \"2557fef7-913e-4188-93e7-4a60c4b4c918\") " Oct 08 21:15:02 crc kubenswrapper[4669]: I1008 21:15:02.788284 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fw4hc\" (UniqueName: \"kubernetes.io/projected/2557fef7-913e-4188-93e7-4a60c4b4c918-kube-api-access-fw4hc\") pod \"2557fef7-913e-4188-93e7-4a60c4b4c918\" (UID: \"2557fef7-913e-4188-93e7-4a60c4b4c918\") " Oct 08 21:15:02 crc kubenswrapper[4669]: I1008 21:15:02.788474 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2557fef7-913e-4188-93e7-4a60c4b4c918-ssh-key\") pod \"2557fef7-913e-4188-93e7-4a60c4b4c918\" (UID: \"2557fef7-913e-4188-93e7-4a60c4b4c918\") " Oct 08 21:15:02 crc kubenswrapper[4669]: I1008 21:15:02.793734 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2557fef7-913e-4188-93e7-4a60c4b4c918-kube-api-access-fw4hc" (OuterVolumeSpecName: "kube-api-access-fw4hc") pod "2557fef7-913e-4188-93e7-4a60c4b4c918" (UID: "2557fef7-913e-4188-93e7-4a60c4b4c918"). InnerVolumeSpecName "kube-api-access-fw4hc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:15:02 crc kubenswrapper[4669]: I1008 21:15:02.821395 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2557fef7-913e-4188-93e7-4a60c4b4c918-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2557fef7-913e-4188-93e7-4a60c4b4c918" (UID: "2557fef7-913e-4188-93e7-4a60c4b4c918"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:15:02 crc kubenswrapper[4669]: I1008 21:15:02.824675 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2557fef7-913e-4188-93e7-4a60c4b4c918-inventory" (OuterVolumeSpecName: "inventory") pod "2557fef7-913e-4188-93e7-4a60c4b4c918" (UID: "2557fef7-913e-4188-93e7-4a60c4b4c918"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:15:02 crc kubenswrapper[4669]: I1008 21:15:02.890980 4669 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2557fef7-913e-4188-93e7-4a60c4b4c918-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 08 21:15:02 crc kubenswrapper[4669]: I1008 21:15:02.891240 4669 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2557fef7-913e-4188-93e7-4a60c4b4c918-inventory\") on node \"crc\" DevicePath \"\"" Oct 08 21:15:02 crc kubenswrapper[4669]: I1008 21:15:02.891252 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fw4hc\" (UniqueName: \"kubernetes.io/projected/2557fef7-913e-4188-93e7-4a60c4b4c918-kube-api-access-fw4hc\") on node \"crc\" DevicePath \"\"" Oct 08 21:15:03 crc kubenswrapper[4669]: I1008 21:15:03.363308 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cl595" event={"ID":"2557fef7-913e-4188-93e7-4a60c4b4c918","Type":"ContainerDied","Data":"e1b18b728e1153f44df1e4caba54a8bf0cb6edd6973a6ab77b3817e53bfafbf0"} Oct 08 21:15:03 crc kubenswrapper[4669]: I1008 21:15:03.363344 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cl595" Oct 08 21:15:03 crc kubenswrapper[4669]: I1008 21:15:03.363349 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1b18b728e1153f44df1e4caba54a8bf0cb6edd6973a6ab77b3817e53bfafbf0" Oct 08 21:15:03 crc kubenswrapper[4669]: I1008 21:15:03.728201 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332635-49ptz" Oct 08 21:15:03 crc kubenswrapper[4669]: I1008 21:15:03.808476 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrd78\" (UniqueName: \"kubernetes.io/projected/ceda1af6-74e5-4e5f-af7e-6bd111b26c5c-kube-api-access-rrd78\") pod \"ceda1af6-74e5-4e5f-af7e-6bd111b26c5c\" (UID: \"ceda1af6-74e5-4e5f-af7e-6bd111b26c5c\") " Oct 08 21:15:03 crc kubenswrapper[4669]: I1008 21:15:03.808572 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ceda1af6-74e5-4e5f-af7e-6bd111b26c5c-secret-volume\") pod \"ceda1af6-74e5-4e5f-af7e-6bd111b26c5c\" (UID: \"ceda1af6-74e5-4e5f-af7e-6bd111b26c5c\") " Oct 08 21:15:03 crc kubenswrapper[4669]: I1008 21:15:03.808878 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ceda1af6-74e5-4e5f-af7e-6bd111b26c5c-config-volume\") pod \"ceda1af6-74e5-4e5f-af7e-6bd111b26c5c\" (UID: \"ceda1af6-74e5-4e5f-af7e-6bd111b26c5c\") " Oct 08 21:15:03 crc kubenswrapper[4669]: I1008 21:15:03.809851 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ceda1af6-74e5-4e5f-af7e-6bd111b26c5c-config-volume" (OuterVolumeSpecName: "config-volume") pod "ceda1af6-74e5-4e5f-af7e-6bd111b26c5c" (UID: "ceda1af6-74e5-4e5f-af7e-6bd111b26c5c"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:15:03 crc kubenswrapper[4669]: I1008 21:15:03.814115 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ceda1af6-74e5-4e5f-af7e-6bd111b26c5c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ceda1af6-74e5-4e5f-af7e-6bd111b26c5c" (UID: "ceda1af6-74e5-4e5f-af7e-6bd111b26c5c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:15:03 crc kubenswrapper[4669]: I1008 21:15:03.814428 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ceda1af6-74e5-4e5f-af7e-6bd111b26c5c-kube-api-access-rrd78" (OuterVolumeSpecName: "kube-api-access-rrd78") pod "ceda1af6-74e5-4e5f-af7e-6bd111b26c5c" (UID: "ceda1af6-74e5-4e5f-af7e-6bd111b26c5c"). InnerVolumeSpecName "kube-api-access-rrd78". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:15:03 crc kubenswrapper[4669]: I1008 21:15:03.851142 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sxvrk"] Oct 08 21:15:03 crc kubenswrapper[4669]: E1008 21:15:03.851509 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceda1af6-74e5-4e5f-af7e-6bd111b26c5c" containerName="collect-profiles" Oct 08 21:15:03 crc kubenswrapper[4669]: I1008 21:15:03.851532 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceda1af6-74e5-4e5f-af7e-6bd111b26c5c" containerName="collect-profiles" Oct 08 21:15:03 crc kubenswrapper[4669]: E1008 21:15:03.853583 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2557fef7-913e-4188-93e7-4a60c4b4c918" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 08 21:15:03 crc kubenswrapper[4669]: I1008 21:15:03.853606 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="2557fef7-913e-4188-93e7-4a60c4b4c918" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 08 
21:15:03 crc kubenswrapper[4669]: I1008 21:15:03.854180 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="ceda1af6-74e5-4e5f-af7e-6bd111b26c5c" containerName="collect-profiles" Oct 08 21:15:03 crc kubenswrapper[4669]: I1008 21:15:03.854212 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="2557fef7-913e-4188-93e7-4a60c4b4c918" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 08 21:15:03 crc kubenswrapper[4669]: I1008 21:15:03.855618 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sxvrk" Oct 08 21:15:03 crc kubenswrapper[4669]: I1008 21:15:03.859653 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 08 21:15:03 crc kubenswrapper[4669]: I1008 21:15:03.860204 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9d8p9" Oct 08 21:15:03 crc kubenswrapper[4669]: I1008 21:15:03.860441 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 08 21:15:03 crc kubenswrapper[4669]: I1008 21:15:03.861217 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 08 21:15:03 crc kubenswrapper[4669]: I1008 21:15:03.863756 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sxvrk"] Oct 08 21:15:03 crc kubenswrapper[4669]: I1008 21:15:03.910697 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tnv4\" (UniqueName: \"kubernetes.io/projected/ffbc24f2-90eb-4f42-b2aa-0290921dbb79-kube-api-access-4tnv4\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-sxvrk\" (UID: \"ffbc24f2-90eb-4f42-b2aa-0290921dbb79\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sxvrk" Oct 08 21:15:03 crc kubenswrapper[4669]: I1008 21:15:03.910838 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ffbc24f2-90eb-4f42-b2aa-0290921dbb79-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-sxvrk\" (UID: \"ffbc24f2-90eb-4f42-b2aa-0290921dbb79\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sxvrk" Oct 08 21:15:03 crc kubenswrapper[4669]: I1008 21:15:03.910917 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ffbc24f2-90eb-4f42-b2aa-0290921dbb79-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-sxvrk\" (UID: \"ffbc24f2-90eb-4f42-b2aa-0290921dbb79\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sxvrk" Oct 08 21:15:03 crc kubenswrapper[4669]: I1008 21:15:03.911003 4669 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ceda1af6-74e5-4e5f-af7e-6bd111b26c5c-config-volume\") on node \"crc\" DevicePath \"\"" Oct 08 21:15:03 crc kubenswrapper[4669]: I1008 21:15:03.911023 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrd78\" (UniqueName: \"kubernetes.io/projected/ceda1af6-74e5-4e5f-af7e-6bd111b26c5c-kube-api-access-rrd78\") on node \"crc\" DevicePath \"\"" Oct 08 21:15:03 crc kubenswrapper[4669]: I1008 21:15:03.911038 4669 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ceda1af6-74e5-4e5f-af7e-6bd111b26c5c-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 08 21:15:04 crc kubenswrapper[4669]: I1008 21:15:04.013201 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ffbc24f2-90eb-4f42-b2aa-0290921dbb79-ssh-key\") 
pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-sxvrk\" (UID: \"ffbc24f2-90eb-4f42-b2aa-0290921dbb79\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sxvrk" Oct 08 21:15:04 crc kubenswrapper[4669]: I1008 21:15:04.013278 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ffbc24f2-90eb-4f42-b2aa-0290921dbb79-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-sxvrk\" (UID: \"ffbc24f2-90eb-4f42-b2aa-0290921dbb79\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sxvrk" Oct 08 21:15:04 crc kubenswrapper[4669]: I1008 21:15:04.013346 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tnv4\" (UniqueName: \"kubernetes.io/projected/ffbc24f2-90eb-4f42-b2aa-0290921dbb79-kube-api-access-4tnv4\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-sxvrk\" (UID: \"ffbc24f2-90eb-4f42-b2aa-0290921dbb79\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sxvrk" Oct 08 21:15:04 crc kubenswrapper[4669]: I1008 21:15:04.017572 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ffbc24f2-90eb-4f42-b2aa-0290921dbb79-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-sxvrk\" (UID: \"ffbc24f2-90eb-4f42-b2aa-0290921dbb79\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sxvrk" Oct 08 21:15:04 crc kubenswrapper[4669]: I1008 21:15:04.020032 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ffbc24f2-90eb-4f42-b2aa-0290921dbb79-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-sxvrk\" (UID: \"ffbc24f2-90eb-4f42-b2aa-0290921dbb79\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sxvrk" Oct 08 21:15:04 crc kubenswrapper[4669]: I1008 21:15:04.030555 4669 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-4tnv4\" (UniqueName: \"kubernetes.io/projected/ffbc24f2-90eb-4f42-b2aa-0290921dbb79-kube-api-access-4tnv4\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-sxvrk\" (UID: \"ffbc24f2-90eb-4f42-b2aa-0290921dbb79\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sxvrk" Oct 08 21:15:04 crc kubenswrapper[4669]: I1008 21:15:04.208913 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sxvrk" Oct 08 21:15:04 crc kubenswrapper[4669]: I1008 21:15:04.378130 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332635-49ptz" event={"ID":"ceda1af6-74e5-4e5f-af7e-6bd111b26c5c","Type":"ContainerDied","Data":"0db1ce98b5f6bfbecd4f93fd1f8a969a778c58b2e0ff4d767ff9d8ce2cf99842"} Oct 08 21:15:04 crc kubenswrapper[4669]: I1008 21:15:04.378721 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0db1ce98b5f6bfbecd4f93fd1f8a969a778c58b2e0ff4d767ff9d8ce2cf99842" Oct 08 21:15:04 crc kubenswrapper[4669]: I1008 21:15:04.378179 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332635-49ptz" Oct 08 21:15:04 crc kubenswrapper[4669]: I1008 21:15:04.711760 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sxvrk"] Oct 08 21:15:04 crc kubenswrapper[4669]: W1008 21:15:04.716702 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podffbc24f2_90eb_4f42_b2aa_0290921dbb79.slice/crio-508c8338de7beb3e3a74aa4c5ea405230f6892afaf4752699d7b00a158535515 WatchSource:0}: Error finding container 508c8338de7beb3e3a74aa4c5ea405230f6892afaf4752699d7b00a158535515: Status 404 returned error can't find the container with id 508c8338de7beb3e3a74aa4c5ea405230f6892afaf4752699d7b00a158535515 Oct 08 21:15:05 crc kubenswrapper[4669]: I1008 21:15:05.387661 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sxvrk" event={"ID":"ffbc24f2-90eb-4f42-b2aa-0290921dbb79","Type":"ContainerStarted","Data":"508c8338de7beb3e3a74aa4c5ea405230f6892afaf4752699d7b00a158535515"} Oct 08 21:15:06 crc kubenswrapper[4669]: I1008 21:15:06.395771 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sxvrk" event={"ID":"ffbc24f2-90eb-4f42-b2aa-0290921dbb79","Type":"ContainerStarted","Data":"8fff5405958e005b088cab91aa02b845f866aa3796b4c7c8b8a2182734baf56e"} Oct 08 21:15:06 crc kubenswrapper[4669]: I1008 21:15:06.417574 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sxvrk" podStartSLOduration=2.963891947 podStartE2EDuration="3.41755068s" podCreationTimestamp="2025-10-08 21:15:03 +0000 UTC" firstStartedPulling="2025-10-08 21:15:04.720896032 +0000 UTC m=+1824.413706705" lastFinishedPulling="2025-10-08 21:15:05.174554765 +0000 UTC m=+1824.867365438" 
observedRunningTime="2025-10-08 21:15:06.410429242 +0000 UTC m=+1826.103239915" watchObservedRunningTime="2025-10-08 21:15:06.41755068 +0000 UTC m=+1826.110361363" Oct 08 21:16:21 crc kubenswrapper[4669]: I1008 21:16:21.172129 4669 generic.go:334] "Generic (PLEG): container finished" podID="ffbc24f2-90eb-4f42-b2aa-0290921dbb79" containerID="8fff5405958e005b088cab91aa02b845f866aa3796b4c7c8b8a2182734baf56e" exitCode=0 Oct 08 21:16:21 crc kubenswrapper[4669]: I1008 21:16:21.172315 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sxvrk" event={"ID":"ffbc24f2-90eb-4f42-b2aa-0290921dbb79","Type":"ContainerDied","Data":"8fff5405958e005b088cab91aa02b845f866aa3796b4c7c8b8a2182734baf56e"} Oct 08 21:16:22 crc kubenswrapper[4669]: I1008 21:16:22.620908 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sxvrk" Oct 08 21:16:22 crc kubenswrapper[4669]: I1008 21:16:22.716499 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ffbc24f2-90eb-4f42-b2aa-0290921dbb79-ssh-key\") pod \"ffbc24f2-90eb-4f42-b2aa-0290921dbb79\" (UID: \"ffbc24f2-90eb-4f42-b2aa-0290921dbb79\") " Oct 08 21:16:22 crc kubenswrapper[4669]: I1008 21:16:22.716649 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tnv4\" (UniqueName: \"kubernetes.io/projected/ffbc24f2-90eb-4f42-b2aa-0290921dbb79-kube-api-access-4tnv4\") pod \"ffbc24f2-90eb-4f42-b2aa-0290921dbb79\" (UID: \"ffbc24f2-90eb-4f42-b2aa-0290921dbb79\") " Oct 08 21:16:22 crc kubenswrapper[4669]: I1008 21:16:22.716674 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ffbc24f2-90eb-4f42-b2aa-0290921dbb79-inventory\") pod \"ffbc24f2-90eb-4f42-b2aa-0290921dbb79\" (UID: 
\"ffbc24f2-90eb-4f42-b2aa-0290921dbb79\") " Oct 08 21:16:22 crc kubenswrapper[4669]: I1008 21:16:22.721551 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffbc24f2-90eb-4f42-b2aa-0290921dbb79-kube-api-access-4tnv4" (OuterVolumeSpecName: "kube-api-access-4tnv4") pod "ffbc24f2-90eb-4f42-b2aa-0290921dbb79" (UID: "ffbc24f2-90eb-4f42-b2aa-0290921dbb79"). InnerVolumeSpecName "kube-api-access-4tnv4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:16:22 crc kubenswrapper[4669]: I1008 21:16:22.741933 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffbc24f2-90eb-4f42-b2aa-0290921dbb79-inventory" (OuterVolumeSpecName: "inventory") pod "ffbc24f2-90eb-4f42-b2aa-0290921dbb79" (UID: "ffbc24f2-90eb-4f42-b2aa-0290921dbb79"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:16:22 crc kubenswrapper[4669]: I1008 21:16:22.744017 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffbc24f2-90eb-4f42-b2aa-0290921dbb79-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ffbc24f2-90eb-4f42-b2aa-0290921dbb79" (UID: "ffbc24f2-90eb-4f42-b2aa-0290921dbb79"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:16:22 crc kubenswrapper[4669]: I1008 21:16:22.820162 4669 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ffbc24f2-90eb-4f42-b2aa-0290921dbb79-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 08 21:16:22 crc kubenswrapper[4669]: I1008 21:16:22.820239 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tnv4\" (UniqueName: \"kubernetes.io/projected/ffbc24f2-90eb-4f42-b2aa-0290921dbb79-kube-api-access-4tnv4\") on node \"crc\" DevicePath \"\"" Oct 08 21:16:22 crc kubenswrapper[4669]: I1008 21:16:22.820267 4669 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ffbc24f2-90eb-4f42-b2aa-0290921dbb79-inventory\") on node \"crc\" DevicePath \"\"" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.194481 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sxvrk" event={"ID":"ffbc24f2-90eb-4f42-b2aa-0290921dbb79","Type":"ContainerDied","Data":"508c8338de7beb3e3a74aa4c5ea405230f6892afaf4752699d7b00a158535515"} Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.194564 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="508c8338de7beb3e3a74aa4c5ea405230f6892afaf4752699d7b00a158535515" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.194669 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sxvrk" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.323896 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r"] Oct 08 21:16:23 crc kubenswrapper[4669]: E1008 21:16:23.324445 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffbc24f2-90eb-4f42-b2aa-0290921dbb79" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.324476 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffbc24f2-90eb-4f42-b2aa-0290921dbb79" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.324816 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffbc24f2-90eb-4f42-b2aa-0290921dbb79" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.326469 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.335292 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.335447 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.335983 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9d8p9" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.336047 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.335996 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.336153 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.336160 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.336185 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.340707 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r"] Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.436492 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-72m9r\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.436603 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/296bfc36-ce85-4db3-a692-acf7edf869b1-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-72m9r\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.436751 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/296bfc36-ce85-4db3-a692-acf7edf869b1-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-72m9r\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.436914 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/296bfc36-ce85-4db3-a692-acf7edf869b1-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-72m9r\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.436955 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-72m9r\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.436994 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-72m9r\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.437163 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-72m9r\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.437247 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-72m9r\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.437414 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-72m9r\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.437500 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/296bfc36-ce85-4db3-a692-acf7edf869b1-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-72m9r\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.437575 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-72m9r\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.437655 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-72m9r\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.437699 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ds7lx\" (UniqueName: 
\"kubernetes.io/projected/296bfc36-ce85-4db3-a692-acf7edf869b1-kube-api-access-ds7lx\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-72m9r\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.437942 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-72m9r\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.539291 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-72m9r\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.539617 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-72m9r\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.539644 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/296bfc36-ce85-4db3-a692-acf7edf869b1-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-72m9r\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.539683 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/296bfc36-ce85-4db3-a692-acf7edf869b1-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-72m9r\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.539712 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/296bfc36-ce85-4db3-a692-acf7edf869b1-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-72m9r\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.539732 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-72m9r\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.539751 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-72m9r\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.539781 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-72m9r\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.539804 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-72m9r\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.539833 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-72m9r\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.539868 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/296bfc36-ce85-4db3-a692-acf7edf869b1-openstack-edpm-ipam-telemetry-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-72m9r\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.539896 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-72m9r\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.539922 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-72m9r\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.539944 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ds7lx\" (UniqueName: \"kubernetes.io/projected/296bfc36-ce85-4db3-a692-acf7edf869b1-kube-api-access-ds7lx\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-72m9r\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.548518 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-72m9r\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r" Oct 08 21:16:23 crc 
kubenswrapper[4669]: I1008 21:16:23.549171 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/296bfc36-ce85-4db3-a692-acf7edf869b1-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-72m9r\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.549577 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/296bfc36-ce85-4db3-a692-acf7edf869b1-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-72m9r\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.549609 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-72m9r\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.549860 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-72m9r\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.549915 4669 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/296bfc36-ce85-4db3-a692-acf7edf869b1-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-72m9r\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.550247 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-72m9r\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.550857 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-72m9r\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.553380 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/296bfc36-ce85-4db3-a692-acf7edf869b1-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-72m9r\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.554095 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-72m9r\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.554694 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-72m9r\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.556241 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-72m9r\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.560465 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-72m9r\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.575212 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ds7lx\" (UniqueName: \"kubernetes.io/projected/296bfc36-ce85-4db3-a692-acf7edf869b1-kube-api-access-ds7lx\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-72m9r\" (UID: 
\"296bfc36-ce85-4db3-a692-acf7edf869b1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r" Oct 08 21:16:23 crc kubenswrapper[4669]: I1008 21:16:23.644786 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r" Oct 08 21:16:24 crc kubenswrapper[4669]: I1008 21:16:24.281382 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r"] Oct 08 21:16:24 crc kubenswrapper[4669]: W1008 21:16:24.291745 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod296bfc36_ce85_4db3_a692_acf7edf869b1.slice/crio-acc3ea6c8fa7d9b59a4e0c80025b239df45de2c3dd3c084203c7b44d63155288 WatchSource:0}: Error finding container acc3ea6c8fa7d9b59a4e0c80025b239df45de2c3dd3c084203c7b44d63155288: Status 404 returned error can't find the container with id acc3ea6c8fa7d9b59a4e0c80025b239df45de2c3dd3c084203c7b44d63155288 Oct 08 21:16:24 crc kubenswrapper[4669]: I1008 21:16:24.294140 4669 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 21:16:25 crc kubenswrapper[4669]: I1008 21:16:25.212664 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r" event={"ID":"296bfc36-ce85-4db3-a692-acf7edf869b1","Type":"ContainerStarted","Data":"390d53d371a314a43290e3d1a99e74784b118c96aed74ca747f9803fe5713527"} Oct 08 21:16:25 crc kubenswrapper[4669]: I1008 21:16:25.213239 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r" event={"ID":"296bfc36-ce85-4db3-a692-acf7edf869b1","Type":"ContainerStarted","Data":"acc3ea6c8fa7d9b59a4e0c80025b239df45de2c3dd3c084203c7b44d63155288"} Oct 08 21:16:25 crc kubenswrapper[4669]: I1008 21:16:25.244004 4669 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r" podStartSLOduration=1.721555577 podStartE2EDuration="2.243976143s" podCreationTimestamp="2025-10-08 21:16:23 +0000 UTC" firstStartedPulling="2025-10-08 21:16:24.293851442 +0000 UTC m=+1903.986662125" lastFinishedPulling="2025-10-08 21:16:24.816272018 +0000 UTC m=+1904.509082691" observedRunningTime="2025-10-08 21:16:25.231005053 +0000 UTC m=+1904.923815766" watchObservedRunningTime="2025-10-08 21:16:25.243976143 +0000 UTC m=+1904.936786856" Oct 08 21:16:43 crc kubenswrapper[4669]: I1008 21:16:43.191833 4669 patch_prober.go:28] interesting pod/machine-config-daemon-hw2kf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 21:16:43 crc kubenswrapper[4669]: I1008 21:16:43.192464 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 21:17:05 crc kubenswrapper[4669]: I1008 21:17:05.610565 4669 generic.go:334] "Generic (PLEG): container finished" podID="296bfc36-ce85-4db3-a692-acf7edf869b1" containerID="390d53d371a314a43290e3d1a99e74784b118c96aed74ca747f9803fe5713527" exitCode=0 Oct 08 21:17:05 crc kubenswrapper[4669]: I1008 21:17:05.610849 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r" event={"ID":"296bfc36-ce85-4db3-a692-acf7edf869b1","Type":"ContainerDied","Data":"390d53d371a314a43290e3d1a99e74784b118c96aed74ca747f9803fe5713527"} Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.059011 4669 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r" Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.104952 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/296bfc36-ce85-4db3-a692-acf7edf869b1-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"296bfc36-ce85-4db3-a692-acf7edf869b1\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.105004 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-ssh-key\") pod \"296bfc36-ce85-4db3-a692-acf7edf869b1\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.105046 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-ovn-combined-ca-bundle\") pod \"296bfc36-ce85-4db3-a692-acf7edf869b1\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.105075 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-bootstrap-combined-ca-bundle\") pod \"296bfc36-ce85-4db3-a692-acf7edf869b1\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.105143 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-libvirt-combined-ca-bundle\") pod \"296bfc36-ce85-4db3-a692-acf7edf869b1\" (UID: 
\"296bfc36-ce85-4db3-a692-acf7edf869b1\") " Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.105180 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-repo-setup-combined-ca-bundle\") pod \"296bfc36-ce85-4db3-a692-acf7edf869b1\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.105215 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-nova-combined-ca-bundle\") pod \"296bfc36-ce85-4db3-a692-acf7edf869b1\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.105253 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/296bfc36-ce85-4db3-a692-acf7edf869b1-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"296bfc36-ce85-4db3-a692-acf7edf869b1\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.105285 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ds7lx\" (UniqueName: \"kubernetes.io/projected/296bfc36-ce85-4db3-a692-acf7edf869b1-kube-api-access-ds7lx\") pod \"296bfc36-ce85-4db3-a692-acf7edf869b1\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.105323 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-inventory\") pod \"296bfc36-ce85-4db3-a692-acf7edf869b1\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.105365 
4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/296bfc36-ce85-4db3-a692-acf7edf869b1-openstack-edpm-ipam-ovn-default-certs-0\") pod \"296bfc36-ce85-4db3-a692-acf7edf869b1\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.105402 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/296bfc36-ce85-4db3-a692-acf7edf869b1-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"296bfc36-ce85-4db3-a692-acf7edf869b1\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.105435 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-telemetry-combined-ca-bundle\") pod \"296bfc36-ce85-4db3-a692-acf7edf869b1\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.105475 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-neutron-metadata-combined-ca-bundle\") pod \"296bfc36-ce85-4db3-a692-acf7edf869b1\" (UID: \"296bfc36-ce85-4db3-a692-acf7edf869b1\") " Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.113510 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/296bfc36-ce85-4db3-a692-acf7edf869b1-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "296bfc36-ce85-4db3-a692-acf7edf869b1" (UID: "296bfc36-ce85-4db3-a692-acf7edf869b1"). 
InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.115735 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "296bfc36-ce85-4db3-a692-acf7edf869b1" (UID: "296bfc36-ce85-4db3-a692-acf7edf869b1"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.115742 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/296bfc36-ce85-4db3-a692-acf7edf869b1-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "296bfc36-ce85-4db3-a692-acf7edf869b1" (UID: "296bfc36-ce85-4db3-a692-acf7edf869b1"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.115762 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/296bfc36-ce85-4db3-a692-acf7edf869b1-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "296bfc36-ce85-4db3-a692-acf7edf869b1" (UID: "296bfc36-ce85-4db3-a692-acf7edf869b1"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.115833 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "296bfc36-ce85-4db3-a692-acf7edf869b1" (UID: "296bfc36-ce85-4db3-a692-acf7edf869b1"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.115884 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "296bfc36-ce85-4db3-a692-acf7edf869b1" (UID: "296bfc36-ce85-4db3-a692-acf7edf869b1"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.116445 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "296bfc36-ce85-4db3-a692-acf7edf869b1" (UID: "296bfc36-ce85-4db3-a692-acf7edf869b1"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.117066 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/296bfc36-ce85-4db3-a692-acf7edf869b1-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "296bfc36-ce85-4db3-a692-acf7edf869b1" (UID: "296bfc36-ce85-4db3-a692-acf7edf869b1"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.117972 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "296bfc36-ce85-4db3-a692-acf7edf869b1" (UID: "296bfc36-ce85-4db3-a692-acf7edf869b1"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.118409 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "296bfc36-ce85-4db3-a692-acf7edf869b1" (UID: "296bfc36-ce85-4db3-a692-acf7edf869b1"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.119306 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "296bfc36-ce85-4db3-a692-acf7edf869b1" (UID: "296bfc36-ce85-4db3-a692-acf7edf869b1"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.119734 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/296bfc36-ce85-4db3-a692-acf7edf869b1-kube-api-access-ds7lx" (OuterVolumeSpecName: "kube-api-access-ds7lx") pod "296bfc36-ce85-4db3-a692-acf7edf869b1" (UID: "296bfc36-ce85-4db3-a692-acf7edf869b1"). InnerVolumeSpecName "kube-api-access-ds7lx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.149771 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-inventory" (OuterVolumeSpecName: "inventory") pod "296bfc36-ce85-4db3-a692-acf7edf869b1" (UID: "296bfc36-ce85-4db3-a692-acf7edf869b1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.151214 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "296bfc36-ce85-4db3-a692-acf7edf869b1" (UID: "296bfc36-ce85-4db3-a692-acf7edf869b1"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.210147 4669 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-inventory\") on node \"crc\" DevicePath \"\"" Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.210228 4669 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/296bfc36-ce85-4db3-a692-acf7edf869b1-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.210260 4669 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/296bfc36-ce85-4db3-a692-acf7edf869b1-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.210293 4669 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.210324 4669 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.210357 4669 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/296bfc36-ce85-4db3-a692-acf7edf869b1-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.210387 4669 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.210412 4669 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.210441 4669 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.210469 4669 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.210495 4669 reconciler_common.go:293] "Volume detached for volume 
\"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.210525 4669 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296bfc36-ce85-4db3-a692-acf7edf869b1-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.210585 4669 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/296bfc36-ce85-4db3-a692-acf7edf869b1-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.210615 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ds7lx\" (UniqueName: \"kubernetes.io/projected/296bfc36-ce85-4db3-a692-acf7edf869b1-kube-api-access-ds7lx\") on node \"crc\" DevicePath \"\"" Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.638612 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r" event={"ID":"296bfc36-ce85-4db3-a692-acf7edf869b1","Type":"ContainerDied","Data":"acc3ea6c8fa7d9b59a4e0c80025b239df45de2c3dd3c084203c7b44d63155288"} Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.639134 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acc3ea6c8fa7d9b59a4e0c80025b239df45de2c3dd3c084203c7b44d63155288" Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.638800 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-72m9r" Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.760317 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-nlnn5"] Oct 08 21:17:07 crc kubenswrapper[4669]: E1008 21:17:07.760792 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="296bfc36-ce85-4db3-a692-acf7edf869b1" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.760816 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="296bfc36-ce85-4db3-a692-acf7edf869b1" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.761060 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="296bfc36-ce85-4db3-a692-acf7edf869b1" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.761753 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nlnn5" Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.767868 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.768118 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9d8p9" Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.768264 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.768415 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.768635 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.790307 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-nlnn5"] Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.824415 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eae3dd15-c997-43e0-8362-8a9210634436-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nlnn5\" (UID: \"eae3dd15-c997-43e0-8362-8a9210634436\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nlnn5" Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.824569 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slkdh\" (UniqueName: \"kubernetes.io/projected/eae3dd15-c997-43e0-8362-8a9210634436-kube-api-access-slkdh\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nlnn5\" (UID: \"eae3dd15-c997-43e0-8362-8a9210634436\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nlnn5" Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.824776 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/eae3dd15-c997-43e0-8362-8a9210634436-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nlnn5\" (UID: \"eae3dd15-c997-43e0-8362-8a9210634436\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nlnn5" Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.824827 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eae3dd15-c997-43e0-8362-8a9210634436-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nlnn5\" (UID: \"eae3dd15-c997-43e0-8362-8a9210634436\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nlnn5" Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.824855 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eae3dd15-c997-43e0-8362-8a9210634436-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nlnn5\" (UID: \"eae3dd15-c997-43e0-8362-8a9210634436\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nlnn5" Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.926571 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eae3dd15-c997-43e0-8362-8a9210634436-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nlnn5\" (UID: \"eae3dd15-c997-43e0-8362-8a9210634436\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nlnn5" Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.926693 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slkdh\" (UniqueName: 
\"kubernetes.io/projected/eae3dd15-c997-43e0-8362-8a9210634436-kube-api-access-slkdh\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nlnn5\" (UID: \"eae3dd15-c997-43e0-8362-8a9210634436\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nlnn5" Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.926801 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/eae3dd15-c997-43e0-8362-8a9210634436-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nlnn5\" (UID: \"eae3dd15-c997-43e0-8362-8a9210634436\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nlnn5" Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.926826 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eae3dd15-c997-43e0-8362-8a9210634436-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nlnn5\" (UID: \"eae3dd15-c997-43e0-8362-8a9210634436\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nlnn5" Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.926853 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eae3dd15-c997-43e0-8362-8a9210634436-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nlnn5\" (UID: \"eae3dd15-c997-43e0-8362-8a9210634436\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nlnn5" Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.928206 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/eae3dd15-c997-43e0-8362-8a9210634436-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nlnn5\" (UID: \"eae3dd15-c997-43e0-8362-8a9210634436\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nlnn5" Oct 08 21:17:07 crc 
kubenswrapper[4669]: I1008 21:17:07.931194 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eae3dd15-c997-43e0-8362-8a9210634436-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nlnn5\" (UID: \"eae3dd15-c997-43e0-8362-8a9210634436\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nlnn5" Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.931554 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eae3dd15-c997-43e0-8362-8a9210634436-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nlnn5\" (UID: \"eae3dd15-c997-43e0-8362-8a9210634436\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nlnn5" Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.931918 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eae3dd15-c997-43e0-8362-8a9210634436-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nlnn5\" (UID: \"eae3dd15-c997-43e0-8362-8a9210634436\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nlnn5" Oct 08 21:17:07 crc kubenswrapper[4669]: I1008 21:17:07.951124 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slkdh\" (UniqueName: \"kubernetes.io/projected/eae3dd15-c997-43e0-8362-8a9210634436-kube-api-access-slkdh\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nlnn5\" (UID: \"eae3dd15-c997-43e0-8362-8a9210634436\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nlnn5" Oct 08 21:17:08 crc kubenswrapper[4669]: I1008 21:17:08.086065 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nlnn5" Oct 08 21:17:08 crc kubenswrapper[4669]: I1008 21:17:08.651045 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-nlnn5"] Oct 08 21:17:09 crc kubenswrapper[4669]: I1008 21:17:09.660990 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nlnn5" event={"ID":"eae3dd15-c997-43e0-8362-8a9210634436","Type":"ContainerStarted","Data":"110237a500a5e41f298f00cd3bd7c0591dfc07ab89195bc896e5103959a06721"} Oct 08 21:17:10 crc kubenswrapper[4669]: I1008 21:17:10.675370 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nlnn5" event={"ID":"eae3dd15-c997-43e0-8362-8a9210634436","Type":"ContainerStarted","Data":"e4fa40e6ac574c9f8783220b7ded66303293b308ba3ef8efd46933c3ed6c333f"} Oct 08 21:17:10 crc kubenswrapper[4669]: I1008 21:17:10.700257 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nlnn5" podStartSLOduration=2.324384605 podStartE2EDuration="3.700228226s" podCreationTimestamp="2025-10-08 21:17:07 +0000 UTC" firstStartedPulling="2025-10-08 21:17:08.663430356 +0000 UTC m=+1948.356241029" lastFinishedPulling="2025-10-08 21:17:10.039273937 +0000 UTC m=+1949.732084650" observedRunningTime="2025-10-08 21:17:10.691710338 +0000 UTC m=+1950.384521011" watchObservedRunningTime="2025-10-08 21:17:10.700228226 +0000 UTC m=+1950.393038939" Oct 08 21:17:13 crc kubenswrapper[4669]: I1008 21:17:13.185920 4669 patch_prober.go:28] interesting pod/machine-config-daemon-hw2kf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 21:17:13 crc kubenswrapper[4669]: I1008 21:17:13.186346 4669 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 21:17:43 crc kubenswrapper[4669]: I1008 21:17:43.185818 4669 patch_prober.go:28] interesting pod/machine-config-daemon-hw2kf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 21:17:43 crc kubenswrapper[4669]: I1008 21:17:43.186737 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 21:17:43 crc kubenswrapper[4669]: I1008 21:17:43.186826 4669 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" Oct 08 21:17:43 crc kubenswrapper[4669]: I1008 21:17:43.188264 4669 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6d65055a3906618fe836fadf5c57a56b63f93978b81babdb5d1fe796cee4cdf9"} pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 21:17:43 crc kubenswrapper[4669]: I1008 21:17:43.188402 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" 
containerName="machine-config-daemon" containerID="cri-o://6d65055a3906618fe836fadf5c57a56b63f93978b81babdb5d1fe796cee4cdf9" gracePeriod=600 Oct 08 21:17:44 crc kubenswrapper[4669]: I1008 21:17:44.029246 4669 generic.go:334] "Generic (PLEG): container finished" podID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerID="6d65055a3906618fe836fadf5c57a56b63f93978b81babdb5d1fe796cee4cdf9" exitCode=0 Oct 08 21:17:44 crc kubenswrapper[4669]: I1008 21:17:44.029343 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" event={"ID":"39c9bcf2-9580-4534-8c7e-886bd4aff469","Type":"ContainerDied","Data":"6d65055a3906618fe836fadf5c57a56b63f93978b81babdb5d1fe796cee4cdf9"} Oct 08 21:17:44 crc kubenswrapper[4669]: I1008 21:17:44.029805 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" event={"ID":"39c9bcf2-9580-4534-8c7e-886bd4aff469","Type":"ContainerStarted","Data":"60acc7dbb1e53bf0e8cd1ae99c5c7aa8ca0955f227930c93bc185c254c21dc89"} Oct 08 21:17:44 crc kubenswrapper[4669]: I1008 21:17:44.029825 4669 scope.go:117] "RemoveContainer" containerID="92ec41f270c02e5372a81a61c641b036c617c32af9093736d30bfad2ba880074" Oct 08 21:18:17 crc kubenswrapper[4669]: I1008 21:18:17.381500 4669 generic.go:334] "Generic (PLEG): container finished" podID="eae3dd15-c997-43e0-8362-8a9210634436" containerID="e4fa40e6ac574c9f8783220b7ded66303293b308ba3ef8efd46933c3ed6c333f" exitCode=0 Oct 08 21:18:17 crc kubenswrapper[4669]: I1008 21:18:17.381689 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nlnn5" event={"ID":"eae3dd15-c997-43e0-8362-8a9210634436","Type":"ContainerDied","Data":"e4fa40e6ac574c9f8783220b7ded66303293b308ba3ef8efd46933c3ed6c333f"} Oct 08 21:18:18 crc kubenswrapper[4669]: I1008 21:18:18.906154 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nlnn5" Oct 08 21:18:19 crc kubenswrapper[4669]: I1008 21:18:19.084951 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eae3dd15-c997-43e0-8362-8a9210634436-ovn-combined-ca-bundle\") pod \"eae3dd15-c997-43e0-8362-8a9210634436\" (UID: \"eae3dd15-c997-43e0-8362-8a9210634436\") " Oct 08 21:18:19 crc kubenswrapper[4669]: I1008 21:18:19.085085 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slkdh\" (UniqueName: \"kubernetes.io/projected/eae3dd15-c997-43e0-8362-8a9210634436-kube-api-access-slkdh\") pod \"eae3dd15-c997-43e0-8362-8a9210634436\" (UID: \"eae3dd15-c997-43e0-8362-8a9210634436\") " Oct 08 21:18:19 crc kubenswrapper[4669]: I1008 21:18:19.085206 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eae3dd15-c997-43e0-8362-8a9210634436-ssh-key\") pod \"eae3dd15-c997-43e0-8362-8a9210634436\" (UID: \"eae3dd15-c997-43e0-8362-8a9210634436\") " Oct 08 21:18:19 crc kubenswrapper[4669]: I1008 21:18:19.085235 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eae3dd15-c997-43e0-8362-8a9210634436-inventory\") pod \"eae3dd15-c997-43e0-8362-8a9210634436\" (UID: \"eae3dd15-c997-43e0-8362-8a9210634436\") " Oct 08 21:18:19 crc kubenswrapper[4669]: I1008 21:18:19.085313 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/eae3dd15-c997-43e0-8362-8a9210634436-ovncontroller-config-0\") pod \"eae3dd15-c997-43e0-8362-8a9210634436\" (UID: \"eae3dd15-c997-43e0-8362-8a9210634436\") " Oct 08 21:18:19 crc kubenswrapper[4669]: I1008 21:18:19.091340 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/secret/eae3dd15-c997-43e0-8362-8a9210634436-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "eae3dd15-c997-43e0-8362-8a9210634436" (UID: "eae3dd15-c997-43e0-8362-8a9210634436"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:18:19 crc kubenswrapper[4669]: I1008 21:18:19.108707 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eae3dd15-c997-43e0-8362-8a9210634436-kube-api-access-slkdh" (OuterVolumeSpecName: "kube-api-access-slkdh") pod "eae3dd15-c997-43e0-8362-8a9210634436" (UID: "eae3dd15-c997-43e0-8362-8a9210634436"). InnerVolumeSpecName "kube-api-access-slkdh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:18:19 crc kubenswrapper[4669]: I1008 21:18:19.113041 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eae3dd15-c997-43e0-8362-8a9210634436-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "eae3dd15-c997-43e0-8362-8a9210634436" (UID: "eae3dd15-c997-43e0-8362-8a9210634436"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:18:19 crc kubenswrapper[4669]: I1008 21:18:19.115790 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eae3dd15-c997-43e0-8362-8a9210634436-inventory" (OuterVolumeSpecName: "inventory") pod "eae3dd15-c997-43e0-8362-8a9210634436" (UID: "eae3dd15-c997-43e0-8362-8a9210634436"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:18:19 crc kubenswrapper[4669]: I1008 21:18:19.126728 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eae3dd15-c997-43e0-8362-8a9210634436-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "eae3dd15-c997-43e0-8362-8a9210634436" (UID: "eae3dd15-c997-43e0-8362-8a9210634436"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:18:19 crc kubenswrapper[4669]: I1008 21:18:19.187226 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slkdh\" (UniqueName: \"kubernetes.io/projected/eae3dd15-c997-43e0-8362-8a9210634436-kube-api-access-slkdh\") on node \"crc\" DevicePath \"\"" Oct 08 21:18:19 crc kubenswrapper[4669]: I1008 21:18:19.187258 4669 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eae3dd15-c997-43e0-8362-8a9210634436-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 08 21:18:19 crc kubenswrapper[4669]: I1008 21:18:19.187270 4669 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eae3dd15-c997-43e0-8362-8a9210634436-inventory\") on node \"crc\" DevicePath \"\"" Oct 08 21:18:19 crc kubenswrapper[4669]: I1008 21:18:19.187282 4669 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/eae3dd15-c997-43e0-8362-8a9210634436-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Oct 08 21:18:19 crc kubenswrapper[4669]: I1008 21:18:19.187295 4669 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eae3dd15-c997-43e0-8362-8a9210634436-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:18:19 crc kubenswrapper[4669]: I1008 21:18:19.400163 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nlnn5" event={"ID":"eae3dd15-c997-43e0-8362-8a9210634436","Type":"ContainerDied","Data":"110237a500a5e41f298f00cd3bd7c0591dfc07ab89195bc896e5103959a06721"} Oct 08 21:18:19 crc kubenswrapper[4669]: I1008 21:18:19.400207 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="110237a500a5e41f298f00cd3bd7c0591dfc07ab89195bc896e5103959a06721" Oct 08 21:18:19 crc kubenswrapper[4669]: I1008 21:18:19.400750 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nlnn5" Oct 08 21:18:19 crc kubenswrapper[4669]: I1008 21:18:19.489281 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-954pg"] Oct 08 21:18:19 crc kubenswrapper[4669]: E1008 21:18:19.489681 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eae3dd15-c997-43e0-8362-8a9210634436" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 08 21:18:19 crc kubenswrapper[4669]: I1008 21:18:19.489694 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="eae3dd15-c997-43e0-8362-8a9210634436" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 08 21:18:19 crc kubenswrapper[4669]: I1008 21:18:19.489894 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="eae3dd15-c997-43e0-8362-8a9210634436" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 08 21:18:19 crc kubenswrapper[4669]: I1008 21:18:19.490603 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-954pg" Oct 08 21:18:19 crc kubenswrapper[4669]: I1008 21:18:19.493094 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Oct 08 21:18:19 crc kubenswrapper[4669]: I1008 21:18:19.493180 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Oct 08 21:18:19 crc kubenswrapper[4669]: I1008 21:18:19.493286 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9d8p9" Oct 08 21:18:19 crc kubenswrapper[4669]: I1008 21:18:19.493304 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 08 21:18:19 crc kubenswrapper[4669]: I1008 21:18:19.493704 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 08 21:18:19 crc kubenswrapper[4669]: I1008 21:18:19.493756 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 08 21:18:19 crc kubenswrapper[4669]: I1008 21:18:19.503469 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-954pg"] Oct 08 21:18:19 crc kubenswrapper[4669]: I1008 21:18:19.594275 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2c0c5d80-cf44-45bb-847d-839dc3fd8887-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-954pg\" (UID: \"2c0c5d80-cf44-45bb-847d-839dc3fd8887\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-954pg" Oct 08 21:18:19 crc kubenswrapper[4669]: I1008 21:18:19.594331 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/2c0c5d80-cf44-45bb-847d-839dc3fd8887-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-954pg\" (UID: \"2c0c5d80-cf44-45bb-847d-839dc3fd8887\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-954pg" Oct 08 21:18:19 crc kubenswrapper[4669]: I1008 21:18:19.594371 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddw6k\" (UniqueName: \"kubernetes.io/projected/2c0c5d80-cf44-45bb-847d-839dc3fd8887-kube-api-access-ddw6k\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-954pg\" (UID: \"2c0c5d80-cf44-45bb-847d-839dc3fd8887\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-954pg" Oct 08 21:18:19 crc kubenswrapper[4669]: I1008 21:18:19.594405 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c0c5d80-cf44-45bb-847d-839dc3fd8887-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-954pg\" (UID: \"2c0c5d80-cf44-45bb-847d-839dc3fd8887\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-954pg" Oct 08 21:18:19 crc kubenswrapper[4669]: I1008 21:18:19.594427 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2c0c5d80-cf44-45bb-847d-839dc3fd8887-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-954pg\" (UID: \"2c0c5d80-cf44-45bb-847d-839dc3fd8887\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-954pg" Oct 08 21:18:19 crc kubenswrapper[4669]: I1008 21:18:19.594518 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" 
(UniqueName: \"kubernetes.io/secret/2c0c5d80-cf44-45bb-847d-839dc3fd8887-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-954pg\" (UID: \"2c0c5d80-cf44-45bb-847d-839dc3fd8887\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-954pg" Oct 08 21:18:19 crc kubenswrapper[4669]: I1008 21:18:19.695739 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2c0c5d80-cf44-45bb-847d-839dc3fd8887-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-954pg\" (UID: \"2c0c5d80-cf44-45bb-847d-839dc3fd8887\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-954pg" Oct 08 21:18:19 crc kubenswrapper[4669]: I1008 21:18:19.695803 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2c0c5d80-cf44-45bb-847d-839dc3fd8887-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-954pg\" (UID: \"2c0c5d80-cf44-45bb-847d-839dc3fd8887\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-954pg" Oct 08 21:18:19 crc kubenswrapper[4669]: I1008 21:18:19.695841 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c0c5d80-cf44-45bb-847d-839dc3fd8887-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-954pg\" (UID: \"2c0c5d80-cf44-45bb-847d-839dc3fd8887\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-954pg" Oct 08 21:18:19 crc kubenswrapper[4669]: I1008 21:18:19.695873 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddw6k\" (UniqueName: \"kubernetes.io/projected/2c0c5d80-cf44-45bb-847d-839dc3fd8887-kube-api-access-ddw6k\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-954pg\" (UID: 
\"2c0c5d80-cf44-45bb-847d-839dc3fd8887\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-954pg" Oct 08 21:18:19 crc kubenswrapper[4669]: I1008 21:18:19.695905 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c0c5d80-cf44-45bb-847d-839dc3fd8887-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-954pg\" (UID: \"2c0c5d80-cf44-45bb-847d-839dc3fd8887\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-954pg" Oct 08 21:18:19 crc kubenswrapper[4669]: I1008 21:18:19.695931 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2c0c5d80-cf44-45bb-847d-839dc3fd8887-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-954pg\" (UID: \"2c0c5d80-cf44-45bb-847d-839dc3fd8887\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-954pg" Oct 08 21:18:19 crc kubenswrapper[4669]: I1008 21:18:19.700644 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c0c5d80-cf44-45bb-847d-839dc3fd8887-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-954pg\" (UID: \"2c0c5d80-cf44-45bb-847d-839dc3fd8887\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-954pg" Oct 08 21:18:19 crc kubenswrapper[4669]: I1008 21:18:19.703305 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2c0c5d80-cf44-45bb-847d-839dc3fd8887-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-954pg\" (UID: \"2c0c5d80-cf44-45bb-847d-839dc3fd8887\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-954pg" Oct 08 21:18:19 crc kubenswrapper[4669]: I1008 21:18:19.705898 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2c0c5d80-cf44-45bb-847d-839dc3fd8887-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-954pg\" (UID: \"2c0c5d80-cf44-45bb-847d-839dc3fd8887\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-954pg" Oct 08 21:18:19 crc kubenswrapper[4669]: I1008 21:18:19.709037 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2c0c5d80-cf44-45bb-847d-839dc3fd8887-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-954pg\" (UID: \"2c0c5d80-cf44-45bb-847d-839dc3fd8887\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-954pg" Oct 08 21:18:19 crc kubenswrapper[4669]: I1008 21:18:19.712192 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c0c5d80-cf44-45bb-847d-839dc3fd8887-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-954pg\" (UID: \"2c0c5d80-cf44-45bb-847d-839dc3fd8887\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-954pg" Oct 08 21:18:19 crc kubenswrapper[4669]: I1008 21:18:19.722125 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddw6k\" (UniqueName: \"kubernetes.io/projected/2c0c5d80-cf44-45bb-847d-839dc3fd8887-kube-api-access-ddw6k\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-954pg\" (UID: \"2c0c5d80-cf44-45bb-847d-839dc3fd8887\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-954pg" Oct 08 21:18:19 crc kubenswrapper[4669]: I1008 21:18:19.813554 4669 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-954pg" Oct 08 21:18:20 crc kubenswrapper[4669]: I1008 21:18:20.190538 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-954pg"] Oct 08 21:18:20 crc kubenswrapper[4669]: I1008 21:18:20.409506 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-954pg" event={"ID":"2c0c5d80-cf44-45bb-847d-839dc3fd8887","Type":"ContainerStarted","Data":"422ae666609e55b5286487f906beb205024568a8d64ae92fbb284ec91cd7a55d"} Oct 08 21:18:21 crc kubenswrapper[4669]: I1008 21:18:21.423991 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-954pg" event={"ID":"2c0c5d80-cf44-45bb-847d-839dc3fd8887","Type":"ContainerStarted","Data":"c8690b8cbbbf217aa4e5125ca360797481b955e6375eeabbd947d805865cd1b6"} Oct 08 21:18:21 crc kubenswrapper[4669]: I1008 21:18:21.466513 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-954pg" podStartSLOduration=1.7786214949999999 podStartE2EDuration="2.466491696s" podCreationTimestamp="2025-10-08 21:18:19 +0000 UTC" firstStartedPulling="2025-10-08 21:18:20.197946516 +0000 UTC m=+2019.890757189" lastFinishedPulling="2025-10-08 21:18:20.885816707 +0000 UTC m=+2020.578627390" observedRunningTime="2025-10-08 21:18:21.44918626 +0000 UTC m=+2021.141996973" watchObservedRunningTime="2025-10-08 21:18:21.466491696 +0000 UTC m=+2021.159302379" Oct 08 21:19:06 crc kubenswrapper[4669]: I1008 21:19:06.554909 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4w5jl"] Oct 08 21:19:06 crc kubenswrapper[4669]: I1008 21:19:06.558284 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4w5jl" Oct 08 21:19:06 crc kubenswrapper[4669]: I1008 21:19:06.567005 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr76m\" (UniqueName: \"kubernetes.io/projected/ae6c5190-e7d0-4384-910d-620da4746aa0-kube-api-access-cr76m\") pod \"certified-operators-4w5jl\" (UID: \"ae6c5190-e7d0-4384-910d-620da4746aa0\") " pod="openshift-marketplace/certified-operators-4w5jl" Oct 08 21:19:06 crc kubenswrapper[4669]: I1008 21:19:06.567060 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae6c5190-e7d0-4384-910d-620da4746aa0-utilities\") pod \"certified-operators-4w5jl\" (UID: \"ae6c5190-e7d0-4384-910d-620da4746aa0\") " pod="openshift-marketplace/certified-operators-4w5jl" Oct 08 21:19:06 crc kubenswrapper[4669]: I1008 21:19:06.567291 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae6c5190-e7d0-4384-910d-620da4746aa0-catalog-content\") pod \"certified-operators-4w5jl\" (UID: \"ae6c5190-e7d0-4384-910d-620da4746aa0\") " pod="openshift-marketplace/certified-operators-4w5jl" Oct 08 21:19:06 crc kubenswrapper[4669]: I1008 21:19:06.575464 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4w5jl"] Oct 08 21:19:06 crc kubenswrapper[4669]: I1008 21:19:06.669307 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cr76m\" (UniqueName: \"kubernetes.io/projected/ae6c5190-e7d0-4384-910d-620da4746aa0-kube-api-access-cr76m\") pod \"certified-operators-4w5jl\" (UID: \"ae6c5190-e7d0-4384-910d-620da4746aa0\") " pod="openshift-marketplace/certified-operators-4w5jl" Oct 08 21:19:06 crc kubenswrapper[4669]: I1008 21:19:06.669365 4669 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae6c5190-e7d0-4384-910d-620da4746aa0-utilities\") pod \"certified-operators-4w5jl\" (UID: \"ae6c5190-e7d0-4384-910d-620da4746aa0\") " pod="openshift-marketplace/certified-operators-4w5jl" Oct 08 21:19:06 crc kubenswrapper[4669]: I1008 21:19:06.669462 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae6c5190-e7d0-4384-910d-620da4746aa0-catalog-content\") pod \"certified-operators-4w5jl\" (UID: \"ae6c5190-e7d0-4384-910d-620da4746aa0\") " pod="openshift-marketplace/certified-operators-4w5jl" Oct 08 21:19:06 crc kubenswrapper[4669]: I1008 21:19:06.670008 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae6c5190-e7d0-4384-910d-620da4746aa0-catalog-content\") pod \"certified-operators-4w5jl\" (UID: \"ae6c5190-e7d0-4384-910d-620da4746aa0\") " pod="openshift-marketplace/certified-operators-4w5jl" Oct 08 21:19:06 crc kubenswrapper[4669]: I1008 21:19:06.670037 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae6c5190-e7d0-4384-910d-620da4746aa0-utilities\") pod \"certified-operators-4w5jl\" (UID: \"ae6c5190-e7d0-4384-910d-620da4746aa0\") " pod="openshift-marketplace/certified-operators-4w5jl" Oct 08 21:19:06 crc kubenswrapper[4669]: I1008 21:19:06.699664 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr76m\" (UniqueName: \"kubernetes.io/projected/ae6c5190-e7d0-4384-910d-620da4746aa0-kube-api-access-cr76m\") pod \"certified-operators-4w5jl\" (UID: \"ae6c5190-e7d0-4384-910d-620da4746aa0\") " pod="openshift-marketplace/certified-operators-4w5jl" Oct 08 21:19:06 crc kubenswrapper[4669]: I1008 21:19:06.893639 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4w5jl" Oct 08 21:19:07 crc kubenswrapper[4669]: I1008 21:19:07.435820 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4w5jl"] Oct 08 21:19:07 crc kubenswrapper[4669]: W1008 21:19:07.437363 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae6c5190_e7d0_4384_910d_620da4746aa0.slice/crio-9481daa89a1cec5d467d99e531e27eebd1b99408c5c5f7dbbb90ac6b3ae775e0 WatchSource:0}: Error finding container 9481daa89a1cec5d467d99e531e27eebd1b99408c5c5f7dbbb90ac6b3ae775e0: Status 404 returned error can't find the container with id 9481daa89a1cec5d467d99e531e27eebd1b99408c5c5f7dbbb90ac6b3ae775e0 Oct 08 21:19:07 crc kubenswrapper[4669]: I1008 21:19:07.858320 4669 generic.go:334] "Generic (PLEG): container finished" podID="ae6c5190-e7d0-4384-910d-620da4746aa0" containerID="fb448e004819eb5a7100319e222789dbce09b29285fa85a67212efccf3575970" exitCode=0 Oct 08 21:19:07 crc kubenswrapper[4669]: I1008 21:19:07.858415 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4w5jl" event={"ID":"ae6c5190-e7d0-4384-910d-620da4746aa0","Type":"ContainerDied","Data":"fb448e004819eb5a7100319e222789dbce09b29285fa85a67212efccf3575970"} Oct 08 21:19:07 crc kubenswrapper[4669]: I1008 21:19:07.858648 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4w5jl" event={"ID":"ae6c5190-e7d0-4384-910d-620da4746aa0","Type":"ContainerStarted","Data":"9481daa89a1cec5d467d99e531e27eebd1b99408c5c5f7dbbb90ac6b3ae775e0"} Oct 08 21:19:10 crc kubenswrapper[4669]: I1008 21:19:10.884599 4669 generic.go:334] "Generic (PLEG): container finished" podID="2c0c5d80-cf44-45bb-847d-839dc3fd8887" containerID="c8690b8cbbbf217aa4e5125ca360797481b955e6375eeabbd947d805865cd1b6" exitCode=0 Oct 08 21:19:10 crc kubenswrapper[4669]: I1008 
21:19:10.884668 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-954pg" event={"ID":"2c0c5d80-cf44-45bb-847d-839dc3fd8887","Type":"ContainerDied","Data":"c8690b8cbbbf217aa4e5125ca360797481b955e6375eeabbd947d805865cd1b6"} Oct 08 21:19:12 crc kubenswrapper[4669]: I1008 21:19:12.586226 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-954pg" Oct 08 21:19:12 crc kubenswrapper[4669]: I1008 21:19:12.685362 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2c0c5d80-cf44-45bb-847d-839dc3fd8887-neutron-ovn-metadata-agent-neutron-config-0\") pod \"2c0c5d80-cf44-45bb-847d-839dc3fd8887\" (UID: \"2c0c5d80-cf44-45bb-847d-839dc3fd8887\") " Oct 08 21:19:12 crc kubenswrapper[4669]: I1008 21:19:12.685420 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2c0c5d80-cf44-45bb-847d-839dc3fd8887-nova-metadata-neutron-config-0\") pod \"2c0c5d80-cf44-45bb-847d-839dc3fd8887\" (UID: \"2c0c5d80-cf44-45bb-847d-839dc3fd8887\") " Oct 08 21:19:12 crc kubenswrapper[4669]: I1008 21:19:12.685470 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c0c5d80-cf44-45bb-847d-839dc3fd8887-inventory\") pod \"2c0c5d80-cf44-45bb-847d-839dc3fd8887\" (UID: \"2c0c5d80-cf44-45bb-847d-839dc3fd8887\") " Oct 08 21:19:12 crc kubenswrapper[4669]: I1008 21:19:12.685561 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2c0c5d80-cf44-45bb-847d-839dc3fd8887-ssh-key\") pod \"2c0c5d80-cf44-45bb-847d-839dc3fd8887\" (UID: \"2c0c5d80-cf44-45bb-847d-839dc3fd8887\") " Oct 08 
21:19:12 crc kubenswrapper[4669]: I1008 21:19:12.685606 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c0c5d80-cf44-45bb-847d-839dc3fd8887-neutron-metadata-combined-ca-bundle\") pod \"2c0c5d80-cf44-45bb-847d-839dc3fd8887\" (UID: \"2c0c5d80-cf44-45bb-847d-839dc3fd8887\") " Oct 08 21:19:12 crc kubenswrapper[4669]: I1008 21:19:12.685697 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddw6k\" (UniqueName: \"kubernetes.io/projected/2c0c5d80-cf44-45bb-847d-839dc3fd8887-kube-api-access-ddw6k\") pod \"2c0c5d80-cf44-45bb-847d-839dc3fd8887\" (UID: \"2c0c5d80-cf44-45bb-847d-839dc3fd8887\") " Oct 08 21:19:12 crc kubenswrapper[4669]: I1008 21:19:12.690948 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c0c5d80-cf44-45bb-847d-839dc3fd8887-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "2c0c5d80-cf44-45bb-847d-839dc3fd8887" (UID: "2c0c5d80-cf44-45bb-847d-839dc3fd8887"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:19:12 crc kubenswrapper[4669]: I1008 21:19:12.691162 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c0c5d80-cf44-45bb-847d-839dc3fd8887-kube-api-access-ddw6k" (OuterVolumeSpecName: "kube-api-access-ddw6k") pod "2c0c5d80-cf44-45bb-847d-839dc3fd8887" (UID: "2c0c5d80-cf44-45bb-847d-839dc3fd8887"). InnerVolumeSpecName "kube-api-access-ddw6k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:19:12 crc kubenswrapper[4669]: I1008 21:19:12.716344 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c0c5d80-cf44-45bb-847d-839dc3fd8887-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2c0c5d80-cf44-45bb-847d-839dc3fd8887" (UID: "2c0c5d80-cf44-45bb-847d-839dc3fd8887"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:19:12 crc kubenswrapper[4669]: I1008 21:19:12.717215 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c0c5d80-cf44-45bb-847d-839dc3fd8887-inventory" (OuterVolumeSpecName: "inventory") pod "2c0c5d80-cf44-45bb-847d-839dc3fd8887" (UID: "2c0c5d80-cf44-45bb-847d-839dc3fd8887"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:19:12 crc kubenswrapper[4669]: I1008 21:19:12.717659 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c0c5d80-cf44-45bb-847d-839dc3fd8887-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "2c0c5d80-cf44-45bb-847d-839dc3fd8887" (UID: "2c0c5d80-cf44-45bb-847d-839dc3fd8887"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:19:12 crc kubenswrapper[4669]: I1008 21:19:12.717707 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c0c5d80-cf44-45bb-847d-839dc3fd8887-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "2c0c5d80-cf44-45bb-847d-839dc3fd8887" (UID: "2c0c5d80-cf44-45bb-847d-839dc3fd8887"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:19:12 crc kubenswrapper[4669]: I1008 21:19:12.788588 4669 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2c0c5d80-cf44-45bb-847d-839dc3fd8887-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 08 21:19:12 crc kubenswrapper[4669]: I1008 21:19:12.788650 4669 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2c0c5d80-cf44-45bb-847d-839dc3fd8887-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 08 21:19:12 crc kubenswrapper[4669]: I1008 21:19:12.788668 4669 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c0c5d80-cf44-45bb-847d-839dc3fd8887-inventory\") on node \"crc\" DevicePath \"\"" Oct 08 21:19:12 crc kubenswrapper[4669]: I1008 21:19:12.788678 4669 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2c0c5d80-cf44-45bb-847d-839dc3fd8887-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 08 21:19:12 crc kubenswrapper[4669]: I1008 21:19:12.788690 4669 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c0c5d80-cf44-45bb-847d-839dc3fd8887-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:19:12 crc kubenswrapper[4669]: I1008 21:19:12.788731 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddw6k\" (UniqueName: \"kubernetes.io/projected/2c0c5d80-cf44-45bb-847d-839dc3fd8887-kube-api-access-ddw6k\") on node \"crc\" DevicePath \"\"" Oct 08 21:19:12 crc kubenswrapper[4669]: I1008 21:19:12.904180 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-954pg" Oct 08 21:19:12 crc kubenswrapper[4669]: I1008 21:19:12.904662 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-954pg" event={"ID":"2c0c5d80-cf44-45bb-847d-839dc3fd8887","Type":"ContainerDied","Data":"422ae666609e55b5286487f906beb205024568a8d64ae92fbb284ec91cd7a55d"} Oct 08 21:19:12 crc kubenswrapper[4669]: I1008 21:19:12.904713 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="422ae666609e55b5286487f906beb205024568a8d64ae92fbb284ec91cd7a55d" Oct 08 21:19:12 crc kubenswrapper[4669]: I1008 21:19:12.907276 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4w5jl" event={"ID":"ae6c5190-e7d0-4384-910d-620da4746aa0","Type":"ContainerStarted","Data":"f84ebe2234fc83f55a4536c981a359cd4bd13f3e724429bbc5e30a9ec4ab6e29"} Oct 08 21:19:13 crc kubenswrapper[4669]: I1008 21:19:13.013196 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ph9v2"] Oct 08 21:19:13 crc kubenswrapper[4669]: E1008 21:19:13.013992 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c0c5d80-cf44-45bb-847d-839dc3fd8887" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 08 21:19:13 crc kubenswrapper[4669]: I1008 21:19:13.014017 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c0c5d80-cf44-45bb-847d-839dc3fd8887" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 08 21:19:13 crc kubenswrapper[4669]: I1008 21:19:13.014264 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c0c5d80-cf44-45bb-847d-839dc3fd8887" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 08 21:19:13 crc kubenswrapper[4669]: I1008 21:19:13.015020 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ph9v2" Oct 08 21:19:13 crc kubenswrapper[4669]: I1008 21:19:13.018414 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 08 21:19:13 crc kubenswrapper[4669]: I1008 21:19:13.018418 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Oct 08 21:19:13 crc kubenswrapper[4669]: I1008 21:19:13.018992 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9d8p9" Oct 08 21:19:13 crc kubenswrapper[4669]: I1008 21:19:13.019655 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 08 21:19:13 crc kubenswrapper[4669]: I1008 21:19:13.020299 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 08 21:19:13 crc kubenswrapper[4669]: I1008 21:19:13.022693 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ph9v2"] Oct 08 21:19:13 crc kubenswrapper[4669]: I1008 21:19:13.197943 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/f0494a3d-36c9-4d26-8f15-c1780af52f46-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ph9v2\" (UID: \"f0494a3d-36c9-4d26-8f15-c1780af52f46\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ph9v2" Oct 08 21:19:13 crc kubenswrapper[4669]: I1008 21:19:13.198007 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f0494a3d-36c9-4d26-8f15-c1780af52f46-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ph9v2\" (UID: \"f0494a3d-36c9-4d26-8f15-c1780af52f46\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ph9v2" Oct 08 21:19:13 crc kubenswrapper[4669]: I1008 21:19:13.198058 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f0494a3d-36c9-4d26-8f15-c1780af52f46-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ph9v2\" (UID: \"f0494a3d-36c9-4d26-8f15-c1780af52f46\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ph9v2" Oct 08 21:19:13 crc kubenswrapper[4669]: I1008 21:19:13.198158 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0494a3d-36c9-4d26-8f15-c1780af52f46-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ph9v2\" (UID: \"f0494a3d-36c9-4d26-8f15-c1780af52f46\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ph9v2" Oct 08 21:19:13 crc kubenswrapper[4669]: I1008 21:19:13.198204 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5fps\" (UniqueName: \"kubernetes.io/projected/f0494a3d-36c9-4d26-8f15-c1780af52f46-kube-api-access-s5fps\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ph9v2\" (UID: \"f0494a3d-36c9-4d26-8f15-c1780af52f46\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ph9v2" Oct 08 21:19:13 crc kubenswrapper[4669]: I1008 21:19:13.299644 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f0494a3d-36c9-4d26-8f15-c1780af52f46-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ph9v2\" (UID: \"f0494a3d-36c9-4d26-8f15-c1780af52f46\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ph9v2" Oct 08 21:19:13 crc kubenswrapper[4669]: I1008 21:19:13.299807 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0494a3d-36c9-4d26-8f15-c1780af52f46-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ph9v2\" (UID: \"f0494a3d-36c9-4d26-8f15-c1780af52f46\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ph9v2" Oct 08 21:19:13 crc kubenswrapper[4669]: I1008 21:19:13.299896 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5fps\" (UniqueName: \"kubernetes.io/projected/f0494a3d-36c9-4d26-8f15-c1780af52f46-kube-api-access-s5fps\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ph9v2\" (UID: \"f0494a3d-36c9-4d26-8f15-c1780af52f46\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ph9v2" Oct 08 21:19:13 crc kubenswrapper[4669]: I1008 21:19:13.299982 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/f0494a3d-36c9-4d26-8f15-c1780af52f46-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ph9v2\" (UID: \"f0494a3d-36c9-4d26-8f15-c1780af52f46\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ph9v2" Oct 08 21:19:13 crc kubenswrapper[4669]: I1008 21:19:13.300032 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f0494a3d-36c9-4d26-8f15-c1780af52f46-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ph9v2\" (UID: \"f0494a3d-36c9-4d26-8f15-c1780af52f46\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ph9v2" Oct 08 21:19:13 crc kubenswrapper[4669]: I1008 21:19:13.304756 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/f0494a3d-36c9-4d26-8f15-c1780af52f46-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ph9v2\" (UID: \"f0494a3d-36c9-4d26-8f15-c1780af52f46\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ph9v2" Oct 08 21:19:13 crc kubenswrapper[4669]: I1008 21:19:13.304877 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f0494a3d-36c9-4d26-8f15-c1780af52f46-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ph9v2\" (UID: \"f0494a3d-36c9-4d26-8f15-c1780af52f46\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ph9v2" Oct 08 21:19:13 crc kubenswrapper[4669]: I1008 21:19:13.305161 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0494a3d-36c9-4d26-8f15-c1780af52f46-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ph9v2\" (UID: \"f0494a3d-36c9-4d26-8f15-c1780af52f46\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ph9v2" Oct 08 21:19:13 crc kubenswrapper[4669]: I1008 21:19:13.305523 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f0494a3d-36c9-4d26-8f15-c1780af52f46-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ph9v2\" (UID: \"f0494a3d-36c9-4d26-8f15-c1780af52f46\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ph9v2" Oct 08 21:19:13 crc kubenswrapper[4669]: I1008 21:19:13.324458 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5fps\" (UniqueName: \"kubernetes.io/projected/f0494a3d-36c9-4d26-8f15-c1780af52f46-kube-api-access-s5fps\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ph9v2\" (UID: \"f0494a3d-36c9-4d26-8f15-c1780af52f46\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ph9v2" Oct 08 21:19:13 crc kubenswrapper[4669]: I1008 21:19:13.331592 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ph9v2" Oct 08 21:19:13 crc kubenswrapper[4669]: I1008 21:19:13.905878 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ph9v2"] Oct 08 21:19:13 crc kubenswrapper[4669]: I1008 21:19:13.925270 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ph9v2" event={"ID":"f0494a3d-36c9-4d26-8f15-c1780af52f46","Type":"ContainerStarted","Data":"07a7fa774d1fce6a8568bc7fe0be4dbfb4695364c4c932494eb7a71534b408fb"} Oct 08 21:19:13 crc kubenswrapper[4669]: I1008 21:19:13.929858 4669 generic.go:334] "Generic (PLEG): container finished" podID="ae6c5190-e7d0-4384-910d-620da4746aa0" containerID="f84ebe2234fc83f55a4536c981a359cd4bd13f3e724429bbc5e30a9ec4ab6e29" exitCode=0 Oct 08 21:19:13 crc kubenswrapper[4669]: I1008 21:19:13.929919 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4w5jl" event={"ID":"ae6c5190-e7d0-4384-910d-620da4746aa0","Type":"ContainerDied","Data":"f84ebe2234fc83f55a4536c981a359cd4bd13f3e724429bbc5e30a9ec4ab6e29"} Oct 08 21:19:14 crc kubenswrapper[4669]: I1008 21:19:14.948787 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ph9v2" event={"ID":"f0494a3d-36c9-4d26-8f15-c1780af52f46","Type":"ContainerStarted","Data":"38299223163eb65a6296b89201be9d26cb866f5217d59568372f8bac436b52ab"} Oct 08 21:19:14 crc kubenswrapper[4669]: I1008 21:19:14.954487 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4w5jl" event={"ID":"ae6c5190-e7d0-4384-910d-620da4746aa0","Type":"ContainerStarted","Data":"85234f2be78452e83969555d031db9b54778f0fd04491787d9dc140078685dfa"} Oct 08 21:19:14 crc kubenswrapper[4669]: I1008 21:19:14.966909 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ph9v2" podStartSLOduration=2.350550461 podStartE2EDuration="2.966889903s" podCreationTimestamp="2025-10-08 21:19:12 +0000 UTC" firstStartedPulling="2025-10-08 21:19:13.913685997 +0000 UTC m=+2073.606496680" lastFinishedPulling="2025-10-08 21:19:14.530025439 +0000 UTC m=+2074.222836122" observedRunningTime="2025-10-08 21:19:14.964860383 +0000 UTC m=+2074.657671056" watchObservedRunningTime="2025-10-08 21:19:14.966889903 +0000 UTC m=+2074.659700576" Oct 08 21:19:14 crc kubenswrapper[4669]: I1008 21:19:14.985785 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4w5jl" podStartSLOduration=2.173992346 podStartE2EDuration="8.985765699s" podCreationTimestamp="2025-10-08 21:19:06 +0000 UTC" firstStartedPulling="2025-10-08 21:19:07.861975861 +0000 UTC m=+2067.554786544" lastFinishedPulling="2025-10-08 21:19:14.673749214 +0000 UTC m=+2074.366559897" observedRunningTime="2025-10-08 21:19:14.979071651 +0000 UTC m=+2074.671882324" watchObservedRunningTime="2025-10-08 21:19:14.985765699 +0000 UTC m=+2074.678576372" Oct 08 21:19:16 crc kubenswrapper[4669]: I1008 21:19:16.894260 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4w5jl" Oct 08 21:19:16 crc kubenswrapper[4669]: I1008 21:19:16.894862 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4w5jl" Oct 08 21:19:17 crc kubenswrapper[4669]: I1008 21:19:17.960149 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-4w5jl" podUID="ae6c5190-e7d0-4384-910d-620da4746aa0" containerName="registry-server" probeResult="failure" output=< Oct 08 21:19:17 crc kubenswrapper[4669]: timeout: failed to connect service ":50051" within 1s Oct 08 21:19:17 crc kubenswrapper[4669]: > Oct 08 21:19:26 crc kubenswrapper[4669]: I1008 
21:19:26.940179 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4w5jl" Oct 08 21:19:27 crc kubenswrapper[4669]: I1008 21:19:27.026323 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4w5jl" Oct 08 21:19:27 crc kubenswrapper[4669]: I1008 21:19:27.103790 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4w5jl"] Oct 08 21:19:27 crc kubenswrapper[4669]: I1008 21:19:27.181058 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hns7h"] Oct 08 21:19:27 crc kubenswrapper[4669]: I1008 21:19:27.181302 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hns7h" podUID="e6ded9fb-15c9-44ee-a538-0b31da1b016a" containerName="registry-server" containerID="cri-o://5f2fd02b38c193d1fb330ef1b77e337cbff471e022b666d4f350edc1070b8ccf" gracePeriod=2 Oct 08 21:19:27 crc kubenswrapper[4669]: I1008 21:19:27.699129 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hns7h" Oct 08 21:19:27 crc kubenswrapper[4669]: I1008 21:19:27.704511 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6ded9fb-15c9-44ee-a538-0b31da1b016a-utilities\") pod \"e6ded9fb-15c9-44ee-a538-0b31da1b016a\" (UID: \"e6ded9fb-15c9-44ee-a538-0b31da1b016a\") " Oct 08 21:19:27 crc kubenswrapper[4669]: I1008 21:19:27.704604 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zh9xf\" (UniqueName: \"kubernetes.io/projected/e6ded9fb-15c9-44ee-a538-0b31da1b016a-kube-api-access-zh9xf\") pod \"e6ded9fb-15c9-44ee-a538-0b31da1b016a\" (UID: \"e6ded9fb-15c9-44ee-a538-0b31da1b016a\") " Oct 08 21:19:27 crc kubenswrapper[4669]: I1008 21:19:27.704738 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6ded9fb-15c9-44ee-a538-0b31da1b016a-catalog-content\") pod \"e6ded9fb-15c9-44ee-a538-0b31da1b016a\" (UID: \"e6ded9fb-15c9-44ee-a538-0b31da1b016a\") " Oct 08 21:19:27 crc kubenswrapper[4669]: I1008 21:19:27.704951 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6ded9fb-15c9-44ee-a538-0b31da1b016a-utilities" (OuterVolumeSpecName: "utilities") pod "e6ded9fb-15c9-44ee-a538-0b31da1b016a" (UID: "e6ded9fb-15c9-44ee-a538-0b31da1b016a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:19:27 crc kubenswrapper[4669]: I1008 21:19:27.705392 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6ded9fb-15c9-44ee-a538-0b31da1b016a-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 21:19:27 crc kubenswrapper[4669]: I1008 21:19:27.724162 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6ded9fb-15c9-44ee-a538-0b31da1b016a-kube-api-access-zh9xf" (OuterVolumeSpecName: "kube-api-access-zh9xf") pod "e6ded9fb-15c9-44ee-a538-0b31da1b016a" (UID: "e6ded9fb-15c9-44ee-a538-0b31da1b016a"). InnerVolumeSpecName "kube-api-access-zh9xf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:19:27 crc kubenswrapper[4669]: I1008 21:19:27.744760 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6ded9fb-15c9-44ee-a538-0b31da1b016a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e6ded9fb-15c9-44ee-a538-0b31da1b016a" (UID: "e6ded9fb-15c9-44ee-a538-0b31da1b016a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:19:27 crc kubenswrapper[4669]: I1008 21:19:27.807752 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zh9xf\" (UniqueName: \"kubernetes.io/projected/e6ded9fb-15c9-44ee-a538-0b31da1b016a-kube-api-access-zh9xf\") on node \"crc\" DevicePath \"\"" Oct 08 21:19:27 crc kubenswrapper[4669]: I1008 21:19:27.807786 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6ded9fb-15c9-44ee-a538-0b31da1b016a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 21:19:28 crc kubenswrapper[4669]: I1008 21:19:28.071673 4669 generic.go:334] "Generic (PLEG): container finished" podID="e6ded9fb-15c9-44ee-a538-0b31da1b016a" containerID="5f2fd02b38c193d1fb330ef1b77e337cbff471e022b666d4f350edc1070b8ccf" exitCode=0 Oct 08 21:19:28 crc kubenswrapper[4669]: I1008 21:19:28.071745 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hns7h" event={"ID":"e6ded9fb-15c9-44ee-a538-0b31da1b016a","Type":"ContainerDied","Data":"5f2fd02b38c193d1fb330ef1b77e337cbff471e022b666d4f350edc1070b8ccf"} Oct 08 21:19:28 crc kubenswrapper[4669]: I1008 21:19:28.071789 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hns7h" event={"ID":"e6ded9fb-15c9-44ee-a538-0b31da1b016a","Type":"ContainerDied","Data":"9044e4a4ff026452f6b299054e9331dc68b1b4ac810c8ac474aff9612acbaa9a"} Oct 08 21:19:28 crc kubenswrapper[4669]: I1008 21:19:28.071811 4669 scope.go:117] "RemoveContainer" containerID="5f2fd02b38c193d1fb330ef1b77e337cbff471e022b666d4f350edc1070b8ccf" Oct 08 21:19:28 crc kubenswrapper[4669]: I1008 21:19:28.071753 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hns7h" Oct 08 21:19:28 crc kubenswrapper[4669]: I1008 21:19:28.103466 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hns7h"] Oct 08 21:19:28 crc kubenswrapper[4669]: I1008 21:19:28.109111 4669 scope.go:117] "RemoveContainer" containerID="252e65fa660e9abdf04757d86dde22deaa64e1607212963737828d334fa42801" Oct 08 21:19:28 crc kubenswrapper[4669]: I1008 21:19:28.112257 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hns7h"] Oct 08 21:19:28 crc kubenswrapper[4669]: I1008 21:19:28.132820 4669 scope.go:117] "RemoveContainer" containerID="cc6d1ab6d4c79513d64a14b3692bb2dd8f9c769a25e690da811a89a59d2c8aee" Oct 08 21:19:28 crc kubenswrapper[4669]: I1008 21:19:28.177932 4669 scope.go:117] "RemoveContainer" containerID="5f2fd02b38c193d1fb330ef1b77e337cbff471e022b666d4f350edc1070b8ccf" Oct 08 21:19:28 crc kubenswrapper[4669]: E1008 21:19:28.178790 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f2fd02b38c193d1fb330ef1b77e337cbff471e022b666d4f350edc1070b8ccf\": container with ID starting with 5f2fd02b38c193d1fb330ef1b77e337cbff471e022b666d4f350edc1070b8ccf not found: ID does not exist" containerID="5f2fd02b38c193d1fb330ef1b77e337cbff471e022b666d4f350edc1070b8ccf" Oct 08 21:19:28 crc kubenswrapper[4669]: I1008 21:19:28.178825 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f2fd02b38c193d1fb330ef1b77e337cbff471e022b666d4f350edc1070b8ccf"} err="failed to get container status \"5f2fd02b38c193d1fb330ef1b77e337cbff471e022b666d4f350edc1070b8ccf\": rpc error: code = NotFound desc = could not find container \"5f2fd02b38c193d1fb330ef1b77e337cbff471e022b666d4f350edc1070b8ccf\": container with ID starting with 5f2fd02b38c193d1fb330ef1b77e337cbff471e022b666d4f350edc1070b8ccf not 
found: ID does not exist" Oct 08 21:19:28 crc kubenswrapper[4669]: I1008 21:19:28.178846 4669 scope.go:117] "RemoveContainer" containerID="252e65fa660e9abdf04757d86dde22deaa64e1607212963737828d334fa42801" Oct 08 21:19:28 crc kubenswrapper[4669]: E1008 21:19:28.179333 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"252e65fa660e9abdf04757d86dde22deaa64e1607212963737828d334fa42801\": container with ID starting with 252e65fa660e9abdf04757d86dde22deaa64e1607212963737828d334fa42801 not found: ID does not exist" containerID="252e65fa660e9abdf04757d86dde22deaa64e1607212963737828d334fa42801" Oct 08 21:19:28 crc kubenswrapper[4669]: I1008 21:19:28.179355 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"252e65fa660e9abdf04757d86dde22deaa64e1607212963737828d334fa42801"} err="failed to get container status \"252e65fa660e9abdf04757d86dde22deaa64e1607212963737828d334fa42801\": rpc error: code = NotFound desc = could not find container \"252e65fa660e9abdf04757d86dde22deaa64e1607212963737828d334fa42801\": container with ID starting with 252e65fa660e9abdf04757d86dde22deaa64e1607212963737828d334fa42801 not found: ID does not exist" Oct 08 21:19:28 crc kubenswrapper[4669]: I1008 21:19:28.179372 4669 scope.go:117] "RemoveContainer" containerID="cc6d1ab6d4c79513d64a14b3692bb2dd8f9c769a25e690da811a89a59d2c8aee" Oct 08 21:19:28 crc kubenswrapper[4669]: E1008 21:19:28.179722 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc6d1ab6d4c79513d64a14b3692bb2dd8f9c769a25e690da811a89a59d2c8aee\": container with ID starting with cc6d1ab6d4c79513d64a14b3692bb2dd8f9c769a25e690da811a89a59d2c8aee not found: ID does not exist" containerID="cc6d1ab6d4c79513d64a14b3692bb2dd8f9c769a25e690da811a89a59d2c8aee" Oct 08 21:19:28 crc kubenswrapper[4669]: I1008 21:19:28.179750 4669 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc6d1ab6d4c79513d64a14b3692bb2dd8f9c769a25e690da811a89a59d2c8aee"} err="failed to get container status \"cc6d1ab6d4c79513d64a14b3692bb2dd8f9c769a25e690da811a89a59d2c8aee\": rpc error: code = NotFound desc = could not find container \"cc6d1ab6d4c79513d64a14b3692bb2dd8f9c769a25e690da811a89a59d2c8aee\": container with ID starting with cc6d1ab6d4c79513d64a14b3692bb2dd8f9c769a25e690da811a89a59d2c8aee not found: ID does not exist" Oct 08 21:19:29 crc kubenswrapper[4669]: I1008 21:19:29.341812 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6ded9fb-15c9-44ee-a538-0b31da1b016a" path="/var/lib/kubelet/pods/e6ded9fb-15c9-44ee-a538-0b31da1b016a/volumes" Oct 08 21:19:41 crc kubenswrapper[4669]: I1008 21:19:41.725295 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-b5vzb"] Oct 08 21:19:41 crc kubenswrapper[4669]: E1008 21:19:41.726627 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6ded9fb-15c9-44ee-a538-0b31da1b016a" containerName="extract-content" Oct 08 21:19:41 crc kubenswrapper[4669]: I1008 21:19:41.726644 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6ded9fb-15c9-44ee-a538-0b31da1b016a" containerName="extract-content" Oct 08 21:19:41 crc kubenswrapper[4669]: E1008 21:19:41.726677 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6ded9fb-15c9-44ee-a538-0b31da1b016a" containerName="registry-server" Oct 08 21:19:41 crc kubenswrapper[4669]: I1008 21:19:41.726686 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6ded9fb-15c9-44ee-a538-0b31da1b016a" containerName="registry-server" Oct 08 21:19:41 crc kubenswrapper[4669]: E1008 21:19:41.726710 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6ded9fb-15c9-44ee-a538-0b31da1b016a" containerName="extract-utilities" Oct 08 21:19:41 crc kubenswrapper[4669]: I1008 
21:19:41.726719 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6ded9fb-15c9-44ee-a538-0b31da1b016a" containerName="extract-utilities" Oct 08 21:19:41 crc kubenswrapper[4669]: I1008 21:19:41.726982 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6ded9fb-15c9-44ee-a538-0b31da1b016a" containerName="registry-server" Oct 08 21:19:41 crc kubenswrapper[4669]: I1008 21:19:41.728827 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b5vzb" Oct 08 21:19:41 crc kubenswrapper[4669]: I1008 21:19:41.733567 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b5vzb"] Oct 08 21:19:41 crc kubenswrapper[4669]: I1008 21:19:41.885804 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j48fr\" (UniqueName: \"kubernetes.io/projected/7c7cea1a-f540-4410-af97-a29ad59cc64e-kube-api-access-j48fr\") pod \"community-operators-b5vzb\" (UID: \"7c7cea1a-f540-4410-af97-a29ad59cc64e\") " pod="openshift-marketplace/community-operators-b5vzb" Oct 08 21:19:41 crc kubenswrapper[4669]: I1008 21:19:41.885885 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c7cea1a-f540-4410-af97-a29ad59cc64e-utilities\") pod \"community-operators-b5vzb\" (UID: \"7c7cea1a-f540-4410-af97-a29ad59cc64e\") " pod="openshift-marketplace/community-operators-b5vzb" Oct 08 21:19:41 crc kubenswrapper[4669]: I1008 21:19:41.885913 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c7cea1a-f540-4410-af97-a29ad59cc64e-catalog-content\") pod \"community-operators-b5vzb\" (UID: \"7c7cea1a-f540-4410-af97-a29ad59cc64e\") " pod="openshift-marketplace/community-operators-b5vzb" Oct 08 21:19:41 crc 
kubenswrapper[4669]: I1008 21:19:41.988718 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j48fr\" (UniqueName: \"kubernetes.io/projected/7c7cea1a-f540-4410-af97-a29ad59cc64e-kube-api-access-j48fr\") pod \"community-operators-b5vzb\" (UID: \"7c7cea1a-f540-4410-af97-a29ad59cc64e\") " pod="openshift-marketplace/community-operators-b5vzb" Oct 08 21:19:41 crc kubenswrapper[4669]: I1008 21:19:41.988811 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c7cea1a-f540-4410-af97-a29ad59cc64e-utilities\") pod \"community-operators-b5vzb\" (UID: \"7c7cea1a-f540-4410-af97-a29ad59cc64e\") " pod="openshift-marketplace/community-operators-b5vzb" Oct 08 21:19:41 crc kubenswrapper[4669]: I1008 21:19:41.988850 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c7cea1a-f540-4410-af97-a29ad59cc64e-catalog-content\") pod \"community-operators-b5vzb\" (UID: \"7c7cea1a-f540-4410-af97-a29ad59cc64e\") " pod="openshift-marketplace/community-operators-b5vzb" Oct 08 21:19:41 crc kubenswrapper[4669]: I1008 21:19:41.989505 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c7cea1a-f540-4410-af97-a29ad59cc64e-catalog-content\") pod \"community-operators-b5vzb\" (UID: \"7c7cea1a-f540-4410-af97-a29ad59cc64e\") " pod="openshift-marketplace/community-operators-b5vzb" Oct 08 21:19:41 crc kubenswrapper[4669]: I1008 21:19:41.989615 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c7cea1a-f540-4410-af97-a29ad59cc64e-utilities\") pod \"community-operators-b5vzb\" (UID: \"7c7cea1a-f540-4410-af97-a29ad59cc64e\") " pod="openshift-marketplace/community-operators-b5vzb" Oct 08 21:19:42 crc kubenswrapper[4669]: I1008 21:19:42.007030 
4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j48fr\" (UniqueName: \"kubernetes.io/projected/7c7cea1a-f540-4410-af97-a29ad59cc64e-kube-api-access-j48fr\") pod \"community-operators-b5vzb\" (UID: \"7c7cea1a-f540-4410-af97-a29ad59cc64e\") " pod="openshift-marketplace/community-operators-b5vzb" Oct 08 21:19:42 crc kubenswrapper[4669]: I1008 21:19:42.058317 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b5vzb" Oct 08 21:19:42 crc kubenswrapper[4669]: I1008 21:19:42.559748 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b5vzb"] Oct 08 21:19:42 crc kubenswrapper[4669]: W1008 21:19:42.563944 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c7cea1a_f540_4410_af97_a29ad59cc64e.slice/crio-498d3ad7f09b2872b55ef7ada65820d6ef9c7997f2fe3161e78e6fb665419825 WatchSource:0}: Error finding container 498d3ad7f09b2872b55ef7ada65820d6ef9c7997f2fe3161e78e6fb665419825: Status 404 returned error can't find the container with id 498d3ad7f09b2872b55ef7ada65820d6ef9c7997f2fe3161e78e6fb665419825 Oct 08 21:19:43 crc kubenswrapper[4669]: I1008 21:19:43.186265 4669 patch_prober.go:28] interesting pod/machine-config-daemon-hw2kf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 21:19:43 crc kubenswrapper[4669]: I1008 21:19:43.186713 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 
21:19:43 crc kubenswrapper[4669]: I1008 21:19:43.244838 4669 generic.go:334] "Generic (PLEG): container finished" podID="7c7cea1a-f540-4410-af97-a29ad59cc64e" containerID="7125a95d6f11079da843e04620add2ad783de173baa5c465ea61df14900535d7" exitCode=0 Oct 08 21:19:43 crc kubenswrapper[4669]: I1008 21:19:43.244945 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b5vzb" event={"ID":"7c7cea1a-f540-4410-af97-a29ad59cc64e","Type":"ContainerDied","Data":"7125a95d6f11079da843e04620add2ad783de173baa5c465ea61df14900535d7"} Oct 08 21:19:43 crc kubenswrapper[4669]: I1008 21:19:43.245019 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b5vzb" event={"ID":"7c7cea1a-f540-4410-af97-a29ad59cc64e","Type":"ContainerStarted","Data":"498d3ad7f09b2872b55ef7ada65820d6ef9c7997f2fe3161e78e6fb665419825"} Oct 08 21:19:44 crc kubenswrapper[4669]: I1008 21:19:44.258796 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b5vzb" event={"ID":"7c7cea1a-f540-4410-af97-a29ad59cc64e","Type":"ContainerStarted","Data":"cc0b03b4513e0a3892a7530969c7289cb2e7e5e99b36c3f9a0fafddc274ae624"} Oct 08 21:19:45 crc kubenswrapper[4669]: I1008 21:19:45.271213 4669 generic.go:334] "Generic (PLEG): container finished" podID="7c7cea1a-f540-4410-af97-a29ad59cc64e" containerID="cc0b03b4513e0a3892a7530969c7289cb2e7e5e99b36c3f9a0fafddc274ae624" exitCode=0 Oct 08 21:19:45 crc kubenswrapper[4669]: I1008 21:19:45.271522 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b5vzb" event={"ID":"7c7cea1a-f540-4410-af97-a29ad59cc64e","Type":"ContainerDied","Data":"cc0b03b4513e0a3892a7530969c7289cb2e7e5e99b36c3f9a0fafddc274ae624"} Oct 08 21:19:46 crc kubenswrapper[4669]: I1008 21:19:46.283600 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b5vzb" 
event={"ID":"7c7cea1a-f540-4410-af97-a29ad59cc64e","Type":"ContainerStarted","Data":"1b7991065d5eccac79f50a2c35bdfe240d23fd23ce34a9c3834aae0508f0609a"} Oct 08 21:19:46 crc kubenswrapper[4669]: I1008 21:19:46.305781 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-b5vzb" podStartSLOduration=2.565447673 podStartE2EDuration="5.305762954s" podCreationTimestamp="2025-10-08 21:19:41 +0000 UTC" firstStartedPulling="2025-10-08 21:19:43.24801964 +0000 UTC m=+2102.940830353" lastFinishedPulling="2025-10-08 21:19:45.988334951 +0000 UTC m=+2105.681145634" observedRunningTime="2025-10-08 21:19:46.297670396 +0000 UTC m=+2105.990481069" watchObservedRunningTime="2025-10-08 21:19:46.305762954 +0000 UTC m=+2105.998573627" Oct 08 21:19:52 crc kubenswrapper[4669]: I1008 21:19:52.058682 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-b5vzb" Oct 08 21:19:52 crc kubenswrapper[4669]: I1008 21:19:52.059473 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-b5vzb" Oct 08 21:19:52 crc kubenswrapper[4669]: I1008 21:19:52.112959 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-b5vzb" Oct 08 21:19:52 crc kubenswrapper[4669]: I1008 21:19:52.382439 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-b5vzb" Oct 08 21:19:52 crc kubenswrapper[4669]: I1008 21:19:52.437436 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b5vzb"] Oct 08 21:19:54 crc kubenswrapper[4669]: I1008 21:19:54.358289 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-b5vzb" podUID="7c7cea1a-f540-4410-af97-a29ad59cc64e" containerName="registry-server" 
containerID="cri-o://1b7991065d5eccac79f50a2c35bdfe240d23fd23ce34a9c3834aae0508f0609a" gracePeriod=2 Oct 08 21:19:54 crc kubenswrapper[4669]: I1008 21:19:54.761776 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lc6rm"] Oct 08 21:19:54 crc kubenswrapper[4669]: I1008 21:19:54.767726 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lc6rm" Oct 08 21:19:54 crc kubenswrapper[4669]: I1008 21:19:54.790204 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7twmp\" (UniqueName: \"kubernetes.io/projected/854fed5a-fb3b-402c-80ff-84eca9ea88fe-kube-api-access-7twmp\") pod \"redhat-marketplace-lc6rm\" (UID: \"854fed5a-fb3b-402c-80ff-84eca9ea88fe\") " pod="openshift-marketplace/redhat-marketplace-lc6rm" Oct 08 21:19:54 crc kubenswrapper[4669]: I1008 21:19:54.790246 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/854fed5a-fb3b-402c-80ff-84eca9ea88fe-utilities\") pod \"redhat-marketplace-lc6rm\" (UID: \"854fed5a-fb3b-402c-80ff-84eca9ea88fe\") " pod="openshift-marketplace/redhat-marketplace-lc6rm" Oct 08 21:19:54 crc kubenswrapper[4669]: I1008 21:19:54.790263 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/854fed5a-fb3b-402c-80ff-84eca9ea88fe-catalog-content\") pod \"redhat-marketplace-lc6rm\" (UID: \"854fed5a-fb3b-402c-80ff-84eca9ea88fe\") " pod="openshift-marketplace/redhat-marketplace-lc6rm" Oct 08 21:19:54 crc kubenswrapper[4669]: I1008 21:19:54.801256 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lc6rm"] Oct 08 21:19:54 crc kubenswrapper[4669]: I1008 21:19:54.825516 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b5vzb" Oct 08 21:19:54 crc kubenswrapper[4669]: I1008 21:19:54.892271 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c7cea1a-f540-4410-af97-a29ad59cc64e-utilities\") pod \"7c7cea1a-f540-4410-af97-a29ad59cc64e\" (UID: \"7c7cea1a-f540-4410-af97-a29ad59cc64e\") " Oct 08 21:19:54 crc kubenswrapper[4669]: I1008 21:19:54.892633 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j48fr\" (UniqueName: \"kubernetes.io/projected/7c7cea1a-f540-4410-af97-a29ad59cc64e-kube-api-access-j48fr\") pod \"7c7cea1a-f540-4410-af97-a29ad59cc64e\" (UID: \"7c7cea1a-f540-4410-af97-a29ad59cc64e\") " Oct 08 21:19:54 crc kubenswrapper[4669]: I1008 21:19:54.892664 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c7cea1a-f540-4410-af97-a29ad59cc64e-catalog-content\") pod \"7c7cea1a-f540-4410-af97-a29ad59cc64e\" (UID: \"7c7cea1a-f540-4410-af97-a29ad59cc64e\") " Oct 08 21:19:54 crc kubenswrapper[4669]: I1008 21:19:54.893148 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7twmp\" (UniqueName: \"kubernetes.io/projected/854fed5a-fb3b-402c-80ff-84eca9ea88fe-kube-api-access-7twmp\") pod \"redhat-marketplace-lc6rm\" (UID: \"854fed5a-fb3b-402c-80ff-84eca9ea88fe\") " pod="openshift-marketplace/redhat-marketplace-lc6rm" Oct 08 21:19:54 crc kubenswrapper[4669]: I1008 21:19:54.893195 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/854fed5a-fb3b-402c-80ff-84eca9ea88fe-utilities\") pod \"redhat-marketplace-lc6rm\" (UID: \"854fed5a-fb3b-402c-80ff-84eca9ea88fe\") " pod="openshift-marketplace/redhat-marketplace-lc6rm" Oct 08 21:19:54 crc kubenswrapper[4669]: I1008 21:19:54.893218 
4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/854fed5a-fb3b-402c-80ff-84eca9ea88fe-catalog-content\") pod \"redhat-marketplace-lc6rm\" (UID: \"854fed5a-fb3b-402c-80ff-84eca9ea88fe\") " pod="openshift-marketplace/redhat-marketplace-lc6rm" Oct 08 21:19:54 crc kubenswrapper[4669]: I1008 21:19:54.893288 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c7cea1a-f540-4410-af97-a29ad59cc64e-utilities" (OuterVolumeSpecName: "utilities") pod "7c7cea1a-f540-4410-af97-a29ad59cc64e" (UID: "7c7cea1a-f540-4410-af97-a29ad59cc64e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:19:54 crc kubenswrapper[4669]: I1008 21:19:54.893884 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/854fed5a-fb3b-402c-80ff-84eca9ea88fe-utilities\") pod \"redhat-marketplace-lc6rm\" (UID: \"854fed5a-fb3b-402c-80ff-84eca9ea88fe\") " pod="openshift-marketplace/redhat-marketplace-lc6rm" Oct 08 21:19:54 crc kubenswrapper[4669]: I1008 21:19:54.893919 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/854fed5a-fb3b-402c-80ff-84eca9ea88fe-catalog-content\") pod \"redhat-marketplace-lc6rm\" (UID: \"854fed5a-fb3b-402c-80ff-84eca9ea88fe\") " pod="openshift-marketplace/redhat-marketplace-lc6rm" Oct 08 21:19:54 crc kubenswrapper[4669]: I1008 21:19:54.899095 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c7cea1a-f540-4410-af97-a29ad59cc64e-kube-api-access-j48fr" (OuterVolumeSpecName: "kube-api-access-j48fr") pod "7c7cea1a-f540-4410-af97-a29ad59cc64e" (UID: "7c7cea1a-f540-4410-af97-a29ad59cc64e"). InnerVolumeSpecName "kube-api-access-j48fr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:19:54 crc kubenswrapper[4669]: I1008 21:19:54.912179 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7twmp\" (UniqueName: \"kubernetes.io/projected/854fed5a-fb3b-402c-80ff-84eca9ea88fe-kube-api-access-7twmp\") pod \"redhat-marketplace-lc6rm\" (UID: \"854fed5a-fb3b-402c-80ff-84eca9ea88fe\") " pod="openshift-marketplace/redhat-marketplace-lc6rm" Oct 08 21:19:54 crc kubenswrapper[4669]: I1008 21:19:54.995619 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c7cea1a-f540-4410-af97-a29ad59cc64e-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 21:19:54 crc kubenswrapper[4669]: I1008 21:19:54.995651 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j48fr\" (UniqueName: \"kubernetes.io/projected/7c7cea1a-f540-4410-af97-a29ad59cc64e-kube-api-access-j48fr\") on node \"crc\" DevicePath \"\"" Oct 08 21:19:55 crc kubenswrapper[4669]: I1008 21:19:55.139843 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lc6rm" Oct 08 21:19:55 crc kubenswrapper[4669]: I1008 21:19:55.375482 4669 generic.go:334] "Generic (PLEG): container finished" podID="7c7cea1a-f540-4410-af97-a29ad59cc64e" containerID="1b7991065d5eccac79f50a2c35bdfe240d23fd23ce34a9c3834aae0508f0609a" exitCode=0 Oct 08 21:19:55 crc kubenswrapper[4669]: I1008 21:19:55.375549 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b5vzb" event={"ID":"7c7cea1a-f540-4410-af97-a29ad59cc64e","Type":"ContainerDied","Data":"1b7991065d5eccac79f50a2c35bdfe240d23fd23ce34a9c3834aae0508f0609a"} Oct 08 21:19:55 crc kubenswrapper[4669]: I1008 21:19:55.375792 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b5vzb" event={"ID":"7c7cea1a-f540-4410-af97-a29ad59cc64e","Type":"ContainerDied","Data":"498d3ad7f09b2872b55ef7ada65820d6ef9c7997f2fe3161e78e6fb665419825"} Oct 08 21:19:55 crc kubenswrapper[4669]: I1008 21:19:55.375814 4669 scope.go:117] "RemoveContainer" containerID="1b7991065d5eccac79f50a2c35bdfe240d23fd23ce34a9c3834aae0508f0609a" Oct 08 21:19:55 crc kubenswrapper[4669]: I1008 21:19:55.375619 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b5vzb" Oct 08 21:19:55 crc kubenswrapper[4669]: I1008 21:19:55.409237 4669 scope.go:117] "RemoveContainer" containerID="cc0b03b4513e0a3892a7530969c7289cb2e7e5e99b36c3f9a0fafddc274ae624" Oct 08 21:19:55 crc kubenswrapper[4669]: I1008 21:19:55.448011 4669 scope.go:117] "RemoveContainer" containerID="7125a95d6f11079da843e04620add2ad783de173baa5c465ea61df14900535d7" Oct 08 21:19:55 crc kubenswrapper[4669]: I1008 21:19:55.471459 4669 scope.go:117] "RemoveContainer" containerID="1b7991065d5eccac79f50a2c35bdfe240d23fd23ce34a9c3834aae0508f0609a" Oct 08 21:19:55 crc kubenswrapper[4669]: E1008 21:19:55.471855 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b7991065d5eccac79f50a2c35bdfe240d23fd23ce34a9c3834aae0508f0609a\": container with ID starting with 1b7991065d5eccac79f50a2c35bdfe240d23fd23ce34a9c3834aae0508f0609a not found: ID does not exist" containerID="1b7991065d5eccac79f50a2c35bdfe240d23fd23ce34a9c3834aae0508f0609a" Oct 08 21:19:55 crc kubenswrapper[4669]: I1008 21:19:55.471902 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b7991065d5eccac79f50a2c35bdfe240d23fd23ce34a9c3834aae0508f0609a"} err="failed to get container status \"1b7991065d5eccac79f50a2c35bdfe240d23fd23ce34a9c3834aae0508f0609a\": rpc error: code = NotFound desc = could not find container \"1b7991065d5eccac79f50a2c35bdfe240d23fd23ce34a9c3834aae0508f0609a\": container with ID starting with 1b7991065d5eccac79f50a2c35bdfe240d23fd23ce34a9c3834aae0508f0609a not found: ID does not exist" Oct 08 21:19:55 crc kubenswrapper[4669]: I1008 21:19:55.471930 4669 scope.go:117] "RemoveContainer" containerID="cc0b03b4513e0a3892a7530969c7289cb2e7e5e99b36c3f9a0fafddc274ae624" Oct 08 21:19:55 crc kubenswrapper[4669]: E1008 21:19:55.472243 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"cc0b03b4513e0a3892a7530969c7289cb2e7e5e99b36c3f9a0fafddc274ae624\": container with ID starting with cc0b03b4513e0a3892a7530969c7289cb2e7e5e99b36c3f9a0fafddc274ae624 not found: ID does not exist" containerID="cc0b03b4513e0a3892a7530969c7289cb2e7e5e99b36c3f9a0fafddc274ae624" Oct 08 21:19:55 crc kubenswrapper[4669]: I1008 21:19:55.472284 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc0b03b4513e0a3892a7530969c7289cb2e7e5e99b36c3f9a0fafddc274ae624"} err="failed to get container status \"cc0b03b4513e0a3892a7530969c7289cb2e7e5e99b36c3f9a0fafddc274ae624\": rpc error: code = NotFound desc = could not find container \"cc0b03b4513e0a3892a7530969c7289cb2e7e5e99b36c3f9a0fafddc274ae624\": container with ID starting with cc0b03b4513e0a3892a7530969c7289cb2e7e5e99b36c3f9a0fafddc274ae624 not found: ID does not exist" Oct 08 21:19:55 crc kubenswrapper[4669]: I1008 21:19:55.472313 4669 scope.go:117] "RemoveContainer" containerID="7125a95d6f11079da843e04620add2ad783de173baa5c465ea61df14900535d7" Oct 08 21:19:55 crc kubenswrapper[4669]: E1008 21:19:55.472675 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7125a95d6f11079da843e04620add2ad783de173baa5c465ea61df14900535d7\": container with ID starting with 7125a95d6f11079da843e04620add2ad783de173baa5c465ea61df14900535d7 not found: ID does not exist" containerID="7125a95d6f11079da843e04620add2ad783de173baa5c465ea61df14900535d7" Oct 08 21:19:55 crc kubenswrapper[4669]: I1008 21:19:55.472747 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7125a95d6f11079da843e04620add2ad783de173baa5c465ea61df14900535d7"} err="failed to get container status \"7125a95d6f11079da843e04620add2ad783de173baa5c465ea61df14900535d7\": rpc error: code = NotFound desc = could not find container 
\"7125a95d6f11079da843e04620add2ad783de173baa5c465ea61df14900535d7\": container with ID starting with 7125a95d6f11079da843e04620add2ad783de173baa5c465ea61df14900535d7 not found: ID does not exist" Oct 08 21:19:55 crc kubenswrapper[4669]: I1008 21:19:55.624444 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c7cea1a-f540-4410-af97-a29ad59cc64e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7c7cea1a-f540-4410-af97-a29ad59cc64e" (UID: "7c7cea1a-f540-4410-af97-a29ad59cc64e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:19:55 crc kubenswrapper[4669]: I1008 21:19:55.704720 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lc6rm"] Oct 08 21:19:55 crc kubenswrapper[4669]: I1008 21:19:55.710580 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c7cea1a-f540-4410-af97-a29ad59cc64e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 21:19:55 crc kubenswrapper[4669]: I1008 21:19:55.714923 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b5vzb"] Oct 08 21:19:55 crc kubenswrapper[4669]: I1008 21:19:55.728371 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-b5vzb"] Oct 08 21:19:56 crc kubenswrapper[4669]: I1008 21:19:56.390552 4669 generic.go:334] "Generic (PLEG): container finished" podID="854fed5a-fb3b-402c-80ff-84eca9ea88fe" containerID="ec068aa44eb8d069700efa8d3ff40f51d8ba2a77243bd51e946bfbc32d1b72d6" exitCode=0 Oct 08 21:19:56 crc kubenswrapper[4669]: I1008 21:19:56.390593 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lc6rm" 
event={"ID":"854fed5a-fb3b-402c-80ff-84eca9ea88fe","Type":"ContainerDied","Data":"ec068aa44eb8d069700efa8d3ff40f51d8ba2a77243bd51e946bfbc32d1b72d6"} Oct 08 21:19:56 crc kubenswrapper[4669]: I1008 21:19:56.390635 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lc6rm" event={"ID":"854fed5a-fb3b-402c-80ff-84eca9ea88fe","Type":"ContainerStarted","Data":"b973623772ab2f337132034e4baf497eb01bfee820a4f40947008b2614ce9760"} Oct 08 21:19:57 crc kubenswrapper[4669]: I1008 21:19:57.346365 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c7cea1a-f540-4410-af97-a29ad59cc64e" path="/var/lib/kubelet/pods/7c7cea1a-f540-4410-af97-a29ad59cc64e/volumes" Oct 08 21:19:58 crc kubenswrapper[4669]: I1008 21:19:58.411339 4669 generic.go:334] "Generic (PLEG): container finished" podID="854fed5a-fb3b-402c-80ff-84eca9ea88fe" containerID="45df0c12ef4482b1907d3d0062447689b85094f62c3fbdd1de674fbfd1ecd3fe" exitCode=0 Oct 08 21:19:58 crc kubenswrapper[4669]: I1008 21:19:58.411398 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lc6rm" event={"ID":"854fed5a-fb3b-402c-80ff-84eca9ea88fe","Type":"ContainerDied","Data":"45df0c12ef4482b1907d3d0062447689b85094f62c3fbdd1de674fbfd1ecd3fe"} Oct 08 21:19:59 crc kubenswrapper[4669]: I1008 21:19:59.422489 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lc6rm" event={"ID":"854fed5a-fb3b-402c-80ff-84eca9ea88fe","Type":"ContainerStarted","Data":"c2f297a8326d07ec6ac05f2645206d74b6a49bd2881996cf56cef8fd9c1676dd"} Oct 08 21:19:59 crc kubenswrapper[4669]: I1008 21:19:59.448211 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lc6rm" podStartSLOduration=2.8600849950000002 podStartE2EDuration="5.448190361s" podCreationTimestamp="2025-10-08 21:19:54 +0000 UTC" firstStartedPulling="2025-10-08 21:19:56.396812333 +0000 UTC 
m=+2116.089623006" lastFinishedPulling="2025-10-08 21:19:58.984917699 +0000 UTC m=+2118.677728372" observedRunningTime="2025-10-08 21:19:59.440877906 +0000 UTC m=+2119.133688579" watchObservedRunningTime="2025-10-08 21:19:59.448190361 +0000 UTC m=+2119.141001034" Oct 08 21:20:05 crc kubenswrapper[4669]: I1008 21:20:05.140753 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lc6rm" Oct 08 21:20:05 crc kubenswrapper[4669]: I1008 21:20:05.141422 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lc6rm" Oct 08 21:20:05 crc kubenswrapper[4669]: I1008 21:20:05.191427 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lc6rm" Oct 08 21:20:05 crc kubenswrapper[4669]: I1008 21:20:05.550154 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lc6rm" Oct 08 21:20:05 crc kubenswrapper[4669]: I1008 21:20:05.601245 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lc6rm"] Oct 08 21:20:07 crc kubenswrapper[4669]: I1008 21:20:07.504945 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lc6rm" podUID="854fed5a-fb3b-402c-80ff-84eca9ea88fe" containerName="registry-server" containerID="cri-o://c2f297a8326d07ec6ac05f2645206d74b6a49bd2881996cf56cef8fd9c1676dd" gracePeriod=2 Oct 08 21:20:08 crc kubenswrapper[4669]: I1008 21:20:08.017697 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lc6rm" Oct 08 21:20:08 crc kubenswrapper[4669]: I1008 21:20:08.152274 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/854fed5a-fb3b-402c-80ff-84eca9ea88fe-catalog-content\") pod \"854fed5a-fb3b-402c-80ff-84eca9ea88fe\" (UID: \"854fed5a-fb3b-402c-80ff-84eca9ea88fe\") " Oct 08 21:20:08 crc kubenswrapper[4669]: I1008 21:20:08.152368 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7twmp\" (UniqueName: \"kubernetes.io/projected/854fed5a-fb3b-402c-80ff-84eca9ea88fe-kube-api-access-7twmp\") pod \"854fed5a-fb3b-402c-80ff-84eca9ea88fe\" (UID: \"854fed5a-fb3b-402c-80ff-84eca9ea88fe\") " Oct 08 21:20:08 crc kubenswrapper[4669]: I1008 21:20:08.152481 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/854fed5a-fb3b-402c-80ff-84eca9ea88fe-utilities\") pod \"854fed5a-fb3b-402c-80ff-84eca9ea88fe\" (UID: \"854fed5a-fb3b-402c-80ff-84eca9ea88fe\") " Oct 08 21:20:08 crc kubenswrapper[4669]: I1008 21:20:08.154922 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/854fed5a-fb3b-402c-80ff-84eca9ea88fe-utilities" (OuterVolumeSpecName: "utilities") pod "854fed5a-fb3b-402c-80ff-84eca9ea88fe" (UID: "854fed5a-fb3b-402c-80ff-84eca9ea88fe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:20:08 crc kubenswrapper[4669]: I1008 21:20:08.161405 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/854fed5a-fb3b-402c-80ff-84eca9ea88fe-kube-api-access-7twmp" (OuterVolumeSpecName: "kube-api-access-7twmp") pod "854fed5a-fb3b-402c-80ff-84eca9ea88fe" (UID: "854fed5a-fb3b-402c-80ff-84eca9ea88fe"). InnerVolumeSpecName "kube-api-access-7twmp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:20:08 crc kubenswrapper[4669]: I1008 21:20:08.168072 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/854fed5a-fb3b-402c-80ff-84eca9ea88fe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "854fed5a-fb3b-402c-80ff-84eca9ea88fe" (UID: "854fed5a-fb3b-402c-80ff-84eca9ea88fe"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:20:08 crc kubenswrapper[4669]: I1008 21:20:08.255365 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/854fed5a-fb3b-402c-80ff-84eca9ea88fe-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 21:20:08 crc kubenswrapper[4669]: I1008 21:20:08.255407 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/854fed5a-fb3b-402c-80ff-84eca9ea88fe-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 21:20:08 crc kubenswrapper[4669]: I1008 21:20:08.255424 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7twmp\" (UniqueName: \"kubernetes.io/projected/854fed5a-fb3b-402c-80ff-84eca9ea88fe-kube-api-access-7twmp\") on node \"crc\" DevicePath \"\"" Oct 08 21:20:08 crc kubenswrapper[4669]: I1008 21:20:08.514225 4669 generic.go:334] "Generic (PLEG): container finished" podID="854fed5a-fb3b-402c-80ff-84eca9ea88fe" containerID="c2f297a8326d07ec6ac05f2645206d74b6a49bd2881996cf56cef8fd9c1676dd" exitCode=0 Oct 08 21:20:08 crc kubenswrapper[4669]: I1008 21:20:08.514266 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lc6rm" event={"ID":"854fed5a-fb3b-402c-80ff-84eca9ea88fe","Type":"ContainerDied","Data":"c2f297a8326d07ec6ac05f2645206d74b6a49bd2881996cf56cef8fd9c1676dd"} Oct 08 21:20:08 crc kubenswrapper[4669]: I1008 21:20:08.514480 4669 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-lc6rm" event={"ID":"854fed5a-fb3b-402c-80ff-84eca9ea88fe","Type":"ContainerDied","Data":"b973623772ab2f337132034e4baf497eb01bfee820a4f40947008b2614ce9760"} Oct 08 21:20:08 crc kubenswrapper[4669]: I1008 21:20:08.514315 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lc6rm" Oct 08 21:20:08 crc kubenswrapper[4669]: I1008 21:20:08.514499 4669 scope.go:117] "RemoveContainer" containerID="c2f297a8326d07ec6ac05f2645206d74b6a49bd2881996cf56cef8fd9c1676dd" Oct 08 21:20:08 crc kubenswrapper[4669]: I1008 21:20:08.531441 4669 scope.go:117] "RemoveContainer" containerID="45df0c12ef4482b1907d3d0062447689b85094f62c3fbdd1de674fbfd1ecd3fe" Oct 08 21:20:08 crc kubenswrapper[4669]: I1008 21:20:08.549701 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lc6rm"] Oct 08 21:20:08 crc kubenswrapper[4669]: I1008 21:20:08.558981 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lc6rm"] Oct 08 21:20:08 crc kubenswrapper[4669]: I1008 21:20:08.569717 4669 scope.go:117] "RemoveContainer" containerID="ec068aa44eb8d069700efa8d3ff40f51d8ba2a77243bd51e946bfbc32d1b72d6" Oct 08 21:20:08 crc kubenswrapper[4669]: I1008 21:20:08.599338 4669 scope.go:117] "RemoveContainer" containerID="c2f297a8326d07ec6ac05f2645206d74b6a49bd2881996cf56cef8fd9c1676dd" Oct 08 21:20:08 crc kubenswrapper[4669]: E1008 21:20:08.599807 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2f297a8326d07ec6ac05f2645206d74b6a49bd2881996cf56cef8fd9c1676dd\": container with ID starting with c2f297a8326d07ec6ac05f2645206d74b6a49bd2881996cf56cef8fd9c1676dd not found: ID does not exist" containerID="c2f297a8326d07ec6ac05f2645206d74b6a49bd2881996cf56cef8fd9c1676dd" Oct 08 21:20:08 crc kubenswrapper[4669]: I1008 21:20:08.599838 4669 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2f297a8326d07ec6ac05f2645206d74b6a49bd2881996cf56cef8fd9c1676dd"} err="failed to get container status \"c2f297a8326d07ec6ac05f2645206d74b6a49bd2881996cf56cef8fd9c1676dd\": rpc error: code = NotFound desc = could not find container \"c2f297a8326d07ec6ac05f2645206d74b6a49bd2881996cf56cef8fd9c1676dd\": container with ID starting with c2f297a8326d07ec6ac05f2645206d74b6a49bd2881996cf56cef8fd9c1676dd not found: ID does not exist" Oct 08 21:20:08 crc kubenswrapper[4669]: I1008 21:20:08.599868 4669 scope.go:117] "RemoveContainer" containerID="45df0c12ef4482b1907d3d0062447689b85094f62c3fbdd1de674fbfd1ecd3fe" Oct 08 21:20:08 crc kubenswrapper[4669]: E1008 21:20:08.600105 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45df0c12ef4482b1907d3d0062447689b85094f62c3fbdd1de674fbfd1ecd3fe\": container with ID starting with 45df0c12ef4482b1907d3d0062447689b85094f62c3fbdd1de674fbfd1ecd3fe not found: ID does not exist" containerID="45df0c12ef4482b1907d3d0062447689b85094f62c3fbdd1de674fbfd1ecd3fe" Oct 08 21:20:08 crc kubenswrapper[4669]: I1008 21:20:08.600130 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45df0c12ef4482b1907d3d0062447689b85094f62c3fbdd1de674fbfd1ecd3fe"} err="failed to get container status \"45df0c12ef4482b1907d3d0062447689b85094f62c3fbdd1de674fbfd1ecd3fe\": rpc error: code = NotFound desc = could not find container \"45df0c12ef4482b1907d3d0062447689b85094f62c3fbdd1de674fbfd1ecd3fe\": container with ID starting with 45df0c12ef4482b1907d3d0062447689b85094f62c3fbdd1de674fbfd1ecd3fe not found: ID does not exist" Oct 08 21:20:08 crc kubenswrapper[4669]: I1008 21:20:08.600142 4669 scope.go:117] "RemoveContainer" containerID="ec068aa44eb8d069700efa8d3ff40f51d8ba2a77243bd51e946bfbc32d1b72d6" Oct 08 21:20:08 crc kubenswrapper[4669]: E1008 
21:20:08.600394 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec068aa44eb8d069700efa8d3ff40f51d8ba2a77243bd51e946bfbc32d1b72d6\": container with ID starting with ec068aa44eb8d069700efa8d3ff40f51d8ba2a77243bd51e946bfbc32d1b72d6 not found: ID does not exist" containerID="ec068aa44eb8d069700efa8d3ff40f51d8ba2a77243bd51e946bfbc32d1b72d6" Oct 08 21:20:08 crc kubenswrapper[4669]: I1008 21:20:08.600413 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec068aa44eb8d069700efa8d3ff40f51d8ba2a77243bd51e946bfbc32d1b72d6"} err="failed to get container status \"ec068aa44eb8d069700efa8d3ff40f51d8ba2a77243bd51e946bfbc32d1b72d6\": rpc error: code = NotFound desc = could not find container \"ec068aa44eb8d069700efa8d3ff40f51d8ba2a77243bd51e946bfbc32d1b72d6\": container with ID starting with ec068aa44eb8d069700efa8d3ff40f51d8ba2a77243bd51e946bfbc32d1b72d6 not found: ID does not exist" Oct 08 21:20:09 crc kubenswrapper[4669]: I1008 21:20:09.352809 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="854fed5a-fb3b-402c-80ff-84eca9ea88fe" path="/var/lib/kubelet/pods/854fed5a-fb3b-402c-80ff-84eca9ea88fe/volumes" Oct 08 21:20:13 crc kubenswrapper[4669]: I1008 21:20:13.185670 4669 patch_prober.go:28] interesting pod/machine-config-daemon-hw2kf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 21:20:13 crc kubenswrapper[4669]: I1008 21:20:13.186403 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Oct 08 21:20:43 crc kubenswrapper[4669]: I1008 21:20:43.186047 4669 patch_prober.go:28] interesting pod/machine-config-daemon-hw2kf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 21:20:43 crc kubenswrapper[4669]: I1008 21:20:43.186715 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 21:20:43 crc kubenswrapper[4669]: I1008 21:20:43.186774 4669 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" Oct 08 21:20:43 crc kubenswrapper[4669]: I1008 21:20:43.187724 4669 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"60acc7dbb1e53bf0e8cd1ae99c5c7aa8ca0955f227930c93bc185c254c21dc89"} pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 21:20:43 crc kubenswrapper[4669]: I1008 21:20:43.187816 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" containerID="cri-o://60acc7dbb1e53bf0e8cd1ae99c5c7aa8ca0955f227930c93bc185c254c21dc89" gracePeriod=600 Oct 08 21:20:43 crc kubenswrapper[4669]: E1008 21:20:43.343304 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:20:43 crc kubenswrapper[4669]: I1008 21:20:43.902107 4669 generic.go:334] "Generic (PLEG): container finished" podID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerID="60acc7dbb1e53bf0e8cd1ae99c5c7aa8ca0955f227930c93bc185c254c21dc89" exitCode=0 Oct 08 21:20:43 crc kubenswrapper[4669]: I1008 21:20:43.902155 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" event={"ID":"39c9bcf2-9580-4534-8c7e-886bd4aff469","Type":"ContainerDied","Data":"60acc7dbb1e53bf0e8cd1ae99c5c7aa8ca0955f227930c93bc185c254c21dc89"} Oct 08 21:20:43 crc kubenswrapper[4669]: I1008 21:20:43.902191 4669 scope.go:117] "RemoveContainer" containerID="6d65055a3906618fe836fadf5c57a56b63f93978b81babdb5d1fe796cee4cdf9" Oct 08 21:20:43 crc kubenswrapper[4669]: I1008 21:20:43.902892 4669 scope.go:117] "RemoveContainer" containerID="60acc7dbb1e53bf0e8cd1ae99c5c7aa8ca0955f227930c93bc185c254c21dc89" Oct 08 21:20:43 crc kubenswrapper[4669]: E1008 21:20:43.903194 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:20:56 crc kubenswrapper[4669]: I1008 21:20:56.330976 4669 scope.go:117] "RemoveContainer" containerID="60acc7dbb1e53bf0e8cd1ae99c5c7aa8ca0955f227930c93bc185c254c21dc89" Oct 08 21:20:56 crc kubenswrapper[4669]: E1008 21:20:56.331801 4669 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:21:07 crc kubenswrapper[4669]: I1008 21:21:07.331055 4669 scope.go:117] "RemoveContainer" containerID="60acc7dbb1e53bf0e8cd1ae99c5c7aa8ca0955f227930c93bc185c254c21dc89" Oct 08 21:21:07 crc kubenswrapper[4669]: E1008 21:21:07.331884 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:21:22 crc kubenswrapper[4669]: I1008 21:21:22.330980 4669 scope.go:117] "RemoveContainer" containerID="60acc7dbb1e53bf0e8cd1ae99c5c7aa8ca0955f227930c93bc185c254c21dc89" Oct 08 21:21:22 crc kubenswrapper[4669]: E1008 21:21:22.332006 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:21:37 crc kubenswrapper[4669]: I1008 21:21:37.332413 4669 scope.go:117] "RemoveContainer" containerID="60acc7dbb1e53bf0e8cd1ae99c5c7aa8ca0955f227930c93bc185c254c21dc89" Oct 08 21:21:37 crc kubenswrapper[4669]: E1008 21:21:37.333941 4669 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:21:49 crc kubenswrapper[4669]: I1008 21:21:49.332167 4669 scope.go:117] "RemoveContainer" containerID="60acc7dbb1e53bf0e8cd1ae99c5c7aa8ca0955f227930c93bc185c254c21dc89" Oct 08 21:21:49 crc kubenswrapper[4669]: E1008 21:21:49.333521 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:22:03 crc kubenswrapper[4669]: I1008 21:22:03.331841 4669 scope.go:117] "RemoveContainer" containerID="60acc7dbb1e53bf0e8cd1ae99c5c7aa8ca0955f227930c93bc185c254c21dc89" Oct 08 21:22:03 crc kubenswrapper[4669]: E1008 21:22:03.332875 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:22:16 crc kubenswrapper[4669]: I1008 21:22:16.330488 4669 scope.go:117] "RemoveContainer" containerID="60acc7dbb1e53bf0e8cd1ae99c5c7aa8ca0955f227930c93bc185c254c21dc89" Oct 08 21:22:16 crc kubenswrapper[4669]: E1008 
21:22:16.331399 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:22:30 crc kubenswrapper[4669]: I1008 21:22:30.330905 4669 scope.go:117] "RemoveContainer" containerID="60acc7dbb1e53bf0e8cd1ae99c5c7aa8ca0955f227930c93bc185c254c21dc89" Oct 08 21:22:30 crc kubenswrapper[4669]: E1008 21:22:30.331781 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:22:31 crc kubenswrapper[4669]: I1008 21:22:31.893728 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fjqnt"] Oct 08 21:22:31 crc kubenswrapper[4669]: E1008 21:22:31.894804 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="854fed5a-fb3b-402c-80ff-84eca9ea88fe" containerName="registry-server" Oct 08 21:22:31 crc kubenswrapper[4669]: I1008 21:22:31.894837 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="854fed5a-fb3b-402c-80ff-84eca9ea88fe" containerName="registry-server" Oct 08 21:22:31 crc kubenswrapper[4669]: E1008 21:22:31.894874 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c7cea1a-f540-4410-af97-a29ad59cc64e" containerName="extract-utilities" Oct 08 21:22:31 crc kubenswrapper[4669]: I1008 21:22:31.894890 4669 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="7c7cea1a-f540-4410-af97-a29ad59cc64e" containerName="extract-utilities" Oct 08 21:22:31 crc kubenswrapper[4669]: E1008 21:22:31.894912 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="854fed5a-fb3b-402c-80ff-84eca9ea88fe" containerName="extract-utilities" Oct 08 21:22:31 crc kubenswrapper[4669]: I1008 21:22:31.894930 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="854fed5a-fb3b-402c-80ff-84eca9ea88fe" containerName="extract-utilities" Oct 08 21:22:31 crc kubenswrapper[4669]: E1008 21:22:31.894974 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c7cea1a-f540-4410-af97-a29ad59cc64e" containerName="registry-server" Oct 08 21:22:31 crc kubenswrapper[4669]: I1008 21:22:31.894994 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c7cea1a-f540-4410-af97-a29ad59cc64e" containerName="registry-server" Oct 08 21:22:31 crc kubenswrapper[4669]: E1008 21:22:31.895030 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c7cea1a-f540-4410-af97-a29ad59cc64e" containerName="extract-content" Oct 08 21:22:31 crc kubenswrapper[4669]: I1008 21:22:31.895044 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c7cea1a-f540-4410-af97-a29ad59cc64e" containerName="extract-content" Oct 08 21:22:31 crc kubenswrapper[4669]: E1008 21:22:31.895071 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="854fed5a-fb3b-402c-80ff-84eca9ea88fe" containerName="extract-content" Oct 08 21:22:31 crc kubenswrapper[4669]: I1008 21:22:31.895083 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="854fed5a-fb3b-402c-80ff-84eca9ea88fe" containerName="extract-content" Oct 08 21:22:31 crc kubenswrapper[4669]: I1008 21:22:31.895515 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c7cea1a-f540-4410-af97-a29ad59cc64e" containerName="registry-server" Oct 08 21:22:31 crc kubenswrapper[4669]: I1008 21:22:31.895592 4669 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="854fed5a-fb3b-402c-80ff-84eca9ea88fe" containerName="registry-server" Oct 08 21:22:31 crc kubenswrapper[4669]: I1008 21:22:31.898245 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fjqnt" Oct 08 21:22:31 crc kubenswrapper[4669]: I1008 21:22:31.916414 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80d5e58b-cc5d-43bf-9698-244e2056d3c6-utilities\") pod \"redhat-operators-fjqnt\" (UID: \"80d5e58b-cc5d-43bf-9698-244e2056d3c6\") " pod="openshift-marketplace/redhat-operators-fjqnt" Oct 08 21:22:31 crc kubenswrapper[4669]: I1008 21:22:31.916538 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80d5e58b-cc5d-43bf-9698-244e2056d3c6-catalog-content\") pod \"redhat-operators-fjqnt\" (UID: \"80d5e58b-cc5d-43bf-9698-244e2056d3c6\") " pod="openshift-marketplace/redhat-operators-fjqnt" Oct 08 21:22:31 crc kubenswrapper[4669]: I1008 21:22:31.916622 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xh4kb\" (UniqueName: \"kubernetes.io/projected/80d5e58b-cc5d-43bf-9698-244e2056d3c6-kube-api-access-xh4kb\") pod \"redhat-operators-fjqnt\" (UID: \"80d5e58b-cc5d-43bf-9698-244e2056d3c6\") " pod="openshift-marketplace/redhat-operators-fjqnt" Oct 08 21:22:31 crc kubenswrapper[4669]: I1008 21:22:31.932438 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fjqnt"] Oct 08 21:22:32 crc kubenswrapper[4669]: I1008 21:22:32.018542 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80d5e58b-cc5d-43bf-9698-244e2056d3c6-utilities\") pod \"redhat-operators-fjqnt\" (UID: 
\"80d5e58b-cc5d-43bf-9698-244e2056d3c6\") " pod="openshift-marketplace/redhat-operators-fjqnt" Oct 08 21:22:32 crc kubenswrapper[4669]: I1008 21:22:32.019081 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80d5e58b-cc5d-43bf-9698-244e2056d3c6-catalog-content\") pod \"redhat-operators-fjqnt\" (UID: \"80d5e58b-cc5d-43bf-9698-244e2056d3c6\") " pod="openshift-marketplace/redhat-operators-fjqnt" Oct 08 21:22:32 crc kubenswrapper[4669]: I1008 21:22:32.019310 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xh4kb\" (UniqueName: \"kubernetes.io/projected/80d5e58b-cc5d-43bf-9698-244e2056d3c6-kube-api-access-xh4kb\") pod \"redhat-operators-fjqnt\" (UID: \"80d5e58b-cc5d-43bf-9698-244e2056d3c6\") " pod="openshift-marketplace/redhat-operators-fjqnt" Oct 08 21:22:32 crc kubenswrapper[4669]: I1008 21:22:32.019548 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80d5e58b-cc5d-43bf-9698-244e2056d3c6-utilities\") pod \"redhat-operators-fjqnt\" (UID: \"80d5e58b-cc5d-43bf-9698-244e2056d3c6\") " pod="openshift-marketplace/redhat-operators-fjqnt" Oct 08 21:22:32 crc kubenswrapper[4669]: I1008 21:22:32.019560 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80d5e58b-cc5d-43bf-9698-244e2056d3c6-catalog-content\") pod \"redhat-operators-fjqnt\" (UID: \"80d5e58b-cc5d-43bf-9698-244e2056d3c6\") " pod="openshift-marketplace/redhat-operators-fjqnt" Oct 08 21:22:32 crc kubenswrapper[4669]: I1008 21:22:32.038669 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xh4kb\" (UniqueName: \"kubernetes.io/projected/80d5e58b-cc5d-43bf-9698-244e2056d3c6-kube-api-access-xh4kb\") pod \"redhat-operators-fjqnt\" (UID: \"80d5e58b-cc5d-43bf-9698-244e2056d3c6\") " 
pod="openshift-marketplace/redhat-operators-fjqnt" Oct 08 21:22:32 crc kubenswrapper[4669]: I1008 21:22:32.232199 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fjqnt" Oct 08 21:22:32 crc kubenswrapper[4669]: I1008 21:22:32.750519 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fjqnt"] Oct 08 21:22:33 crc kubenswrapper[4669]: I1008 21:22:33.096010 4669 generic.go:334] "Generic (PLEG): container finished" podID="80d5e58b-cc5d-43bf-9698-244e2056d3c6" containerID="8587a0a6c1a5cdebb5350304c4d087d9c31006bd72442c798f0355d0f56f1b80" exitCode=0 Oct 08 21:22:33 crc kubenswrapper[4669]: I1008 21:22:33.096045 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fjqnt" event={"ID":"80d5e58b-cc5d-43bf-9698-244e2056d3c6","Type":"ContainerDied","Data":"8587a0a6c1a5cdebb5350304c4d087d9c31006bd72442c798f0355d0f56f1b80"} Oct 08 21:22:33 crc kubenswrapper[4669]: I1008 21:22:33.096069 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fjqnt" event={"ID":"80d5e58b-cc5d-43bf-9698-244e2056d3c6","Type":"ContainerStarted","Data":"6fe2df6beca28132d0451590dc9016f5394e5372a3be2c2ecebe1202dd1de5ae"} Oct 08 21:22:33 crc kubenswrapper[4669]: I1008 21:22:33.099213 4669 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 21:22:35 crc kubenswrapper[4669]: I1008 21:22:35.115609 4669 generic.go:334] "Generic (PLEG): container finished" podID="80d5e58b-cc5d-43bf-9698-244e2056d3c6" containerID="b6cfda9c418bef7c93b12cc80a2a40924567f541311953f884c409f35ecc53a2" exitCode=0 Oct 08 21:22:35 crc kubenswrapper[4669]: I1008 21:22:35.115701 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fjqnt" 
event={"ID":"80d5e58b-cc5d-43bf-9698-244e2056d3c6","Type":"ContainerDied","Data":"b6cfda9c418bef7c93b12cc80a2a40924567f541311953f884c409f35ecc53a2"} Oct 08 21:22:36 crc kubenswrapper[4669]: I1008 21:22:36.126287 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fjqnt" event={"ID":"80d5e58b-cc5d-43bf-9698-244e2056d3c6","Type":"ContainerStarted","Data":"9a1dad999751f9182cfa5c8549bca73212290f76a77e7f6dc7f887948da2b66d"} Oct 08 21:22:36 crc kubenswrapper[4669]: I1008 21:22:36.151233 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fjqnt" podStartSLOduration=2.68585985 podStartE2EDuration="5.151213361s" podCreationTimestamp="2025-10-08 21:22:31 +0000 UTC" firstStartedPulling="2025-10-08 21:22:33.098925681 +0000 UTC m=+2272.791736354" lastFinishedPulling="2025-10-08 21:22:35.564279172 +0000 UTC m=+2275.257089865" observedRunningTime="2025-10-08 21:22:36.142988241 +0000 UTC m=+2275.835798914" watchObservedRunningTime="2025-10-08 21:22:36.151213361 +0000 UTC m=+2275.844024034" Oct 08 21:22:42 crc kubenswrapper[4669]: I1008 21:22:42.232694 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fjqnt" Oct 08 21:22:42 crc kubenswrapper[4669]: I1008 21:22:42.234641 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fjqnt" Oct 08 21:22:43 crc kubenswrapper[4669]: I1008 21:22:43.284445 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fjqnt" podUID="80d5e58b-cc5d-43bf-9698-244e2056d3c6" containerName="registry-server" probeResult="failure" output=< Oct 08 21:22:43 crc kubenswrapper[4669]: timeout: failed to connect service ":50051" within 1s Oct 08 21:22:43 crc kubenswrapper[4669]: > Oct 08 21:22:45 crc kubenswrapper[4669]: I1008 21:22:45.331569 4669 scope.go:117] "RemoveContainer" 
containerID="60acc7dbb1e53bf0e8cd1ae99c5c7aa8ca0955f227930c93bc185c254c21dc89" Oct 08 21:22:45 crc kubenswrapper[4669]: E1008 21:22:45.332096 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:22:52 crc kubenswrapper[4669]: I1008 21:22:52.305026 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fjqnt" Oct 08 21:22:52 crc kubenswrapper[4669]: I1008 21:22:52.395144 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fjqnt" Oct 08 21:22:52 crc kubenswrapper[4669]: I1008 21:22:52.555270 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fjqnt"] Oct 08 21:22:54 crc kubenswrapper[4669]: I1008 21:22:54.307914 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fjqnt" podUID="80d5e58b-cc5d-43bf-9698-244e2056d3c6" containerName="registry-server" containerID="cri-o://9a1dad999751f9182cfa5c8549bca73212290f76a77e7f6dc7f887948da2b66d" gracePeriod=2 Oct 08 21:22:54 crc kubenswrapper[4669]: I1008 21:22:54.782875 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fjqnt" Oct 08 21:22:54 crc kubenswrapper[4669]: I1008 21:22:54.893201 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xh4kb\" (UniqueName: \"kubernetes.io/projected/80d5e58b-cc5d-43bf-9698-244e2056d3c6-kube-api-access-xh4kb\") pod \"80d5e58b-cc5d-43bf-9698-244e2056d3c6\" (UID: \"80d5e58b-cc5d-43bf-9698-244e2056d3c6\") " Oct 08 21:22:54 crc kubenswrapper[4669]: I1008 21:22:54.893345 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80d5e58b-cc5d-43bf-9698-244e2056d3c6-catalog-content\") pod \"80d5e58b-cc5d-43bf-9698-244e2056d3c6\" (UID: \"80d5e58b-cc5d-43bf-9698-244e2056d3c6\") " Oct 08 21:22:54 crc kubenswrapper[4669]: I1008 21:22:54.893393 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80d5e58b-cc5d-43bf-9698-244e2056d3c6-utilities\") pod \"80d5e58b-cc5d-43bf-9698-244e2056d3c6\" (UID: \"80d5e58b-cc5d-43bf-9698-244e2056d3c6\") " Oct 08 21:22:54 crc kubenswrapper[4669]: I1008 21:22:54.894274 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80d5e58b-cc5d-43bf-9698-244e2056d3c6-utilities" (OuterVolumeSpecName: "utilities") pod "80d5e58b-cc5d-43bf-9698-244e2056d3c6" (UID: "80d5e58b-cc5d-43bf-9698-244e2056d3c6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:22:54 crc kubenswrapper[4669]: I1008 21:22:54.899345 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80d5e58b-cc5d-43bf-9698-244e2056d3c6-kube-api-access-xh4kb" (OuterVolumeSpecName: "kube-api-access-xh4kb") pod "80d5e58b-cc5d-43bf-9698-244e2056d3c6" (UID: "80d5e58b-cc5d-43bf-9698-244e2056d3c6"). InnerVolumeSpecName "kube-api-access-xh4kb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:22:54 crc kubenswrapper[4669]: I1008 21:22:54.972055 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80d5e58b-cc5d-43bf-9698-244e2056d3c6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "80d5e58b-cc5d-43bf-9698-244e2056d3c6" (UID: "80d5e58b-cc5d-43bf-9698-244e2056d3c6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:22:54 crc kubenswrapper[4669]: I1008 21:22:54.996103 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xh4kb\" (UniqueName: \"kubernetes.io/projected/80d5e58b-cc5d-43bf-9698-244e2056d3c6-kube-api-access-xh4kb\") on node \"crc\" DevicePath \"\"" Oct 08 21:22:54 crc kubenswrapper[4669]: I1008 21:22:54.996132 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80d5e58b-cc5d-43bf-9698-244e2056d3c6-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 21:22:54 crc kubenswrapper[4669]: I1008 21:22:54.996144 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80d5e58b-cc5d-43bf-9698-244e2056d3c6-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 21:22:55 crc kubenswrapper[4669]: I1008 21:22:55.325489 4669 generic.go:334] "Generic (PLEG): container finished" podID="80d5e58b-cc5d-43bf-9698-244e2056d3c6" containerID="9a1dad999751f9182cfa5c8549bca73212290f76a77e7f6dc7f887948da2b66d" exitCode=0 Oct 08 21:22:55 crc kubenswrapper[4669]: I1008 21:22:55.326036 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fjqnt" event={"ID":"80d5e58b-cc5d-43bf-9698-244e2056d3c6","Type":"ContainerDied","Data":"9a1dad999751f9182cfa5c8549bca73212290f76a77e7f6dc7f887948da2b66d"} Oct 08 21:22:55 crc kubenswrapper[4669]: I1008 21:22:55.326091 4669 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-fjqnt" event={"ID":"80d5e58b-cc5d-43bf-9698-244e2056d3c6","Type":"ContainerDied","Data":"6fe2df6beca28132d0451590dc9016f5394e5372a3be2c2ecebe1202dd1de5ae"} Oct 08 21:22:55 crc kubenswrapper[4669]: I1008 21:22:55.326128 4669 scope.go:117] "RemoveContainer" containerID="9a1dad999751f9182cfa5c8549bca73212290f76a77e7f6dc7f887948da2b66d" Oct 08 21:22:55 crc kubenswrapper[4669]: I1008 21:22:55.326381 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fjqnt" Oct 08 21:22:55 crc kubenswrapper[4669]: I1008 21:22:55.352365 4669 scope.go:117] "RemoveContainer" containerID="b6cfda9c418bef7c93b12cc80a2a40924567f541311953f884c409f35ecc53a2" Oct 08 21:22:55 crc kubenswrapper[4669]: I1008 21:22:55.388313 4669 scope.go:117] "RemoveContainer" containerID="8587a0a6c1a5cdebb5350304c4d087d9c31006bd72442c798f0355d0f56f1b80" Oct 08 21:22:55 crc kubenswrapper[4669]: I1008 21:22:55.392121 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fjqnt"] Oct 08 21:22:55 crc kubenswrapper[4669]: I1008 21:22:55.402677 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fjqnt"] Oct 08 21:22:55 crc kubenswrapper[4669]: I1008 21:22:55.438214 4669 scope.go:117] "RemoveContainer" containerID="9a1dad999751f9182cfa5c8549bca73212290f76a77e7f6dc7f887948da2b66d" Oct 08 21:22:55 crc kubenswrapper[4669]: E1008 21:22:55.438809 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a1dad999751f9182cfa5c8549bca73212290f76a77e7f6dc7f887948da2b66d\": container with ID starting with 9a1dad999751f9182cfa5c8549bca73212290f76a77e7f6dc7f887948da2b66d not found: ID does not exist" containerID="9a1dad999751f9182cfa5c8549bca73212290f76a77e7f6dc7f887948da2b66d" Oct 08 21:22:55 crc kubenswrapper[4669]: I1008 21:22:55.438846 4669 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a1dad999751f9182cfa5c8549bca73212290f76a77e7f6dc7f887948da2b66d"} err="failed to get container status \"9a1dad999751f9182cfa5c8549bca73212290f76a77e7f6dc7f887948da2b66d\": rpc error: code = NotFound desc = could not find container \"9a1dad999751f9182cfa5c8549bca73212290f76a77e7f6dc7f887948da2b66d\": container with ID starting with 9a1dad999751f9182cfa5c8549bca73212290f76a77e7f6dc7f887948da2b66d not found: ID does not exist" Oct 08 21:22:55 crc kubenswrapper[4669]: I1008 21:22:55.438867 4669 scope.go:117] "RemoveContainer" containerID="b6cfda9c418bef7c93b12cc80a2a40924567f541311953f884c409f35ecc53a2" Oct 08 21:22:55 crc kubenswrapper[4669]: E1008 21:22:55.439241 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6cfda9c418bef7c93b12cc80a2a40924567f541311953f884c409f35ecc53a2\": container with ID starting with b6cfda9c418bef7c93b12cc80a2a40924567f541311953f884c409f35ecc53a2 not found: ID does not exist" containerID="b6cfda9c418bef7c93b12cc80a2a40924567f541311953f884c409f35ecc53a2" Oct 08 21:22:55 crc kubenswrapper[4669]: I1008 21:22:55.439261 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6cfda9c418bef7c93b12cc80a2a40924567f541311953f884c409f35ecc53a2"} err="failed to get container status \"b6cfda9c418bef7c93b12cc80a2a40924567f541311953f884c409f35ecc53a2\": rpc error: code = NotFound desc = could not find container \"b6cfda9c418bef7c93b12cc80a2a40924567f541311953f884c409f35ecc53a2\": container with ID starting with b6cfda9c418bef7c93b12cc80a2a40924567f541311953f884c409f35ecc53a2 not found: ID does not exist" Oct 08 21:22:55 crc kubenswrapper[4669]: I1008 21:22:55.439275 4669 scope.go:117] "RemoveContainer" containerID="8587a0a6c1a5cdebb5350304c4d087d9c31006bd72442c798f0355d0f56f1b80" Oct 08 21:22:55 crc kubenswrapper[4669]: E1008 
21:22:55.439442 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8587a0a6c1a5cdebb5350304c4d087d9c31006bd72442c798f0355d0f56f1b80\": container with ID starting with 8587a0a6c1a5cdebb5350304c4d087d9c31006bd72442c798f0355d0f56f1b80 not found: ID does not exist" containerID="8587a0a6c1a5cdebb5350304c4d087d9c31006bd72442c798f0355d0f56f1b80" Oct 08 21:22:55 crc kubenswrapper[4669]: I1008 21:22:55.439458 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8587a0a6c1a5cdebb5350304c4d087d9c31006bd72442c798f0355d0f56f1b80"} err="failed to get container status \"8587a0a6c1a5cdebb5350304c4d087d9c31006bd72442c798f0355d0f56f1b80\": rpc error: code = NotFound desc = could not find container \"8587a0a6c1a5cdebb5350304c4d087d9c31006bd72442c798f0355d0f56f1b80\": container with ID starting with 8587a0a6c1a5cdebb5350304c4d087d9c31006bd72442c798f0355d0f56f1b80 not found: ID does not exist" Oct 08 21:22:56 crc kubenswrapper[4669]: I1008 21:22:56.334938 4669 scope.go:117] "RemoveContainer" containerID="60acc7dbb1e53bf0e8cd1ae99c5c7aa8ca0955f227930c93bc185c254c21dc89" Oct 08 21:22:56 crc kubenswrapper[4669]: E1008 21:22:56.335470 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:22:57 crc kubenswrapper[4669]: I1008 21:22:57.355151 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80d5e58b-cc5d-43bf-9698-244e2056d3c6" path="/var/lib/kubelet/pods/80d5e58b-cc5d-43bf-9698-244e2056d3c6/volumes" Oct 08 21:23:08 crc kubenswrapper[4669]: I1008 21:23:08.331344 
4669 scope.go:117] "RemoveContainer" containerID="60acc7dbb1e53bf0e8cd1ae99c5c7aa8ca0955f227930c93bc185c254c21dc89" Oct 08 21:23:08 crc kubenswrapper[4669]: E1008 21:23:08.332792 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:23:19 crc kubenswrapper[4669]: I1008 21:23:19.331195 4669 scope.go:117] "RemoveContainer" containerID="60acc7dbb1e53bf0e8cd1ae99c5c7aa8ca0955f227930c93bc185c254c21dc89" Oct 08 21:23:19 crc kubenswrapper[4669]: E1008 21:23:19.332596 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:23:31 crc kubenswrapper[4669]: I1008 21:23:31.336168 4669 scope.go:117] "RemoveContainer" containerID="60acc7dbb1e53bf0e8cd1ae99c5c7aa8ca0955f227930c93bc185c254c21dc89" Oct 08 21:23:31 crc kubenswrapper[4669]: E1008 21:23:31.336861 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:23:31 crc kubenswrapper[4669]: I1008 
21:23:31.708388 4669 generic.go:334] "Generic (PLEG): container finished" podID="f0494a3d-36c9-4d26-8f15-c1780af52f46" containerID="38299223163eb65a6296b89201be9d26cb866f5217d59568372f8bac436b52ab" exitCode=0 Oct 08 21:23:31 crc kubenswrapper[4669]: I1008 21:23:31.708493 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ph9v2" event={"ID":"f0494a3d-36c9-4d26-8f15-c1780af52f46","Type":"ContainerDied","Data":"38299223163eb65a6296b89201be9d26cb866f5217d59568372f8bac436b52ab"} Oct 08 21:23:33 crc kubenswrapper[4669]: I1008 21:23:33.146640 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ph9v2" Oct 08 21:23:33 crc kubenswrapper[4669]: I1008 21:23:33.233742 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f0494a3d-36c9-4d26-8f15-c1780af52f46-ssh-key\") pod \"f0494a3d-36c9-4d26-8f15-c1780af52f46\" (UID: \"f0494a3d-36c9-4d26-8f15-c1780af52f46\") " Oct 08 21:23:33 crc kubenswrapper[4669]: I1008 21:23:33.233860 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0494a3d-36c9-4d26-8f15-c1780af52f46-libvirt-combined-ca-bundle\") pod \"f0494a3d-36c9-4d26-8f15-c1780af52f46\" (UID: \"f0494a3d-36c9-4d26-8f15-c1780af52f46\") " Oct 08 21:23:33 crc kubenswrapper[4669]: I1008 21:23:33.233970 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f0494a3d-36c9-4d26-8f15-c1780af52f46-inventory\") pod \"f0494a3d-36c9-4d26-8f15-c1780af52f46\" (UID: \"f0494a3d-36c9-4d26-8f15-c1780af52f46\") " Oct 08 21:23:33 crc kubenswrapper[4669]: I1008 21:23:33.234039 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5fps\" (UniqueName: 
\"kubernetes.io/projected/f0494a3d-36c9-4d26-8f15-c1780af52f46-kube-api-access-s5fps\") pod \"f0494a3d-36c9-4d26-8f15-c1780af52f46\" (UID: \"f0494a3d-36c9-4d26-8f15-c1780af52f46\") " Oct 08 21:23:33 crc kubenswrapper[4669]: I1008 21:23:33.234089 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/f0494a3d-36c9-4d26-8f15-c1780af52f46-libvirt-secret-0\") pod \"f0494a3d-36c9-4d26-8f15-c1780af52f46\" (UID: \"f0494a3d-36c9-4d26-8f15-c1780af52f46\") " Oct 08 21:23:33 crc kubenswrapper[4669]: I1008 21:23:33.244877 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0494a3d-36c9-4d26-8f15-c1780af52f46-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "f0494a3d-36c9-4d26-8f15-c1780af52f46" (UID: "f0494a3d-36c9-4d26-8f15-c1780af52f46"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:23:33 crc kubenswrapper[4669]: I1008 21:23:33.245183 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0494a3d-36c9-4d26-8f15-c1780af52f46-kube-api-access-s5fps" (OuterVolumeSpecName: "kube-api-access-s5fps") pod "f0494a3d-36c9-4d26-8f15-c1780af52f46" (UID: "f0494a3d-36c9-4d26-8f15-c1780af52f46"). InnerVolumeSpecName "kube-api-access-s5fps". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:23:33 crc kubenswrapper[4669]: I1008 21:23:33.263986 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0494a3d-36c9-4d26-8f15-c1780af52f46-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f0494a3d-36c9-4d26-8f15-c1780af52f46" (UID: "f0494a3d-36c9-4d26-8f15-c1780af52f46"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:23:33 crc kubenswrapper[4669]: I1008 21:23:33.266023 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0494a3d-36c9-4d26-8f15-c1780af52f46-inventory" (OuterVolumeSpecName: "inventory") pod "f0494a3d-36c9-4d26-8f15-c1780af52f46" (UID: "f0494a3d-36c9-4d26-8f15-c1780af52f46"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:23:33 crc kubenswrapper[4669]: I1008 21:23:33.287986 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0494a3d-36c9-4d26-8f15-c1780af52f46-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "f0494a3d-36c9-4d26-8f15-c1780af52f46" (UID: "f0494a3d-36c9-4d26-8f15-c1780af52f46"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:23:33 crc kubenswrapper[4669]: I1008 21:23:33.336458 4669 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0494a3d-36c9-4d26-8f15-c1780af52f46-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:23:33 crc kubenswrapper[4669]: I1008 21:23:33.336492 4669 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f0494a3d-36c9-4d26-8f15-c1780af52f46-inventory\") on node \"crc\" DevicePath \"\"" Oct 08 21:23:33 crc kubenswrapper[4669]: I1008 21:23:33.336502 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5fps\" (UniqueName: \"kubernetes.io/projected/f0494a3d-36c9-4d26-8f15-c1780af52f46-kube-api-access-s5fps\") on node \"crc\" DevicePath \"\"" Oct 08 21:23:33 crc kubenswrapper[4669]: I1008 21:23:33.336512 4669 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/f0494a3d-36c9-4d26-8f15-c1780af52f46-libvirt-secret-0\") on node \"crc\" 
DevicePath \"\"" Oct 08 21:23:33 crc kubenswrapper[4669]: I1008 21:23:33.336521 4669 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f0494a3d-36c9-4d26-8f15-c1780af52f46-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 08 21:23:33 crc kubenswrapper[4669]: I1008 21:23:33.732224 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ph9v2" event={"ID":"f0494a3d-36c9-4d26-8f15-c1780af52f46","Type":"ContainerDied","Data":"07a7fa774d1fce6a8568bc7fe0be4dbfb4695364c4c932494eb7a71534b408fb"} Oct 08 21:23:33 crc kubenswrapper[4669]: I1008 21:23:33.732620 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07a7fa774d1fce6a8568bc7fe0be4dbfb4695364c4c932494eb7a71534b408fb" Oct 08 21:23:33 crc kubenswrapper[4669]: I1008 21:23:33.732693 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ph9v2" Oct 08 21:23:33 crc kubenswrapper[4669]: I1008 21:23:33.839892 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-tj7fq"] Oct 08 21:23:33 crc kubenswrapper[4669]: E1008 21:23:33.840693 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80d5e58b-cc5d-43bf-9698-244e2056d3c6" containerName="extract-utilities" Oct 08 21:23:33 crc kubenswrapper[4669]: I1008 21:23:33.840835 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="80d5e58b-cc5d-43bf-9698-244e2056d3c6" containerName="extract-utilities" Oct 08 21:23:33 crc kubenswrapper[4669]: E1008 21:23:33.840934 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80d5e58b-cc5d-43bf-9698-244e2056d3c6" containerName="extract-content" Oct 08 21:23:33 crc kubenswrapper[4669]: I1008 21:23:33.841053 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="80d5e58b-cc5d-43bf-9698-244e2056d3c6" 
containerName="extract-content" Oct 08 21:23:33 crc kubenswrapper[4669]: E1008 21:23:33.841169 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80d5e58b-cc5d-43bf-9698-244e2056d3c6" containerName="registry-server" Oct 08 21:23:33 crc kubenswrapper[4669]: I1008 21:23:33.841257 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="80d5e58b-cc5d-43bf-9698-244e2056d3c6" containerName="registry-server" Oct 08 21:23:33 crc kubenswrapper[4669]: E1008 21:23:33.841365 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0494a3d-36c9-4d26-8f15-c1780af52f46" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 08 21:23:33 crc kubenswrapper[4669]: I1008 21:23:33.841455 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0494a3d-36c9-4d26-8f15-c1780af52f46" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 08 21:23:33 crc kubenswrapper[4669]: I1008 21:23:33.841910 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="80d5e58b-cc5d-43bf-9698-244e2056d3c6" containerName="registry-server" Oct 08 21:23:33 crc kubenswrapper[4669]: I1008 21:23:33.842036 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0494a3d-36c9-4d26-8f15-c1780af52f46" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 08 21:23:33 crc kubenswrapper[4669]: I1008 21:23:33.843309 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tj7fq" Oct 08 21:23:33 crc kubenswrapper[4669]: I1008 21:23:33.850339 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 08 21:23:33 crc kubenswrapper[4669]: I1008 21:23:33.852741 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9d8p9" Oct 08 21:23:33 crc kubenswrapper[4669]: I1008 21:23:33.852779 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 08 21:23:33 crc kubenswrapper[4669]: I1008 21:23:33.852930 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Oct 08 21:23:33 crc kubenswrapper[4669]: I1008 21:23:33.853028 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 08 21:23:33 crc kubenswrapper[4669]: I1008 21:23:33.853177 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 08 21:23:33 crc kubenswrapper[4669]: I1008 21:23:33.853308 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 08 21:23:33 crc kubenswrapper[4669]: I1008 21:23:33.864127 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-tj7fq"] Oct 08 21:23:33 crc kubenswrapper[4669]: I1008 21:23:33.950284 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tj7fq\" (UID: \"9500e0b9-017f-4e4c-b72c-cbe0c98f7660\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tj7fq" Oct 08 21:23:33 crc kubenswrapper[4669]: 
I1008 21:23:33.950327 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tj7fq\" (UID: \"9500e0b9-017f-4e4c-b72c-cbe0c98f7660\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tj7fq" Oct 08 21:23:33 crc kubenswrapper[4669]: I1008 21:23:33.950363 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tj7fq\" (UID: \"9500e0b9-017f-4e4c-b72c-cbe0c98f7660\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tj7fq" Oct 08 21:23:33 crc kubenswrapper[4669]: I1008 21:23:33.950442 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tj7fq\" (UID: \"9500e0b9-017f-4e4c-b72c-cbe0c98f7660\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tj7fq" Oct 08 21:23:33 crc kubenswrapper[4669]: I1008 21:23:33.950465 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tj7fq\" (UID: \"9500e0b9-017f-4e4c-b72c-cbe0c98f7660\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tj7fq" Oct 08 21:23:33 crc kubenswrapper[4669]: I1008 21:23:33.950507 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6rf8\" (UniqueName: 
\"kubernetes.io/projected/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-kube-api-access-d6rf8\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tj7fq\" (UID: \"9500e0b9-017f-4e4c-b72c-cbe0c98f7660\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tj7fq" Oct 08 21:23:33 crc kubenswrapper[4669]: I1008 21:23:33.950552 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tj7fq\" (UID: \"9500e0b9-017f-4e4c-b72c-cbe0c98f7660\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tj7fq" Oct 08 21:23:33 crc kubenswrapper[4669]: I1008 21:23:33.950603 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tj7fq\" (UID: \"9500e0b9-017f-4e4c-b72c-cbe0c98f7660\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tj7fq" Oct 08 21:23:33 crc kubenswrapper[4669]: I1008 21:23:33.950648 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tj7fq\" (UID: \"9500e0b9-017f-4e4c-b72c-cbe0c98f7660\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tj7fq" Oct 08 21:23:34 crc kubenswrapper[4669]: I1008 21:23:34.052076 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tj7fq\" (UID: 
\"9500e0b9-017f-4e4c-b72c-cbe0c98f7660\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tj7fq" Oct 08 21:23:34 crc kubenswrapper[4669]: I1008 21:23:34.052223 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tj7fq\" (UID: \"9500e0b9-017f-4e4c-b72c-cbe0c98f7660\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tj7fq" Oct 08 21:23:34 crc kubenswrapper[4669]: I1008 21:23:34.052307 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tj7fq\" (UID: \"9500e0b9-017f-4e4c-b72c-cbe0c98f7660\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tj7fq" Oct 08 21:23:34 crc kubenswrapper[4669]: I1008 21:23:34.052387 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tj7fq\" (UID: \"9500e0b9-017f-4e4c-b72c-cbe0c98f7660\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tj7fq" Oct 08 21:23:34 crc kubenswrapper[4669]: I1008 21:23:34.052419 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tj7fq\" (UID: \"9500e0b9-017f-4e4c-b72c-cbe0c98f7660\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tj7fq" Oct 08 21:23:34 crc kubenswrapper[4669]: I1008 21:23:34.052471 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tj7fq\" (UID: \"9500e0b9-017f-4e4c-b72c-cbe0c98f7660\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tj7fq" Oct 08 21:23:34 crc kubenswrapper[4669]: I1008 21:23:34.052618 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tj7fq\" (UID: \"9500e0b9-017f-4e4c-b72c-cbe0c98f7660\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tj7fq" Oct 08 21:23:34 crc kubenswrapper[4669]: I1008 21:23:34.052697 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tj7fq\" (UID: \"9500e0b9-017f-4e4c-b72c-cbe0c98f7660\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tj7fq" Oct 08 21:23:34 crc kubenswrapper[4669]: I1008 21:23:34.052755 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6rf8\" (UniqueName: \"kubernetes.io/projected/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-kube-api-access-d6rf8\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tj7fq\" (UID: \"9500e0b9-017f-4e4c-b72c-cbe0c98f7660\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tj7fq" Oct 08 21:23:34 crc kubenswrapper[4669]: I1008 21:23:34.054759 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tj7fq\" (UID: \"9500e0b9-017f-4e4c-b72c-cbe0c98f7660\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tj7fq" Oct 08 21:23:34 crc kubenswrapper[4669]: I1008 21:23:34.058424 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tj7fq\" (UID: \"9500e0b9-017f-4e4c-b72c-cbe0c98f7660\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tj7fq" Oct 08 21:23:34 crc kubenswrapper[4669]: I1008 21:23:34.058835 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tj7fq\" (UID: \"9500e0b9-017f-4e4c-b72c-cbe0c98f7660\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tj7fq" Oct 08 21:23:34 crc kubenswrapper[4669]: I1008 21:23:34.059001 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tj7fq\" (UID: \"9500e0b9-017f-4e4c-b72c-cbe0c98f7660\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tj7fq" Oct 08 21:23:34 crc kubenswrapper[4669]: I1008 21:23:34.059323 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tj7fq\" (UID: \"9500e0b9-017f-4e4c-b72c-cbe0c98f7660\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tj7fq" Oct 08 21:23:34 crc kubenswrapper[4669]: I1008 21:23:34.060235 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: 
\"kubernetes.io/secret/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tj7fq\" (UID: \"9500e0b9-017f-4e4c-b72c-cbe0c98f7660\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tj7fq" Oct 08 21:23:34 crc kubenswrapper[4669]: I1008 21:23:34.060272 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tj7fq\" (UID: \"9500e0b9-017f-4e4c-b72c-cbe0c98f7660\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tj7fq" Oct 08 21:23:34 crc kubenswrapper[4669]: I1008 21:23:34.061439 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tj7fq\" (UID: \"9500e0b9-017f-4e4c-b72c-cbe0c98f7660\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tj7fq" Oct 08 21:23:34 crc kubenswrapper[4669]: I1008 21:23:34.077447 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6rf8\" (UniqueName: \"kubernetes.io/projected/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-kube-api-access-d6rf8\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tj7fq\" (UID: \"9500e0b9-017f-4e4c-b72c-cbe0c98f7660\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tj7fq" Oct 08 21:23:34 crc kubenswrapper[4669]: I1008 21:23:34.165371 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tj7fq" Oct 08 21:23:34 crc kubenswrapper[4669]: I1008 21:23:34.755111 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-tj7fq"] Oct 08 21:23:35 crc kubenswrapper[4669]: I1008 21:23:35.758842 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tj7fq" event={"ID":"9500e0b9-017f-4e4c-b72c-cbe0c98f7660","Type":"ContainerStarted","Data":"6c70d46435fce218644d75fe443f774089706730dade8c7458d1e641ee5ae3bf"} Oct 08 21:23:36 crc kubenswrapper[4669]: I1008 21:23:36.772966 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tj7fq" event={"ID":"9500e0b9-017f-4e4c-b72c-cbe0c98f7660","Type":"ContainerStarted","Data":"ae9644d78b2a8618e2ab21b397231a81e04035d4028c1ae569da7ea7e62039fb"} Oct 08 21:23:36 crc kubenswrapper[4669]: I1008 21:23:36.807132 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tj7fq" podStartSLOduration=2.906501961 podStartE2EDuration="3.80711502s" podCreationTimestamp="2025-10-08 21:23:33 +0000 UTC" firstStartedPulling="2025-10-08 21:23:34.768960113 +0000 UTC m=+2334.461770786" lastFinishedPulling="2025-10-08 21:23:35.669573132 +0000 UTC m=+2335.362383845" observedRunningTime="2025-10-08 21:23:36.796384056 +0000 UTC m=+2336.489194779" watchObservedRunningTime="2025-10-08 21:23:36.80711502 +0000 UTC m=+2336.499925693" Oct 08 21:23:45 crc kubenswrapper[4669]: I1008 21:23:45.332105 4669 scope.go:117] "RemoveContainer" containerID="60acc7dbb1e53bf0e8cd1ae99c5c7aa8ca0955f227930c93bc185c254c21dc89" Oct 08 21:23:45 crc kubenswrapper[4669]: E1008 21:23:45.332995 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:23:58 crc kubenswrapper[4669]: I1008 21:23:58.330787 4669 scope.go:117] "RemoveContainer" containerID="60acc7dbb1e53bf0e8cd1ae99c5c7aa8ca0955f227930c93bc185c254c21dc89" Oct 08 21:23:58 crc kubenswrapper[4669]: E1008 21:23:58.331599 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:24:13 crc kubenswrapper[4669]: I1008 21:24:13.330833 4669 scope.go:117] "RemoveContainer" containerID="60acc7dbb1e53bf0e8cd1ae99c5c7aa8ca0955f227930c93bc185c254c21dc89" Oct 08 21:24:13 crc kubenswrapper[4669]: E1008 21:24:13.331645 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:24:24 crc kubenswrapper[4669]: I1008 21:24:24.331794 4669 scope.go:117] "RemoveContainer" containerID="60acc7dbb1e53bf0e8cd1ae99c5c7aa8ca0955f227930c93bc185c254c21dc89" Oct 08 21:24:24 crc kubenswrapper[4669]: E1008 21:24:24.332724 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:24:39 crc kubenswrapper[4669]: I1008 21:24:39.330594 4669 scope.go:117] "RemoveContainer" containerID="60acc7dbb1e53bf0e8cd1ae99c5c7aa8ca0955f227930c93bc185c254c21dc89" Oct 08 21:24:39 crc kubenswrapper[4669]: E1008 21:24:39.331573 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:24:53 crc kubenswrapper[4669]: I1008 21:24:53.331637 4669 scope.go:117] "RemoveContainer" containerID="60acc7dbb1e53bf0e8cd1ae99c5c7aa8ca0955f227930c93bc185c254c21dc89" Oct 08 21:24:53 crc kubenswrapper[4669]: E1008 21:24:53.332413 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:25:08 crc kubenswrapper[4669]: I1008 21:25:08.331310 4669 scope.go:117] "RemoveContainer" containerID="60acc7dbb1e53bf0e8cd1ae99c5c7aa8ca0955f227930c93bc185c254c21dc89" Oct 08 21:25:08 crc kubenswrapper[4669]: E1008 21:25:08.332030 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:25:20 crc kubenswrapper[4669]: I1008 21:25:20.331432 4669 scope.go:117] "RemoveContainer" containerID="60acc7dbb1e53bf0e8cd1ae99c5c7aa8ca0955f227930c93bc185c254c21dc89" Oct 08 21:25:20 crc kubenswrapper[4669]: E1008 21:25:20.332122 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:25:34 crc kubenswrapper[4669]: I1008 21:25:34.331036 4669 scope.go:117] "RemoveContainer" containerID="60acc7dbb1e53bf0e8cd1ae99c5c7aa8ca0955f227930c93bc185c254c21dc89" Oct 08 21:25:34 crc kubenswrapper[4669]: E1008 21:25:34.331722 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:25:48 crc kubenswrapper[4669]: I1008 21:25:48.331776 4669 scope.go:117] "RemoveContainer" containerID="60acc7dbb1e53bf0e8cd1ae99c5c7aa8ca0955f227930c93bc185c254c21dc89" Oct 08 21:25:49 crc kubenswrapper[4669]: I1008 21:25:49.067584 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" 
event={"ID":"39c9bcf2-9580-4534-8c7e-886bd4aff469","Type":"ContainerStarted","Data":"60e75bf09f786d92a4220856b0c1ccdb7a5f5c2d86e9092f16ff831633898359"} Oct 08 21:27:01 crc kubenswrapper[4669]: I1008 21:27:01.799994 4669 generic.go:334] "Generic (PLEG): container finished" podID="9500e0b9-017f-4e4c-b72c-cbe0c98f7660" containerID="ae9644d78b2a8618e2ab21b397231a81e04035d4028c1ae569da7ea7e62039fb" exitCode=0 Oct 08 21:27:01 crc kubenswrapper[4669]: I1008 21:27:01.800083 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tj7fq" event={"ID":"9500e0b9-017f-4e4c-b72c-cbe0c98f7660","Type":"ContainerDied","Data":"ae9644d78b2a8618e2ab21b397231a81e04035d4028c1ae569da7ea7e62039fb"} Oct 08 21:27:03 crc kubenswrapper[4669]: I1008 21:27:03.277518 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tj7fq" Oct 08 21:27:03 crc kubenswrapper[4669]: I1008 21:27:03.467414 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-nova-cell1-compute-config-0\") pod \"9500e0b9-017f-4e4c-b72c-cbe0c98f7660\" (UID: \"9500e0b9-017f-4e4c-b72c-cbe0c98f7660\") " Oct 08 21:27:03 crc kubenswrapper[4669]: I1008 21:27:03.467469 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-inventory\") pod \"9500e0b9-017f-4e4c-b72c-cbe0c98f7660\" (UID: \"9500e0b9-017f-4e4c-b72c-cbe0c98f7660\") " Oct 08 21:27:03 crc kubenswrapper[4669]: I1008 21:27:03.467489 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-nova-extra-config-0\") pod \"9500e0b9-017f-4e4c-b72c-cbe0c98f7660\" (UID: 
\"9500e0b9-017f-4e4c-b72c-cbe0c98f7660\") " Oct 08 21:27:03 crc kubenswrapper[4669]: I1008 21:27:03.467540 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-nova-migration-ssh-key-0\") pod \"9500e0b9-017f-4e4c-b72c-cbe0c98f7660\" (UID: \"9500e0b9-017f-4e4c-b72c-cbe0c98f7660\") " Oct 08 21:27:03 crc kubenswrapper[4669]: I1008 21:27:03.467776 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-ssh-key\") pod \"9500e0b9-017f-4e4c-b72c-cbe0c98f7660\" (UID: \"9500e0b9-017f-4e4c-b72c-cbe0c98f7660\") " Oct 08 21:27:03 crc kubenswrapper[4669]: I1008 21:27:03.467856 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6rf8\" (UniqueName: \"kubernetes.io/projected/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-kube-api-access-d6rf8\") pod \"9500e0b9-017f-4e4c-b72c-cbe0c98f7660\" (UID: \"9500e0b9-017f-4e4c-b72c-cbe0c98f7660\") " Oct 08 21:27:03 crc kubenswrapper[4669]: I1008 21:27:03.467886 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-nova-combined-ca-bundle\") pod \"9500e0b9-017f-4e4c-b72c-cbe0c98f7660\" (UID: \"9500e0b9-017f-4e4c-b72c-cbe0c98f7660\") " Oct 08 21:27:03 crc kubenswrapper[4669]: I1008 21:27:03.467919 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-nova-migration-ssh-key-1\") pod \"9500e0b9-017f-4e4c-b72c-cbe0c98f7660\" (UID: \"9500e0b9-017f-4e4c-b72c-cbe0c98f7660\") " Oct 08 21:27:03 crc kubenswrapper[4669]: I1008 21:27:03.468004 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-nova-cell1-compute-config-1\") pod \"9500e0b9-017f-4e4c-b72c-cbe0c98f7660\" (UID: \"9500e0b9-017f-4e4c-b72c-cbe0c98f7660\") " Oct 08 21:27:03 crc kubenswrapper[4669]: I1008 21:27:03.476118 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "9500e0b9-017f-4e4c-b72c-cbe0c98f7660" (UID: "9500e0b9-017f-4e4c-b72c-cbe0c98f7660"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:27:03 crc kubenswrapper[4669]: I1008 21:27:03.476202 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-kube-api-access-d6rf8" (OuterVolumeSpecName: "kube-api-access-d6rf8") pod "9500e0b9-017f-4e4c-b72c-cbe0c98f7660" (UID: "9500e0b9-017f-4e4c-b72c-cbe0c98f7660"). InnerVolumeSpecName "kube-api-access-d6rf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:27:03 crc kubenswrapper[4669]: I1008 21:27:03.510839 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-inventory" (OuterVolumeSpecName: "inventory") pod "9500e0b9-017f-4e4c-b72c-cbe0c98f7660" (UID: "9500e0b9-017f-4e4c-b72c-cbe0c98f7660"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:27:03 crc kubenswrapper[4669]: I1008 21:27:03.518986 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "9500e0b9-017f-4e4c-b72c-cbe0c98f7660" (UID: "9500e0b9-017f-4e4c-b72c-cbe0c98f7660"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:27:03 crc kubenswrapper[4669]: I1008 21:27:03.521397 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "9500e0b9-017f-4e4c-b72c-cbe0c98f7660" (UID: "9500e0b9-017f-4e4c-b72c-cbe0c98f7660"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:27:03 crc kubenswrapper[4669]: I1008 21:27:03.525156 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "9500e0b9-017f-4e4c-b72c-cbe0c98f7660" (UID: "9500e0b9-017f-4e4c-b72c-cbe0c98f7660"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:27:03 crc kubenswrapper[4669]: I1008 21:27:03.527811 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "9500e0b9-017f-4e4c-b72c-cbe0c98f7660" (UID: "9500e0b9-017f-4e4c-b72c-cbe0c98f7660"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:27:03 crc kubenswrapper[4669]: I1008 21:27:03.529185 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "9500e0b9-017f-4e4c-b72c-cbe0c98f7660" (UID: "9500e0b9-017f-4e4c-b72c-cbe0c98f7660"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:27:03 crc kubenswrapper[4669]: I1008 21:27:03.536444 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9500e0b9-017f-4e4c-b72c-cbe0c98f7660" (UID: "9500e0b9-017f-4e4c-b72c-cbe0c98f7660"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:27:03 crc kubenswrapper[4669]: I1008 21:27:03.571591 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6rf8\" (UniqueName: \"kubernetes.io/projected/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-kube-api-access-d6rf8\") on node \"crc\" DevicePath \"\"" Oct 08 21:27:03 crc kubenswrapper[4669]: I1008 21:27:03.571639 4669 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:27:03 crc kubenswrapper[4669]: I1008 21:27:03.571652 4669 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 08 21:27:03 crc kubenswrapper[4669]: I1008 21:27:03.571665 4669 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 08 21:27:03 crc kubenswrapper[4669]: I1008 21:27:03.571678 4669 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 08 21:27:03 crc kubenswrapper[4669]: I1008 21:27:03.571690 4669 reconciler_common.go:293] 
"Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-inventory\") on node \"crc\" DevicePath \"\"" Oct 08 21:27:03 crc kubenswrapper[4669]: I1008 21:27:03.571701 4669 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Oct 08 21:27:03 crc kubenswrapper[4669]: I1008 21:27:03.571712 4669 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 08 21:27:03 crc kubenswrapper[4669]: I1008 21:27:03.571722 4669 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9500e0b9-017f-4e4c-b72c-cbe0c98f7660-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 08 21:27:03 crc kubenswrapper[4669]: I1008 21:27:03.817419 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tj7fq" event={"ID":"9500e0b9-017f-4e4c-b72c-cbe0c98f7660","Type":"ContainerDied","Data":"6c70d46435fce218644d75fe443f774089706730dade8c7458d1e641ee5ae3bf"} Oct 08 21:27:03 crc kubenswrapper[4669]: I1008 21:27:03.817457 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c70d46435fce218644d75fe443f774089706730dade8c7458d1e641ee5ae3bf" Oct 08 21:27:03 crc kubenswrapper[4669]: I1008 21:27:03.817474 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tj7fq" Oct 08 21:27:03 crc kubenswrapper[4669]: I1008 21:27:03.917161 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb"] Oct 08 21:27:03 crc kubenswrapper[4669]: E1008 21:27:03.917887 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9500e0b9-017f-4e4c-b72c-cbe0c98f7660" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 08 21:27:03 crc kubenswrapper[4669]: I1008 21:27:03.917907 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="9500e0b9-017f-4e4c-b72c-cbe0c98f7660" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 08 21:27:03 crc kubenswrapper[4669]: I1008 21:27:03.918097 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="9500e0b9-017f-4e4c-b72c-cbe0c98f7660" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 08 21:27:03 crc kubenswrapper[4669]: I1008 21:27:03.918749 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb" Oct 08 21:27:03 crc kubenswrapper[4669]: I1008 21:27:03.921237 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 08 21:27:03 crc kubenswrapper[4669]: I1008 21:27:03.921786 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 08 21:27:03 crc kubenswrapper[4669]: I1008 21:27:03.922004 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9d8p9" Oct 08 21:27:03 crc kubenswrapper[4669]: I1008 21:27:03.922259 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Oct 08 21:27:03 crc kubenswrapper[4669]: I1008 21:27:03.923707 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 08 21:27:03 crc kubenswrapper[4669]: I1008 21:27:03.935248 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb"] Oct 08 21:27:03 crc kubenswrapper[4669]: I1008 21:27:03.980937 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb78e48d-ddf1-494e-883f-d9987d2f0f0a-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb\" (UID: \"bb78e48d-ddf1-494e-883f-d9987d2f0f0a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb" Oct 08 21:27:03 crc kubenswrapper[4669]: I1008 21:27:03.981076 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/bb78e48d-ddf1-494e-883f-d9987d2f0f0a-ceilometer-compute-config-data-1\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb\" (UID: \"bb78e48d-ddf1-494e-883f-d9987d2f0f0a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb" Oct 08 21:27:03 crc kubenswrapper[4669]: I1008 21:27:03.981178 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk2st\" (UniqueName: \"kubernetes.io/projected/bb78e48d-ddf1-494e-883f-d9987d2f0f0a-kube-api-access-rk2st\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb\" (UID: \"bb78e48d-ddf1-494e-883f-d9987d2f0f0a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb" Oct 08 21:27:03 crc kubenswrapper[4669]: I1008 21:27:03.981240 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb78e48d-ddf1-494e-883f-d9987d2f0f0a-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb\" (UID: \"bb78e48d-ddf1-494e-883f-d9987d2f0f0a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb" Oct 08 21:27:03 crc kubenswrapper[4669]: I1008 21:27:03.981340 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/bb78e48d-ddf1-494e-883f-d9987d2f0f0a-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb\" (UID: \"bb78e48d-ddf1-494e-883f-d9987d2f0f0a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb" Oct 08 21:27:03 crc kubenswrapper[4669]: I1008 21:27:03.981560 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/bb78e48d-ddf1-494e-883f-d9987d2f0f0a-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb\" (UID: \"bb78e48d-ddf1-494e-883f-d9987d2f0f0a\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb" Oct 08 21:27:03 crc kubenswrapper[4669]: I1008 21:27:03.981616 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bb78e48d-ddf1-494e-883f-d9987d2f0f0a-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb\" (UID: \"bb78e48d-ddf1-494e-883f-d9987d2f0f0a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb" Oct 08 21:27:04 crc kubenswrapper[4669]: I1008 21:27:04.083759 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/bb78e48d-ddf1-494e-883f-d9987d2f0f0a-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb\" (UID: \"bb78e48d-ddf1-494e-883f-d9987d2f0f0a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb" Oct 08 21:27:04 crc kubenswrapper[4669]: I1008 21:27:04.083826 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bb78e48d-ddf1-494e-883f-d9987d2f0f0a-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb\" (UID: \"bb78e48d-ddf1-494e-883f-d9987d2f0f0a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb" Oct 08 21:27:04 crc kubenswrapper[4669]: I1008 21:27:04.083931 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb78e48d-ddf1-494e-883f-d9987d2f0f0a-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb\" (UID: \"bb78e48d-ddf1-494e-883f-d9987d2f0f0a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb" Oct 08 21:27:04 crc kubenswrapper[4669]: I1008 21:27:04.083981 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/bb78e48d-ddf1-494e-883f-d9987d2f0f0a-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb\" (UID: \"bb78e48d-ddf1-494e-883f-d9987d2f0f0a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb" Oct 08 21:27:04 crc kubenswrapper[4669]: I1008 21:27:04.084034 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk2st\" (UniqueName: \"kubernetes.io/projected/bb78e48d-ddf1-494e-883f-d9987d2f0f0a-kube-api-access-rk2st\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb\" (UID: \"bb78e48d-ddf1-494e-883f-d9987d2f0f0a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb" Oct 08 21:27:04 crc kubenswrapper[4669]: I1008 21:27:04.084058 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb78e48d-ddf1-494e-883f-d9987d2f0f0a-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb\" (UID: \"bb78e48d-ddf1-494e-883f-d9987d2f0f0a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb" Oct 08 21:27:04 crc kubenswrapper[4669]: I1008 21:27:04.084090 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/bb78e48d-ddf1-494e-883f-d9987d2f0f0a-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb\" (UID: \"bb78e48d-ddf1-494e-883f-d9987d2f0f0a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb" Oct 08 21:27:04 crc kubenswrapper[4669]: I1008 21:27:04.088459 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/bb78e48d-ddf1-494e-883f-d9987d2f0f0a-ceilometer-compute-config-data-1\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb\" (UID: \"bb78e48d-ddf1-494e-883f-d9987d2f0f0a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb" Oct 08 21:27:04 crc kubenswrapper[4669]: I1008 21:27:04.088479 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb78e48d-ddf1-494e-883f-d9987d2f0f0a-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb\" (UID: \"bb78e48d-ddf1-494e-883f-d9987d2f0f0a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb" Oct 08 21:27:04 crc kubenswrapper[4669]: I1008 21:27:04.088666 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb78e48d-ddf1-494e-883f-d9987d2f0f0a-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb\" (UID: \"bb78e48d-ddf1-494e-883f-d9987d2f0f0a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb" Oct 08 21:27:04 crc kubenswrapper[4669]: I1008 21:27:04.088919 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/bb78e48d-ddf1-494e-883f-d9987d2f0f0a-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb\" (UID: \"bb78e48d-ddf1-494e-883f-d9987d2f0f0a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb" Oct 08 21:27:04 crc kubenswrapper[4669]: I1008 21:27:04.089073 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/bb78e48d-ddf1-494e-883f-d9987d2f0f0a-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb\" (UID: \"bb78e48d-ddf1-494e-883f-d9987d2f0f0a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb" Oct 08 21:27:04 crc 
kubenswrapper[4669]: I1008 21:27:04.101312 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bb78e48d-ddf1-494e-883f-d9987d2f0f0a-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb\" (UID: \"bb78e48d-ddf1-494e-883f-d9987d2f0f0a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb" Oct 08 21:27:04 crc kubenswrapper[4669]: I1008 21:27:04.103342 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk2st\" (UniqueName: \"kubernetes.io/projected/bb78e48d-ddf1-494e-883f-d9987d2f0f0a-kube-api-access-rk2st\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb\" (UID: \"bb78e48d-ddf1-494e-883f-d9987d2f0f0a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb" Oct 08 21:27:04 crc kubenswrapper[4669]: I1008 21:27:04.236552 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb" Oct 08 21:27:04 crc kubenswrapper[4669]: I1008 21:27:04.737619 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb"] Oct 08 21:27:04 crc kubenswrapper[4669]: W1008 21:27:04.742362 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb78e48d_ddf1_494e_883f_d9987d2f0f0a.slice/crio-696143ffbc35ce6be62fe2e3107c96beee3a7a8205d354a1258d4fe7bb139bfb WatchSource:0}: Error finding container 696143ffbc35ce6be62fe2e3107c96beee3a7a8205d354a1258d4fe7bb139bfb: Status 404 returned error can't find the container with id 696143ffbc35ce6be62fe2e3107c96beee3a7a8205d354a1258d4fe7bb139bfb Oct 08 21:27:04 crc kubenswrapper[4669]: I1008 21:27:04.825295 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb" 
event={"ID":"bb78e48d-ddf1-494e-883f-d9987d2f0f0a","Type":"ContainerStarted","Data":"696143ffbc35ce6be62fe2e3107c96beee3a7a8205d354a1258d4fe7bb139bfb"} Oct 08 21:27:05 crc kubenswrapper[4669]: I1008 21:27:05.834961 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb" event={"ID":"bb78e48d-ddf1-494e-883f-d9987d2f0f0a","Type":"ContainerStarted","Data":"e65fbccd717fbb53480a1dcc055c831e8bea5a4bba01b97a37d6d801ef9b6d65"} Oct 08 21:27:05 crc kubenswrapper[4669]: I1008 21:27:05.862911 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb" podStartSLOduration=2.18263418 podStartE2EDuration="2.862893158s" podCreationTimestamp="2025-10-08 21:27:03 +0000 UTC" firstStartedPulling="2025-10-08 21:27:04.74495853 +0000 UTC m=+2544.437769203" lastFinishedPulling="2025-10-08 21:27:05.425217468 +0000 UTC m=+2545.118028181" observedRunningTime="2025-10-08 21:27:05.855299551 +0000 UTC m=+2545.548110234" watchObservedRunningTime="2025-10-08 21:27:05.862893158 +0000 UTC m=+2545.555703841" Oct 08 21:28:13 crc kubenswrapper[4669]: I1008 21:28:13.185592 4669 patch_prober.go:28] interesting pod/machine-config-daemon-hw2kf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 21:28:13 crc kubenswrapper[4669]: I1008 21:28:13.186355 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 21:28:43 crc kubenswrapper[4669]: I1008 21:28:43.185681 4669 patch_prober.go:28] interesting 
pod/machine-config-daemon-hw2kf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 21:28:43 crc kubenswrapper[4669]: I1008 21:28:43.186282 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 21:29:13 crc kubenswrapper[4669]: I1008 21:29:13.185916 4669 patch_prober.go:28] interesting pod/machine-config-daemon-hw2kf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 21:29:13 crc kubenswrapper[4669]: I1008 21:29:13.186592 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 21:29:13 crc kubenswrapper[4669]: I1008 21:29:13.186655 4669 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" Oct 08 21:29:13 crc kubenswrapper[4669]: I1008 21:29:13.187412 4669 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"60e75bf09f786d92a4220856b0c1ccdb7a5f5c2d86e9092f16ff831633898359"} pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" containerMessage="Container machine-config-daemon failed 
liveness probe, will be restarted" Oct 08 21:29:13 crc kubenswrapper[4669]: I1008 21:29:13.187496 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" containerID="cri-o://60e75bf09f786d92a4220856b0c1ccdb7a5f5c2d86e9092f16ff831633898359" gracePeriod=600 Oct 08 21:29:14 crc kubenswrapper[4669]: I1008 21:29:14.108188 4669 generic.go:334] "Generic (PLEG): container finished" podID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerID="60e75bf09f786d92a4220856b0c1ccdb7a5f5c2d86e9092f16ff831633898359" exitCode=0 Oct 08 21:29:14 crc kubenswrapper[4669]: I1008 21:29:14.108272 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" event={"ID":"39c9bcf2-9580-4534-8c7e-886bd4aff469","Type":"ContainerDied","Data":"60e75bf09f786d92a4220856b0c1ccdb7a5f5c2d86e9092f16ff831633898359"} Oct 08 21:29:14 crc kubenswrapper[4669]: I1008 21:29:14.108855 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" event={"ID":"39c9bcf2-9580-4534-8c7e-886bd4aff469","Type":"ContainerStarted","Data":"945c8aba532a5316a189d943437ad63c04b6117c58cb6f632af5eb4c02b10393"} Oct 08 21:29:14 crc kubenswrapper[4669]: I1008 21:29:14.108885 4669 scope.go:117] "RemoveContainer" containerID="60acc7dbb1e53bf0e8cd1ae99c5c7aa8ca0955f227930c93bc185c254c21dc89" Oct 08 21:29:39 crc kubenswrapper[4669]: I1008 21:29:39.395161 4669 generic.go:334] "Generic (PLEG): container finished" podID="bb78e48d-ddf1-494e-883f-d9987d2f0f0a" containerID="e65fbccd717fbb53480a1dcc055c831e8bea5a4bba01b97a37d6d801ef9b6d65" exitCode=0 Oct 08 21:29:39 crc kubenswrapper[4669]: I1008 21:29:39.395237 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb" 
event={"ID":"bb78e48d-ddf1-494e-883f-d9987d2f0f0a","Type":"ContainerDied","Data":"e65fbccd717fbb53480a1dcc055c831e8bea5a4bba01b97a37d6d801ef9b6d65"} Oct 08 21:29:40 crc kubenswrapper[4669]: I1008 21:29:40.890789 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb" Oct 08 21:29:41 crc kubenswrapper[4669]: I1008 21:29:41.044366 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/bb78e48d-ddf1-494e-883f-d9987d2f0f0a-ceilometer-compute-config-data-1\") pod \"bb78e48d-ddf1-494e-883f-d9987d2f0f0a\" (UID: \"bb78e48d-ddf1-494e-883f-d9987d2f0f0a\") " Oct 08 21:29:41 crc kubenswrapper[4669]: I1008 21:29:41.044444 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/bb78e48d-ddf1-494e-883f-d9987d2f0f0a-ceilometer-compute-config-data-2\") pod \"bb78e48d-ddf1-494e-883f-d9987d2f0f0a\" (UID: \"bb78e48d-ddf1-494e-883f-d9987d2f0f0a\") " Oct 08 21:29:41 crc kubenswrapper[4669]: I1008 21:29:41.044562 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb78e48d-ddf1-494e-883f-d9987d2f0f0a-inventory\") pod \"bb78e48d-ddf1-494e-883f-d9987d2f0f0a\" (UID: \"bb78e48d-ddf1-494e-883f-d9987d2f0f0a\") " Oct 08 21:29:41 crc kubenswrapper[4669]: I1008 21:29:41.044785 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/bb78e48d-ddf1-494e-883f-d9987d2f0f0a-ceilometer-compute-config-data-0\") pod \"bb78e48d-ddf1-494e-883f-d9987d2f0f0a\" (UID: \"bb78e48d-ddf1-494e-883f-d9987d2f0f0a\") " Oct 08 21:29:41 crc kubenswrapper[4669]: I1008 21:29:41.044971 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb78e48d-ddf1-494e-883f-d9987d2f0f0a-telemetry-combined-ca-bundle\") pod \"bb78e48d-ddf1-494e-883f-d9987d2f0f0a\" (UID: \"bb78e48d-ddf1-494e-883f-d9987d2f0f0a\") " Oct 08 21:29:41 crc kubenswrapper[4669]: I1008 21:29:41.045022 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rk2st\" (UniqueName: \"kubernetes.io/projected/bb78e48d-ddf1-494e-883f-d9987d2f0f0a-kube-api-access-rk2st\") pod \"bb78e48d-ddf1-494e-883f-d9987d2f0f0a\" (UID: \"bb78e48d-ddf1-494e-883f-d9987d2f0f0a\") " Oct 08 21:29:41 crc kubenswrapper[4669]: I1008 21:29:41.045078 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bb78e48d-ddf1-494e-883f-d9987d2f0f0a-ssh-key\") pod \"bb78e48d-ddf1-494e-883f-d9987d2f0f0a\" (UID: \"bb78e48d-ddf1-494e-883f-d9987d2f0f0a\") " Oct 08 21:29:41 crc kubenswrapper[4669]: I1008 21:29:41.053484 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb78e48d-ddf1-494e-883f-d9987d2f0f0a-kube-api-access-rk2st" (OuterVolumeSpecName: "kube-api-access-rk2st") pod "bb78e48d-ddf1-494e-883f-d9987d2f0f0a" (UID: "bb78e48d-ddf1-494e-883f-d9987d2f0f0a"). InnerVolumeSpecName "kube-api-access-rk2st". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:29:41 crc kubenswrapper[4669]: I1008 21:29:41.054452 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb78e48d-ddf1-494e-883f-d9987d2f0f0a-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "bb78e48d-ddf1-494e-883f-d9987d2f0f0a" (UID: "bb78e48d-ddf1-494e-883f-d9987d2f0f0a"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:29:41 crc kubenswrapper[4669]: I1008 21:29:41.082784 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb78e48d-ddf1-494e-883f-d9987d2f0f0a-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "bb78e48d-ddf1-494e-883f-d9987d2f0f0a" (UID: "bb78e48d-ddf1-494e-883f-d9987d2f0f0a"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:29:41 crc kubenswrapper[4669]: I1008 21:29:41.098461 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb78e48d-ddf1-494e-883f-d9987d2f0f0a-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "bb78e48d-ddf1-494e-883f-d9987d2f0f0a" (UID: "bb78e48d-ddf1-494e-883f-d9987d2f0f0a"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:29:41 crc kubenswrapper[4669]: I1008 21:29:41.106378 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb78e48d-ddf1-494e-883f-d9987d2f0f0a-inventory" (OuterVolumeSpecName: "inventory") pod "bb78e48d-ddf1-494e-883f-d9987d2f0f0a" (UID: "bb78e48d-ddf1-494e-883f-d9987d2f0f0a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:29:41 crc kubenswrapper[4669]: I1008 21:29:41.111926 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb78e48d-ddf1-494e-883f-d9987d2f0f0a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "bb78e48d-ddf1-494e-883f-d9987d2f0f0a" (UID: "bb78e48d-ddf1-494e-883f-d9987d2f0f0a"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:29:41 crc kubenswrapper[4669]: I1008 21:29:41.119508 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb78e48d-ddf1-494e-883f-d9987d2f0f0a-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "bb78e48d-ddf1-494e-883f-d9987d2f0f0a" (UID: "bb78e48d-ddf1-494e-883f-d9987d2f0f0a"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:29:41 crc kubenswrapper[4669]: I1008 21:29:41.148888 4669 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb78e48d-ddf1-494e-883f-d9987d2f0f0a-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:29:41 crc kubenswrapper[4669]: I1008 21:29:41.148943 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rk2st\" (UniqueName: \"kubernetes.io/projected/bb78e48d-ddf1-494e-883f-d9987d2f0f0a-kube-api-access-rk2st\") on node \"crc\" DevicePath \"\"" Oct 08 21:29:41 crc kubenswrapper[4669]: I1008 21:29:41.148962 4669 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bb78e48d-ddf1-494e-883f-d9987d2f0f0a-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 08 21:29:41 crc kubenswrapper[4669]: I1008 21:29:41.148981 4669 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/bb78e48d-ddf1-494e-883f-d9987d2f0f0a-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Oct 08 21:29:41 crc kubenswrapper[4669]: I1008 21:29:41.149002 4669 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/bb78e48d-ddf1-494e-883f-d9987d2f0f0a-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Oct 08 21:29:41 crc 
kubenswrapper[4669]: I1008 21:29:41.149024 4669 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb78e48d-ddf1-494e-883f-d9987d2f0f0a-inventory\") on node \"crc\" DevicePath \"\"" Oct 08 21:29:41 crc kubenswrapper[4669]: I1008 21:29:41.149046 4669 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/bb78e48d-ddf1-494e-883f-d9987d2f0f0a-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Oct 08 21:29:41 crc kubenswrapper[4669]: I1008 21:29:41.418765 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb" event={"ID":"bb78e48d-ddf1-494e-883f-d9987d2f0f0a","Type":"ContainerDied","Data":"696143ffbc35ce6be62fe2e3107c96beee3a7a8205d354a1258d4fe7bb139bfb"} Oct 08 21:29:41 crc kubenswrapper[4669]: I1008 21:29:41.419092 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="696143ffbc35ce6be62fe2e3107c96beee3a7a8205d354a1258d4fe7bb139bfb" Oct 08 21:29:41 crc kubenswrapper[4669]: I1008 21:29:41.418998 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb" Oct 08 21:30:00 crc kubenswrapper[4669]: I1008 21:30:00.154418 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332650-tjx89"] Oct 08 21:30:00 crc kubenswrapper[4669]: E1008 21:30:00.155400 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb78e48d-ddf1-494e-883f-d9987d2f0f0a" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 08 21:30:00 crc kubenswrapper[4669]: I1008 21:30:00.155415 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb78e48d-ddf1-494e-883f-d9987d2f0f0a" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 08 21:30:00 crc kubenswrapper[4669]: I1008 21:30:00.155663 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb78e48d-ddf1-494e-883f-d9987d2f0f0a" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 08 21:30:00 crc kubenswrapper[4669]: I1008 21:30:00.156314 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332650-tjx89" Oct 08 21:30:00 crc kubenswrapper[4669]: I1008 21:30:00.160074 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 08 21:30:00 crc kubenswrapper[4669]: I1008 21:30:00.160709 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 08 21:30:00 crc kubenswrapper[4669]: I1008 21:30:00.175912 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332650-tjx89"] Oct 08 21:30:00 crc kubenswrapper[4669]: I1008 21:30:00.262603 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6cdf7c27-d0b6-4e20-b278-04dd5f765657-config-volume\") pod \"collect-profiles-29332650-tjx89\" (UID: \"6cdf7c27-d0b6-4e20-b278-04dd5f765657\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332650-tjx89" Oct 08 21:30:00 crc kubenswrapper[4669]: I1008 21:30:00.262673 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9cdg\" (UniqueName: \"kubernetes.io/projected/6cdf7c27-d0b6-4e20-b278-04dd5f765657-kube-api-access-q9cdg\") pod \"collect-profiles-29332650-tjx89\" (UID: \"6cdf7c27-d0b6-4e20-b278-04dd5f765657\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332650-tjx89" Oct 08 21:30:00 crc kubenswrapper[4669]: I1008 21:30:00.262802 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6cdf7c27-d0b6-4e20-b278-04dd5f765657-secret-volume\") pod \"collect-profiles-29332650-tjx89\" (UID: \"6cdf7c27-d0b6-4e20-b278-04dd5f765657\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29332650-tjx89" Oct 08 21:30:00 crc kubenswrapper[4669]: I1008 21:30:00.368486 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6cdf7c27-d0b6-4e20-b278-04dd5f765657-config-volume\") pod \"collect-profiles-29332650-tjx89\" (UID: \"6cdf7c27-d0b6-4e20-b278-04dd5f765657\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332650-tjx89" Oct 08 21:30:00 crc kubenswrapper[4669]: I1008 21:30:00.368576 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9cdg\" (UniqueName: \"kubernetes.io/projected/6cdf7c27-d0b6-4e20-b278-04dd5f765657-kube-api-access-q9cdg\") pod \"collect-profiles-29332650-tjx89\" (UID: \"6cdf7c27-d0b6-4e20-b278-04dd5f765657\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332650-tjx89" Oct 08 21:30:00 crc kubenswrapper[4669]: I1008 21:30:00.368765 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6cdf7c27-d0b6-4e20-b278-04dd5f765657-secret-volume\") pod \"collect-profiles-29332650-tjx89\" (UID: \"6cdf7c27-d0b6-4e20-b278-04dd5f765657\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332650-tjx89" Oct 08 21:30:00 crc kubenswrapper[4669]: I1008 21:30:00.369468 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6cdf7c27-d0b6-4e20-b278-04dd5f765657-config-volume\") pod \"collect-profiles-29332650-tjx89\" (UID: \"6cdf7c27-d0b6-4e20-b278-04dd5f765657\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332650-tjx89" Oct 08 21:30:00 crc kubenswrapper[4669]: I1008 21:30:00.375061 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/6cdf7c27-d0b6-4e20-b278-04dd5f765657-secret-volume\") pod \"collect-profiles-29332650-tjx89\" (UID: \"6cdf7c27-d0b6-4e20-b278-04dd5f765657\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332650-tjx89" Oct 08 21:30:00 crc kubenswrapper[4669]: I1008 21:30:00.406718 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9cdg\" (UniqueName: \"kubernetes.io/projected/6cdf7c27-d0b6-4e20-b278-04dd5f765657-kube-api-access-q9cdg\") pod \"collect-profiles-29332650-tjx89\" (UID: \"6cdf7c27-d0b6-4e20-b278-04dd5f765657\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332650-tjx89" Oct 08 21:30:00 crc kubenswrapper[4669]: I1008 21:30:00.474366 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332650-tjx89" Oct 08 21:30:00 crc kubenswrapper[4669]: W1008 21:30:00.896279 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6cdf7c27_d0b6_4e20_b278_04dd5f765657.slice/crio-f69c9343866c62bff09bf7de930abdbf64a1fc77803eecc0b828943fe059ac28 WatchSource:0}: Error finding container f69c9343866c62bff09bf7de930abdbf64a1fc77803eecc0b828943fe059ac28: Status 404 returned error can't find the container with id f69c9343866c62bff09bf7de930abdbf64a1fc77803eecc0b828943fe059ac28 Oct 08 21:30:00 crc kubenswrapper[4669]: I1008 21:30:00.896707 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332650-tjx89"] Oct 08 21:30:01 crc kubenswrapper[4669]: I1008 21:30:01.640696 4669 generic.go:334] "Generic (PLEG): container finished" podID="6cdf7c27-d0b6-4e20-b278-04dd5f765657" containerID="464f1d7b2282d019acea59ccd71c5963a2b70c891c6de8876e1d98ac9c8d184d" exitCode=0 Oct 08 21:30:01 crc kubenswrapper[4669]: I1008 21:30:01.640848 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29332650-tjx89" event={"ID":"6cdf7c27-d0b6-4e20-b278-04dd5f765657","Type":"ContainerDied","Data":"464f1d7b2282d019acea59ccd71c5963a2b70c891c6de8876e1d98ac9c8d184d"} Oct 08 21:30:01 crc kubenswrapper[4669]: I1008 21:30:01.642271 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332650-tjx89" event={"ID":"6cdf7c27-d0b6-4e20-b278-04dd5f765657","Type":"ContainerStarted","Data":"f69c9343866c62bff09bf7de930abdbf64a1fc77803eecc0b828943fe059ac28"} Oct 08 21:30:03 crc kubenswrapper[4669]: I1008 21:30:03.023427 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332650-tjx89" Oct 08 21:30:03 crc kubenswrapper[4669]: I1008 21:30:03.119070 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9cdg\" (UniqueName: \"kubernetes.io/projected/6cdf7c27-d0b6-4e20-b278-04dd5f765657-kube-api-access-q9cdg\") pod \"6cdf7c27-d0b6-4e20-b278-04dd5f765657\" (UID: \"6cdf7c27-d0b6-4e20-b278-04dd5f765657\") " Oct 08 21:30:03 crc kubenswrapper[4669]: I1008 21:30:03.119168 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6cdf7c27-d0b6-4e20-b278-04dd5f765657-secret-volume\") pod \"6cdf7c27-d0b6-4e20-b278-04dd5f765657\" (UID: \"6cdf7c27-d0b6-4e20-b278-04dd5f765657\") " Oct 08 21:30:03 crc kubenswrapper[4669]: I1008 21:30:03.119330 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6cdf7c27-d0b6-4e20-b278-04dd5f765657-config-volume\") pod \"6cdf7c27-d0b6-4e20-b278-04dd5f765657\" (UID: \"6cdf7c27-d0b6-4e20-b278-04dd5f765657\") " Oct 08 21:30:03 crc kubenswrapper[4669]: I1008 21:30:03.119820 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/6cdf7c27-d0b6-4e20-b278-04dd5f765657-config-volume" (OuterVolumeSpecName: "config-volume") pod "6cdf7c27-d0b6-4e20-b278-04dd5f765657" (UID: "6cdf7c27-d0b6-4e20-b278-04dd5f765657"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:30:03 crc kubenswrapper[4669]: I1008 21:30:03.124317 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cdf7c27-d0b6-4e20-b278-04dd5f765657-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6cdf7c27-d0b6-4e20-b278-04dd5f765657" (UID: "6cdf7c27-d0b6-4e20-b278-04dd5f765657"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:30:03 crc kubenswrapper[4669]: I1008 21:30:03.139735 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cdf7c27-d0b6-4e20-b278-04dd5f765657-kube-api-access-q9cdg" (OuterVolumeSpecName: "kube-api-access-q9cdg") pod "6cdf7c27-d0b6-4e20-b278-04dd5f765657" (UID: "6cdf7c27-d0b6-4e20-b278-04dd5f765657"). InnerVolumeSpecName "kube-api-access-q9cdg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:30:03 crc kubenswrapper[4669]: I1008 21:30:03.221395 4669 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6cdf7c27-d0b6-4e20-b278-04dd5f765657-config-volume\") on node \"crc\" DevicePath \"\"" Oct 08 21:30:03 crc kubenswrapper[4669]: I1008 21:30:03.221427 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9cdg\" (UniqueName: \"kubernetes.io/projected/6cdf7c27-d0b6-4e20-b278-04dd5f765657-kube-api-access-q9cdg\") on node \"crc\" DevicePath \"\"" Oct 08 21:30:03 crc kubenswrapper[4669]: I1008 21:30:03.221438 4669 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6cdf7c27-d0b6-4e20-b278-04dd5f765657-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 08 21:30:03 crc kubenswrapper[4669]: I1008 21:30:03.664230 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332650-tjx89" event={"ID":"6cdf7c27-d0b6-4e20-b278-04dd5f765657","Type":"ContainerDied","Data":"f69c9343866c62bff09bf7de930abdbf64a1fc77803eecc0b828943fe059ac28"} Oct 08 21:30:03 crc kubenswrapper[4669]: I1008 21:30:03.664567 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f69c9343866c62bff09bf7de930abdbf64a1fc77803eecc0b828943fe059ac28" Oct 08 21:30:03 crc kubenswrapper[4669]: I1008 21:30:03.664330 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332650-tjx89" Oct 08 21:30:04 crc kubenswrapper[4669]: I1008 21:30:04.113884 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332605-xxrs4"] Oct 08 21:30:04 crc kubenswrapper[4669]: I1008 21:30:04.122094 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332605-xxrs4"] Oct 08 21:30:05 crc kubenswrapper[4669]: I1008 21:30:05.352843 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a78d1dcb-341a-40b0-a96d-9ee5af65a2fe" path="/var/lib/kubelet/pods/a78d1dcb-341a-40b0-a96d-9ee5af65a2fe/volumes" Oct 08 21:30:07 crc kubenswrapper[4669]: E1008 21:30:07.124334 4669 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.230:56250->38.102.83.230:44177: write tcp 38.102.83.230:56250->38.102.83.230:44177: write: broken pipe Oct 08 21:30:35 crc kubenswrapper[4669]: I1008 21:30:35.700057 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Oct 08 21:30:35 crc kubenswrapper[4669]: E1008 21:30:35.700936 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cdf7c27-d0b6-4e20-b278-04dd5f765657" containerName="collect-profiles" Oct 08 21:30:35 crc kubenswrapper[4669]: I1008 21:30:35.700952 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cdf7c27-d0b6-4e20-b278-04dd5f765657" containerName="collect-profiles" Oct 08 21:30:35 crc kubenswrapper[4669]: I1008 21:30:35.701233 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cdf7c27-d0b6-4e20-b278-04dd5f765657" containerName="collect-profiles" Oct 08 21:30:35 crc kubenswrapper[4669]: I1008 21:30:35.701975 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 08 21:30:35 crc kubenswrapper[4669]: I1008 21:30:35.705427 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Oct 08 21:30:35 crc kubenswrapper[4669]: I1008 21:30:35.705723 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-crlhk" Oct 08 21:30:35 crc kubenswrapper[4669]: I1008 21:30:35.705930 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 08 21:30:35 crc kubenswrapper[4669]: I1008 21:30:35.706759 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Oct 08 21:30:35 crc kubenswrapper[4669]: I1008 21:30:35.713943 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 08 21:30:35 crc kubenswrapper[4669]: I1008 21:30:35.897114 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp7gr\" (UniqueName: \"kubernetes.io/projected/ad5f7082-536e-477e-a8a3-b5c4945b3b87-kube-api-access-kp7gr\") pod \"tempest-tests-tempest\" (UID: \"ad5f7082-536e-477e-a8a3-b5c4945b3b87\") " pod="openstack/tempest-tests-tempest" Oct 08 21:30:35 crc kubenswrapper[4669]: I1008 21:30:35.897510 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ad5f7082-536e-477e-a8a3-b5c4945b3b87-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"ad5f7082-536e-477e-a8a3-b5c4945b3b87\") " pod="openstack/tempest-tests-tempest" Oct 08 21:30:35 crc kubenswrapper[4669]: I1008 21:30:35.897591 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") 
pod \"tempest-tests-tempest\" (UID: \"ad5f7082-536e-477e-a8a3-b5c4945b3b87\") " pod="openstack/tempest-tests-tempest" Oct 08 21:30:35 crc kubenswrapper[4669]: I1008 21:30:35.897681 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ad5f7082-536e-477e-a8a3-b5c4945b3b87-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"ad5f7082-536e-477e-a8a3-b5c4945b3b87\") " pod="openstack/tempest-tests-tempest" Oct 08 21:30:35 crc kubenswrapper[4669]: I1008 21:30:35.897832 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad5f7082-536e-477e-a8a3-b5c4945b3b87-config-data\") pod \"tempest-tests-tempest\" (UID: \"ad5f7082-536e-477e-a8a3-b5c4945b3b87\") " pod="openstack/tempest-tests-tempest" Oct 08 21:30:35 crc kubenswrapper[4669]: I1008 21:30:35.897862 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ad5f7082-536e-477e-a8a3-b5c4945b3b87-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"ad5f7082-536e-477e-a8a3-b5c4945b3b87\") " pod="openstack/tempest-tests-tempest" Oct 08 21:30:35 crc kubenswrapper[4669]: I1008 21:30:35.897886 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ad5f7082-536e-477e-a8a3-b5c4945b3b87-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"ad5f7082-536e-477e-a8a3-b5c4945b3b87\") " pod="openstack/tempest-tests-tempest" Oct 08 21:30:35 crc kubenswrapper[4669]: I1008 21:30:35.897929 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/ad5f7082-536e-477e-a8a3-b5c4945b3b87-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"ad5f7082-536e-477e-a8a3-b5c4945b3b87\") " pod="openstack/tempest-tests-tempest" Oct 08 21:30:35 crc kubenswrapper[4669]: I1008 21:30:35.897980 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ad5f7082-536e-477e-a8a3-b5c4945b3b87-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"ad5f7082-536e-477e-a8a3-b5c4945b3b87\") " pod="openstack/tempest-tests-tempest" Oct 08 21:30:36 crc kubenswrapper[4669]: I1008 21:30:36.000374 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kp7gr\" (UniqueName: \"kubernetes.io/projected/ad5f7082-536e-477e-a8a3-b5c4945b3b87-kube-api-access-kp7gr\") pod \"tempest-tests-tempest\" (UID: \"ad5f7082-536e-477e-a8a3-b5c4945b3b87\") " pod="openstack/tempest-tests-tempest" Oct 08 21:30:36 crc kubenswrapper[4669]: I1008 21:30:36.000432 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ad5f7082-536e-477e-a8a3-b5c4945b3b87-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"ad5f7082-536e-477e-a8a3-b5c4945b3b87\") " pod="openstack/tempest-tests-tempest" Oct 08 21:30:36 crc kubenswrapper[4669]: I1008 21:30:36.000472 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"ad5f7082-536e-477e-a8a3-b5c4945b3b87\") " pod="openstack/tempest-tests-tempest" Oct 08 21:30:36 crc kubenswrapper[4669]: I1008 21:30:36.000557 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ad5f7082-536e-477e-a8a3-b5c4945b3b87-ssh-key\") pod 
\"tempest-tests-tempest\" (UID: \"ad5f7082-536e-477e-a8a3-b5c4945b3b87\") " pod="openstack/tempest-tests-tempest" Oct 08 21:30:36 crc kubenswrapper[4669]: I1008 21:30:36.000590 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad5f7082-536e-477e-a8a3-b5c4945b3b87-config-data\") pod \"tempest-tests-tempest\" (UID: \"ad5f7082-536e-477e-a8a3-b5c4945b3b87\") " pod="openstack/tempest-tests-tempest" Oct 08 21:30:36 crc kubenswrapper[4669]: I1008 21:30:36.000615 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ad5f7082-536e-477e-a8a3-b5c4945b3b87-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"ad5f7082-536e-477e-a8a3-b5c4945b3b87\") " pod="openstack/tempest-tests-tempest" Oct 08 21:30:36 crc kubenswrapper[4669]: I1008 21:30:36.000637 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ad5f7082-536e-477e-a8a3-b5c4945b3b87-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"ad5f7082-536e-477e-a8a3-b5c4945b3b87\") " pod="openstack/tempest-tests-tempest" Oct 08 21:30:36 crc kubenswrapper[4669]: I1008 21:30:36.000678 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ad5f7082-536e-477e-a8a3-b5c4945b3b87-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"ad5f7082-536e-477e-a8a3-b5c4945b3b87\") " pod="openstack/tempest-tests-tempest" Oct 08 21:30:36 crc kubenswrapper[4669]: I1008 21:30:36.000726 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ad5f7082-536e-477e-a8a3-b5c4945b3b87-openstack-config\") pod \"tempest-tests-tempest\" (UID: 
\"ad5f7082-536e-477e-a8a3-b5c4945b3b87\") " pod="openstack/tempest-tests-tempest" Oct 08 21:30:36 crc kubenswrapper[4669]: I1008 21:30:36.001961 4669 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"ad5f7082-536e-477e-a8a3-b5c4945b3b87\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/tempest-tests-tempest" Oct 08 21:30:36 crc kubenswrapper[4669]: I1008 21:30:36.002073 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ad5f7082-536e-477e-a8a3-b5c4945b3b87-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"ad5f7082-536e-477e-a8a3-b5c4945b3b87\") " pod="openstack/tempest-tests-tempest" Oct 08 21:30:36 crc kubenswrapper[4669]: I1008 21:30:36.002252 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ad5f7082-536e-477e-a8a3-b5c4945b3b87-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"ad5f7082-536e-477e-a8a3-b5c4945b3b87\") " pod="openstack/tempest-tests-tempest" Oct 08 21:30:36 crc kubenswrapper[4669]: I1008 21:30:36.002286 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad5f7082-536e-477e-a8a3-b5c4945b3b87-config-data\") pod \"tempest-tests-tempest\" (UID: \"ad5f7082-536e-477e-a8a3-b5c4945b3b87\") " pod="openstack/tempest-tests-tempest" Oct 08 21:30:36 crc kubenswrapper[4669]: I1008 21:30:36.002400 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ad5f7082-536e-477e-a8a3-b5c4945b3b87-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"ad5f7082-536e-477e-a8a3-b5c4945b3b87\") " 
pod="openstack/tempest-tests-tempest" Oct 08 21:30:36 crc kubenswrapper[4669]: I1008 21:30:36.009173 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ad5f7082-536e-477e-a8a3-b5c4945b3b87-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"ad5f7082-536e-477e-a8a3-b5c4945b3b87\") " pod="openstack/tempest-tests-tempest" Oct 08 21:30:36 crc kubenswrapper[4669]: I1008 21:30:36.009367 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ad5f7082-536e-477e-a8a3-b5c4945b3b87-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"ad5f7082-536e-477e-a8a3-b5c4945b3b87\") " pod="openstack/tempest-tests-tempest" Oct 08 21:30:36 crc kubenswrapper[4669]: I1008 21:30:36.009703 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ad5f7082-536e-477e-a8a3-b5c4945b3b87-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"ad5f7082-536e-477e-a8a3-b5c4945b3b87\") " pod="openstack/tempest-tests-tempest" Oct 08 21:30:36 crc kubenswrapper[4669]: I1008 21:30:36.019355 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp7gr\" (UniqueName: \"kubernetes.io/projected/ad5f7082-536e-477e-a8a3-b5c4945b3b87-kube-api-access-kp7gr\") pod \"tempest-tests-tempest\" (UID: \"ad5f7082-536e-477e-a8a3-b5c4945b3b87\") " pod="openstack/tempest-tests-tempest" Oct 08 21:30:36 crc kubenswrapper[4669]: I1008 21:30:36.041199 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"ad5f7082-536e-477e-a8a3-b5c4945b3b87\") " pod="openstack/tempest-tests-tempest" Oct 08 21:30:36 crc kubenswrapper[4669]: I1008 21:30:36.343305 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 08 21:30:36 crc kubenswrapper[4669]: W1008 21:30:36.897342 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad5f7082_536e_477e_a8a3_b5c4945b3b87.slice/crio-943e2bf7a282d4a702eb7af90a696d2278dc5dd951a724a48c2b1932ca722fd8 WatchSource:0}: Error finding container 943e2bf7a282d4a702eb7af90a696d2278dc5dd951a724a48c2b1932ca722fd8: Status 404 returned error can't find the container with id 943e2bf7a282d4a702eb7af90a696d2278dc5dd951a724a48c2b1932ca722fd8 Oct 08 21:30:36 crc kubenswrapper[4669]: I1008 21:30:36.900264 4669 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 21:30:36 crc kubenswrapper[4669]: I1008 21:30:36.900945 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 08 21:30:37 crc kubenswrapper[4669]: I1008 21:30:37.033514 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"ad5f7082-536e-477e-a8a3-b5c4945b3b87","Type":"ContainerStarted","Data":"943e2bf7a282d4a702eb7af90a696d2278dc5dd951a724a48c2b1932ca722fd8"} Oct 08 21:30:47 crc kubenswrapper[4669]: I1008 21:30:47.727162 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jhwwd"] Oct 08 21:30:47 crc kubenswrapper[4669]: I1008 21:30:47.731089 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jhwwd" Oct 08 21:30:47 crc kubenswrapper[4669]: I1008 21:30:47.750699 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jhwwd"] Oct 08 21:30:47 crc kubenswrapper[4669]: I1008 21:30:47.775328 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7841a99e-d6dc-461f-ac56-78bb96c2773c-utilities\") pod \"community-operators-jhwwd\" (UID: \"7841a99e-d6dc-461f-ac56-78bb96c2773c\") " pod="openshift-marketplace/community-operators-jhwwd" Oct 08 21:30:47 crc kubenswrapper[4669]: I1008 21:30:47.775403 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8wlk\" (UniqueName: \"kubernetes.io/projected/7841a99e-d6dc-461f-ac56-78bb96c2773c-kube-api-access-x8wlk\") pod \"community-operators-jhwwd\" (UID: \"7841a99e-d6dc-461f-ac56-78bb96c2773c\") " pod="openshift-marketplace/community-operators-jhwwd" Oct 08 21:30:47 crc kubenswrapper[4669]: I1008 21:30:47.775574 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7841a99e-d6dc-461f-ac56-78bb96c2773c-catalog-content\") pod \"community-operators-jhwwd\" (UID: \"7841a99e-d6dc-461f-ac56-78bb96c2773c\") " pod="openshift-marketplace/community-operators-jhwwd" Oct 08 21:30:47 crc kubenswrapper[4669]: I1008 21:30:47.877169 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7841a99e-d6dc-461f-ac56-78bb96c2773c-catalog-content\") pod \"community-operators-jhwwd\" (UID: \"7841a99e-d6dc-461f-ac56-78bb96c2773c\") " pod="openshift-marketplace/community-operators-jhwwd" Oct 08 21:30:47 crc kubenswrapper[4669]: I1008 21:30:47.877345 4669 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7841a99e-d6dc-461f-ac56-78bb96c2773c-utilities\") pod \"community-operators-jhwwd\" (UID: \"7841a99e-d6dc-461f-ac56-78bb96c2773c\") " pod="openshift-marketplace/community-operators-jhwwd" Oct 08 21:30:47 crc kubenswrapper[4669]: I1008 21:30:47.877421 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8wlk\" (UniqueName: \"kubernetes.io/projected/7841a99e-d6dc-461f-ac56-78bb96c2773c-kube-api-access-x8wlk\") pod \"community-operators-jhwwd\" (UID: \"7841a99e-d6dc-461f-ac56-78bb96c2773c\") " pod="openshift-marketplace/community-operators-jhwwd" Oct 08 21:30:47 crc kubenswrapper[4669]: I1008 21:30:47.877749 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7841a99e-d6dc-461f-ac56-78bb96c2773c-catalog-content\") pod \"community-operators-jhwwd\" (UID: \"7841a99e-d6dc-461f-ac56-78bb96c2773c\") " pod="openshift-marketplace/community-operators-jhwwd" Oct 08 21:30:47 crc kubenswrapper[4669]: I1008 21:30:47.877758 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7841a99e-d6dc-461f-ac56-78bb96c2773c-utilities\") pod \"community-operators-jhwwd\" (UID: \"7841a99e-d6dc-461f-ac56-78bb96c2773c\") " pod="openshift-marketplace/community-operators-jhwwd" Oct 08 21:30:47 crc kubenswrapper[4669]: I1008 21:30:47.905677 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8wlk\" (UniqueName: \"kubernetes.io/projected/7841a99e-d6dc-461f-ac56-78bb96c2773c-kube-api-access-x8wlk\") pod \"community-operators-jhwwd\" (UID: \"7841a99e-d6dc-461f-ac56-78bb96c2773c\") " pod="openshift-marketplace/community-operators-jhwwd" Oct 08 21:30:48 crc kubenswrapper[4669]: I1008 21:30:48.053937 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jhwwd" Oct 08 21:30:48 crc kubenswrapper[4669]: I1008 21:30:48.638358 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jhwwd"] Oct 08 21:30:59 crc kubenswrapper[4669]: I1008 21:30:59.196205 4669 scope.go:117] "RemoveContainer" containerID="1672a51d12b5dc977486af2465ec6ce43259309301ef95fcc5f579cc5ab4ac8e" Oct 08 21:31:04 crc kubenswrapper[4669]: E1008 21:31:04.719824 4669 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Oct 08 21:31:04 crc kubenswrapper[4669]: E1008 21:31:04.720646 4669 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/t
empest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kp7gr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(ad5f7082-536e-477e-a8a3-b5c4945b3b87): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" 
logger="UnhandledError" Oct 08 21:31:04 crc kubenswrapper[4669]: E1008 21:31:04.722046 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="ad5f7082-536e-477e-a8a3-b5c4945b3b87" Oct 08 21:31:05 crc kubenswrapper[4669]: I1008 21:31:05.292586 4669 generic.go:334] "Generic (PLEG): container finished" podID="7841a99e-d6dc-461f-ac56-78bb96c2773c" containerID="a2de64c5cdbb81b2c0db47d66dc8a9aebcb5c9198fdfad8de3a845e2599e5981" exitCode=0 Oct 08 21:31:05 crc kubenswrapper[4669]: I1008 21:31:05.292636 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jhwwd" event={"ID":"7841a99e-d6dc-461f-ac56-78bb96c2773c","Type":"ContainerDied","Data":"a2de64c5cdbb81b2c0db47d66dc8a9aebcb5c9198fdfad8de3a845e2599e5981"} Oct 08 21:31:05 crc kubenswrapper[4669]: I1008 21:31:05.292675 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jhwwd" event={"ID":"7841a99e-d6dc-461f-ac56-78bb96c2773c","Type":"ContainerStarted","Data":"efc732c1aed80523d2f9ad295f58a624bd28e4d445b3a3e7c03721184742caae"} Oct 08 21:31:05 crc kubenswrapper[4669]: E1008 21:31:05.294499 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="ad5f7082-536e-477e-a8a3-b5c4945b3b87" Oct 08 21:31:06 crc kubenswrapper[4669]: I1008 21:31:06.313644 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jhwwd" 
event={"ID":"7841a99e-d6dc-461f-ac56-78bb96c2773c","Type":"ContainerStarted","Data":"66a487b1b76f2be05847d6ae35c544edf11afaeb4b28ba92b2923d3896b20aea"} Oct 08 21:31:07 crc kubenswrapper[4669]: I1008 21:31:07.328436 4669 generic.go:334] "Generic (PLEG): container finished" podID="7841a99e-d6dc-461f-ac56-78bb96c2773c" containerID="66a487b1b76f2be05847d6ae35c544edf11afaeb4b28ba92b2923d3896b20aea" exitCode=0 Oct 08 21:31:07 crc kubenswrapper[4669]: I1008 21:31:07.328490 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jhwwd" event={"ID":"7841a99e-d6dc-461f-ac56-78bb96c2773c","Type":"ContainerDied","Data":"66a487b1b76f2be05847d6ae35c544edf11afaeb4b28ba92b2923d3896b20aea"} Oct 08 21:31:08 crc kubenswrapper[4669]: I1008 21:31:08.338318 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jhwwd" event={"ID":"7841a99e-d6dc-461f-ac56-78bb96c2773c","Type":"ContainerStarted","Data":"b517b82543ec32642f934258c25e023f6e39442078d167a3699b9c2c1b9e4686"} Oct 08 21:31:08 crc kubenswrapper[4669]: I1008 21:31:08.356610 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jhwwd" podStartSLOduration=18.881235324 podStartE2EDuration="21.356591169s" podCreationTimestamp="2025-10-08 21:30:47 +0000 UTC" firstStartedPulling="2025-10-08 21:31:05.294692466 +0000 UTC m=+2784.987503169" lastFinishedPulling="2025-10-08 21:31:07.770048341 +0000 UTC m=+2787.462859014" observedRunningTime="2025-10-08 21:31:08.353855796 +0000 UTC m=+2788.046666499" watchObservedRunningTime="2025-10-08 21:31:08.356591169 +0000 UTC m=+2788.049401842" Oct 08 21:31:13 crc kubenswrapper[4669]: I1008 21:31:13.186145 4669 patch_prober.go:28] interesting pod/machine-config-daemon-hw2kf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Oct 08 21:31:13 crc kubenswrapper[4669]: I1008 21:31:13.187663 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 21:31:18 crc kubenswrapper[4669]: I1008 21:31:18.054260 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jhwwd" Oct 08 21:31:18 crc kubenswrapper[4669]: I1008 21:31:18.054869 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jhwwd" Oct 08 21:31:18 crc kubenswrapper[4669]: I1008 21:31:18.098177 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jhwwd" Oct 08 21:31:18 crc kubenswrapper[4669]: I1008 21:31:18.480826 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jhwwd" Oct 08 21:31:18 crc kubenswrapper[4669]: I1008 21:31:18.921943 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jhwwd"] Oct 08 21:31:20 crc kubenswrapper[4669]: I1008 21:31:20.283605 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 08 21:31:20 crc kubenswrapper[4669]: I1008 21:31:20.457842 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jhwwd" podUID="7841a99e-d6dc-461f-ac56-78bb96c2773c" containerName="registry-server" containerID="cri-o://b517b82543ec32642f934258c25e023f6e39442078d167a3699b9c2c1b9e4686" gracePeriod=2 Oct 08 21:31:20 crc kubenswrapper[4669]: I1008 21:31:20.938938 4669 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jhwwd" Oct 08 21:31:21 crc kubenswrapper[4669]: I1008 21:31:21.088068 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8wlk\" (UniqueName: \"kubernetes.io/projected/7841a99e-d6dc-461f-ac56-78bb96c2773c-kube-api-access-x8wlk\") pod \"7841a99e-d6dc-461f-ac56-78bb96c2773c\" (UID: \"7841a99e-d6dc-461f-ac56-78bb96c2773c\") " Oct 08 21:31:21 crc kubenswrapper[4669]: I1008 21:31:21.088168 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7841a99e-d6dc-461f-ac56-78bb96c2773c-utilities\") pod \"7841a99e-d6dc-461f-ac56-78bb96c2773c\" (UID: \"7841a99e-d6dc-461f-ac56-78bb96c2773c\") " Oct 08 21:31:21 crc kubenswrapper[4669]: I1008 21:31:21.088446 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7841a99e-d6dc-461f-ac56-78bb96c2773c-catalog-content\") pod \"7841a99e-d6dc-461f-ac56-78bb96c2773c\" (UID: \"7841a99e-d6dc-461f-ac56-78bb96c2773c\") " Oct 08 21:31:21 crc kubenswrapper[4669]: I1008 21:31:21.089107 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7841a99e-d6dc-461f-ac56-78bb96c2773c-utilities" (OuterVolumeSpecName: "utilities") pod "7841a99e-d6dc-461f-ac56-78bb96c2773c" (UID: "7841a99e-d6dc-461f-ac56-78bb96c2773c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:31:21 crc kubenswrapper[4669]: I1008 21:31:21.094083 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7841a99e-d6dc-461f-ac56-78bb96c2773c-kube-api-access-x8wlk" (OuterVolumeSpecName: "kube-api-access-x8wlk") pod "7841a99e-d6dc-461f-ac56-78bb96c2773c" (UID: "7841a99e-d6dc-461f-ac56-78bb96c2773c"). 
InnerVolumeSpecName "kube-api-access-x8wlk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:31:21 crc kubenswrapper[4669]: I1008 21:31:21.141833 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7841a99e-d6dc-461f-ac56-78bb96c2773c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7841a99e-d6dc-461f-ac56-78bb96c2773c" (UID: "7841a99e-d6dc-461f-ac56-78bb96c2773c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:31:21 crc kubenswrapper[4669]: I1008 21:31:21.190166 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8wlk\" (UniqueName: \"kubernetes.io/projected/7841a99e-d6dc-461f-ac56-78bb96c2773c-kube-api-access-x8wlk\") on node \"crc\" DevicePath \"\"" Oct 08 21:31:21 crc kubenswrapper[4669]: I1008 21:31:21.190204 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7841a99e-d6dc-461f-ac56-78bb96c2773c-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 21:31:21 crc kubenswrapper[4669]: I1008 21:31:21.190219 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7841a99e-d6dc-461f-ac56-78bb96c2773c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 21:31:21 crc kubenswrapper[4669]: I1008 21:31:21.469881 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"ad5f7082-536e-477e-a8a3-b5c4945b3b87","Type":"ContainerStarted","Data":"c52152722adfd5b12164a01e39f1796d9e063607aa931b5573ab7efa38e61396"} Oct 08 21:31:21 crc kubenswrapper[4669]: I1008 21:31:21.486356 4669 generic.go:334] "Generic (PLEG): container finished" podID="7841a99e-d6dc-461f-ac56-78bb96c2773c" containerID="b517b82543ec32642f934258c25e023f6e39442078d167a3699b9c2c1b9e4686" exitCode=0 Oct 08 21:31:21 crc kubenswrapper[4669]: I1008 21:31:21.486412 4669 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jhwwd" event={"ID":"7841a99e-d6dc-461f-ac56-78bb96c2773c","Type":"ContainerDied","Data":"b517b82543ec32642f934258c25e023f6e39442078d167a3699b9c2c1b9e4686"}
Oct 08 21:31:21 crc kubenswrapper[4669]: I1008 21:31:21.486446 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jhwwd" event={"ID":"7841a99e-d6dc-461f-ac56-78bb96c2773c","Type":"ContainerDied","Data":"efc732c1aed80523d2f9ad295f58a624bd28e4d445b3a3e7c03721184742caae"}
Oct 08 21:31:21 crc kubenswrapper[4669]: I1008 21:31:21.486469 4669 scope.go:117] "RemoveContainer" containerID="b517b82543ec32642f934258c25e023f6e39442078d167a3699b9c2c1b9e4686"
Oct 08 21:31:21 crc kubenswrapper[4669]: I1008 21:31:21.486655 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jhwwd"
Oct 08 21:31:21 crc kubenswrapper[4669]: I1008 21:31:21.516057 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.1349594849999995 podStartE2EDuration="47.516039217s" podCreationTimestamp="2025-10-08 21:30:34 +0000 UTC" firstStartedPulling="2025-10-08 21:30:36.900012805 +0000 UTC m=+2756.592823478" lastFinishedPulling="2025-10-08 21:31:20.281092537 +0000 UTC m=+2799.973903210" observedRunningTime="2025-10-08 21:31:21.497985434 +0000 UTC m=+2801.190796127" watchObservedRunningTime="2025-10-08 21:31:21.516039217 +0000 UTC m=+2801.208849890"
Oct 08 21:31:21 crc kubenswrapper[4669]: I1008 21:31:21.522978 4669 scope.go:117] "RemoveContainer" containerID="66a487b1b76f2be05847d6ae35c544edf11afaeb4b28ba92b2923d3896b20aea"
Oct 08 21:31:21 crc kubenswrapper[4669]: I1008 21:31:21.526364 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jhwwd"]
Oct 08 21:31:21 crc kubenswrapper[4669]: I1008 21:31:21.538472 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jhwwd"]
Oct 08 21:31:21 crc kubenswrapper[4669]: I1008 21:31:21.555905 4669 scope.go:117] "RemoveContainer" containerID="a2de64c5cdbb81b2c0db47d66dc8a9aebcb5c9198fdfad8de3a845e2599e5981"
Oct 08 21:31:21 crc kubenswrapper[4669]: I1008 21:31:21.600481 4669 scope.go:117] "RemoveContainer" containerID="b517b82543ec32642f934258c25e023f6e39442078d167a3699b9c2c1b9e4686"
Oct 08 21:31:21 crc kubenswrapper[4669]: E1008 21:31:21.600970 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b517b82543ec32642f934258c25e023f6e39442078d167a3699b9c2c1b9e4686\": container with ID starting with b517b82543ec32642f934258c25e023f6e39442078d167a3699b9c2c1b9e4686 not found: ID does not exist" containerID="b517b82543ec32642f934258c25e023f6e39442078d167a3699b9c2c1b9e4686"
Oct 08 21:31:21 crc kubenswrapper[4669]: I1008 21:31:21.601014 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b517b82543ec32642f934258c25e023f6e39442078d167a3699b9c2c1b9e4686"} err="failed to get container status \"b517b82543ec32642f934258c25e023f6e39442078d167a3699b9c2c1b9e4686\": rpc error: code = NotFound desc = could not find container \"b517b82543ec32642f934258c25e023f6e39442078d167a3699b9c2c1b9e4686\": container with ID starting with b517b82543ec32642f934258c25e023f6e39442078d167a3699b9c2c1b9e4686 not found: ID does not exist"
Oct 08 21:31:21 crc kubenswrapper[4669]: I1008 21:31:21.601042 4669 scope.go:117] "RemoveContainer" containerID="66a487b1b76f2be05847d6ae35c544edf11afaeb4b28ba92b2923d3896b20aea"
Oct 08 21:31:21 crc kubenswrapper[4669]: E1008 21:31:21.601413 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66a487b1b76f2be05847d6ae35c544edf11afaeb4b28ba92b2923d3896b20aea\": container with ID starting with 66a487b1b76f2be05847d6ae35c544edf11afaeb4b28ba92b2923d3896b20aea not found: ID does not exist" containerID="66a487b1b76f2be05847d6ae35c544edf11afaeb4b28ba92b2923d3896b20aea"
Oct 08 21:31:21 crc kubenswrapper[4669]: I1008 21:31:21.601476 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66a487b1b76f2be05847d6ae35c544edf11afaeb4b28ba92b2923d3896b20aea"} err="failed to get container status \"66a487b1b76f2be05847d6ae35c544edf11afaeb4b28ba92b2923d3896b20aea\": rpc error: code = NotFound desc = could not find container \"66a487b1b76f2be05847d6ae35c544edf11afaeb4b28ba92b2923d3896b20aea\": container with ID starting with 66a487b1b76f2be05847d6ae35c544edf11afaeb4b28ba92b2923d3896b20aea not found: ID does not exist"
Oct 08 21:31:21 crc kubenswrapper[4669]: I1008 21:31:21.601567 4669 scope.go:117] "RemoveContainer" containerID="a2de64c5cdbb81b2c0db47d66dc8a9aebcb5c9198fdfad8de3a845e2599e5981"
Oct 08 21:31:21 crc kubenswrapper[4669]: E1008 21:31:21.602069 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2de64c5cdbb81b2c0db47d66dc8a9aebcb5c9198fdfad8de3a845e2599e5981\": container with ID starting with a2de64c5cdbb81b2c0db47d66dc8a9aebcb5c9198fdfad8de3a845e2599e5981 not found: ID does not exist" containerID="a2de64c5cdbb81b2c0db47d66dc8a9aebcb5c9198fdfad8de3a845e2599e5981"
Oct 08 21:31:21 crc kubenswrapper[4669]: I1008 21:31:21.602133 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2de64c5cdbb81b2c0db47d66dc8a9aebcb5c9198fdfad8de3a845e2599e5981"} err="failed to get container status \"a2de64c5cdbb81b2c0db47d66dc8a9aebcb5c9198fdfad8de3a845e2599e5981\": rpc error: code = NotFound desc = could not find container \"a2de64c5cdbb81b2c0db47d66dc8a9aebcb5c9198fdfad8de3a845e2599e5981\": container with ID starting with a2de64c5cdbb81b2c0db47d66dc8a9aebcb5c9198fdfad8de3a845e2599e5981 not found: ID does not exist"
Oct 08 21:31:23 crc kubenswrapper[4669]: I1008 21:31:23.340597 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7841a99e-d6dc-461f-ac56-78bb96c2773c" path="/var/lib/kubelet/pods/7841a99e-d6dc-461f-ac56-78bb96c2773c/volumes"
Oct 08 21:31:24 crc kubenswrapper[4669]: I1008 21:31:24.335839 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-b9zw5"]
Oct 08 21:31:24 crc kubenswrapper[4669]: E1008 21:31:24.336279 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7841a99e-d6dc-461f-ac56-78bb96c2773c" containerName="extract-utilities"
Oct 08 21:31:24 crc kubenswrapper[4669]: I1008 21:31:24.336295 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="7841a99e-d6dc-461f-ac56-78bb96c2773c" containerName="extract-utilities"
Oct 08 21:31:24 crc kubenswrapper[4669]: E1008 21:31:24.336340 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7841a99e-d6dc-461f-ac56-78bb96c2773c" containerName="extract-content"
Oct 08 21:31:24 crc kubenswrapper[4669]: I1008 21:31:24.336350 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="7841a99e-d6dc-461f-ac56-78bb96c2773c" containerName="extract-content"
Oct 08 21:31:24 crc kubenswrapper[4669]: E1008 21:31:24.336366 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7841a99e-d6dc-461f-ac56-78bb96c2773c" containerName="registry-server"
Oct 08 21:31:24 crc kubenswrapper[4669]: I1008 21:31:24.336374 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="7841a99e-d6dc-461f-ac56-78bb96c2773c" containerName="registry-server"
Oct 08 21:31:24 crc kubenswrapper[4669]: I1008 21:31:24.336706 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="7841a99e-d6dc-461f-ac56-78bb96c2773c" containerName="registry-server"
Oct 08 21:31:24 crc kubenswrapper[4669]: I1008 21:31:24.338248 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b9zw5"
Oct 08 21:31:24 crc kubenswrapper[4669]: I1008 21:31:24.365052 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b9zw5"]
Oct 08 21:31:24 crc kubenswrapper[4669]: I1008 21:31:24.452847 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dsgv\" (UniqueName: \"kubernetes.io/projected/6e22a8a1-79fa-400f-bfa6-7937f75d1ee1-kube-api-access-2dsgv\") pod \"redhat-marketplace-b9zw5\" (UID: \"6e22a8a1-79fa-400f-bfa6-7937f75d1ee1\") " pod="openshift-marketplace/redhat-marketplace-b9zw5"
Oct 08 21:31:24 crc kubenswrapper[4669]: I1008 21:31:24.452892 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e22a8a1-79fa-400f-bfa6-7937f75d1ee1-utilities\") pod \"redhat-marketplace-b9zw5\" (UID: \"6e22a8a1-79fa-400f-bfa6-7937f75d1ee1\") " pod="openshift-marketplace/redhat-marketplace-b9zw5"
Oct 08 21:31:24 crc kubenswrapper[4669]: I1008 21:31:24.452920 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e22a8a1-79fa-400f-bfa6-7937f75d1ee1-catalog-content\") pod \"redhat-marketplace-b9zw5\" (UID: \"6e22a8a1-79fa-400f-bfa6-7937f75d1ee1\") " pod="openshift-marketplace/redhat-marketplace-b9zw5"
Oct 08 21:31:24 crc kubenswrapper[4669]: I1008 21:31:24.554660 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dsgv\" (UniqueName: \"kubernetes.io/projected/6e22a8a1-79fa-400f-bfa6-7937f75d1ee1-kube-api-access-2dsgv\") pod \"redhat-marketplace-b9zw5\" (UID: \"6e22a8a1-79fa-400f-bfa6-7937f75d1ee1\") " pod="openshift-marketplace/redhat-marketplace-b9zw5"
Oct 08 21:31:24 crc kubenswrapper[4669]: I1008 21:31:24.555063 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e22a8a1-79fa-400f-bfa6-7937f75d1ee1-utilities\") pod \"redhat-marketplace-b9zw5\" (UID: \"6e22a8a1-79fa-400f-bfa6-7937f75d1ee1\") " pod="openshift-marketplace/redhat-marketplace-b9zw5"
Oct 08 21:31:24 crc kubenswrapper[4669]: I1008 21:31:24.555112 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e22a8a1-79fa-400f-bfa6-7937f75d1ee1-catalog-content\") pod \"redhat-marketplace-b9zw5\" (UID: \"6e22a8a1-79fa-400f-bfa6-7937f75d1ee1\") " pod="openshift-marketplace/redhat-marketplace-b9zw5"
Oct 08 21:31:24 crc kubenswrapper[4669]: I1008 21:31:24.555586 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e22a8a1-79fa-400f-bfa6-7937f75d1ee1-utilities\") pod \"redhat-marketplace-b9zw5\" (UID: \"6e22a8a1-79fa-400f-bfa6-7937f75d1ee1\") " pod="openshift-marketplace/redhat-marketplace-b9zw5"
Oct 08 21:31:24 crc kubenswrapper[4669]: I1008 21:31:24.555690 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e22a8a1-79fa-400f-bfa6-7937f75d1ee1-catalog-content\") pod \"redhat-marketplace-b9zw5\" (UID: \"6e22a8a1-79fa-400f-bfa6-7937f75d1ee1\") " pod="openshift-marketplace/redhat-marketplace-b9zw5"
Oct 08 21:31:24 crc kubenswrapper[4669]: I1008 21:31:24.577677 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dsgv\" (UniqueName: \"kubernetes.io/projected/6e22a8a1-79fa-400f-bfa6-7937f75d1ee1-kube-api-access-2dsgv\") pod \"redhat-marketplace-b9zw5\" (UID: \"6e22a8a1-79fa-400f-bfa6-7937f75d1ee1\") " pod="openshift-marketplace/redhat-marketplace-b9zw5"
Oct 08 21:31:24 crc kubenswrapper[4669]: I1008 21:31:24.786012 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b9zw5"
Oct 08 21:31:25 crc kubenswrapper[4669]: W1008 21:31:25.274013 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e22a8a1_79fa_400f_bfa6_7937f75d1ee1.slice/crio-f262d03ec9c67387b7bedb26defae2cc0280634bb67848bcbfcc72d64031e992 WatchSource:0}: Error finding container f262d03ec9c67387b7bedb26defae2cc0280634bb67848bcbfcc72d64031e992: Status 404 returned error can't find the container with id f262d03ec9c67387b7bedb26defae2cc0280634bb67848bcbfcc72d64031e992
Oct 08 21:31:25 crc kubenswrapper[4669]: I1008 21:31:25.274811 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b9zw5"]
Oct 08 21:31:25 crc kubenswrapper[4669]: I1008 21:31:25.535015 4669 generic.go:334] "Generic (PLEG): container finished" podID="6e22a8a1-79fa-400f-bfa6-7937f75d1ee1" containerID="e2c19d51970998a6ee9360fc68c5d03ba95820501c74d43a2dca0469bd54eb4b" exitCode=0
Oct 08 21:31:25 crc kubenswrapper[4669]: I1008 21:31:25.535054 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b9zw5" event={"ID":"6e22a8a1-79fa-400f-bfa6-7937f75d1ee1","Type":"ContainerDied","Data":"e2c19d51970998a6ee9360fc68c5d03ba95820501c74d43a2dca0469bd54eb4b"}
Oct 08 21:31:25 crc kubenswrapper[4669]: I1008 21:31:25.535094 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b9zw5" event={"ID":"6e22a8a1-79fa-400f-bfa6-7937f75d1ee1","Type":"ContainerStarted","Data":"f262d03ec9c67387b7bedb26defae2cc0280634bb67848bcbfcc72d64031e992"}
Oct 08 21:31:27 crc kubenswrapper[4669]: I1008 21:31:27.553116 4669 generic.go:334] "Generic (PLEG): container finished" podID="6e22a8a1-79fa-400f-bfa6-7937f75d1ee1" containerID="70cac8db155ccc0347c6b8cf1f8b439a4d06ffd78338e11b435edbabcb21d89c" exitCode=0
Oct 08 21:31:27 crc kubenswrapper[4669]: I1008 21:31:27.553242 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b9zw5" event={"ID":"6e22a8a1-79fa-400f-bfa6-7937f75d1ee1","Type":"ContainerDied","Data":"70cac8db155ccc0347c6b8cf1f8b439a4d06ffd78338e11b435edbabcb21d89c"}
Oct 08 21:31:28 crc kubenswrapper[4669]: I1008 21:31:28.565123 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b9zw5" event={"ID":"6e22a8a1-79fa-400f-bfa6-7937f75d1ee1","Type":"ContainerStarted","Data":"2e25a7a79090a93435b1e10b3f4084be949ccd669c1a5479b61fa006742488ad"}
Oct 08 21:31:28 crc kubenswrapper[4669]: I1008 21:31:28.586614 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-b9zw5" podStartSLOduration=1.823940838 podStartE2EDuration="4.586592437s" podCreationTimestamp="2025-10-08 21:31:24 +0000 UTC" firstStartedPulling="2025-10-08 21:31:25.53761637 +0000 UTC m=+2805.230427043" lastFinishedPulling="2025-10-08 21:31:28.300267979 +0000 UTC m=+2807.993078642" observedRunningTime="2025-10-08 21:31:28.579453466 +0000 UTC m=+2808.272264139" watchObservedRunningTime="2025-10-08 21:31:28.586592437 +0000 UTC m=+2808.279403120"
Oct 08 21:31:34 crc kubenswrapper[4669]: I1008 21:31:34.787129 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-b9zw5"
Oct 08 21:31:34 crc kubenswrapper[4669]: I1008 21:31:34.787776 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-b9zw5"
Oct 08 21:31:34 crc kubenswrapper[4669]: I1008 21:31:34.854226 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-b9zw5"
Oct 08 21:31:35 crc kubenswrapper[4669]: I1008 21:31:35.706185 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-b9zw5"
Oct 08 21:31:35 crc kubenswrapper[4669]: I1008 21:31:35.762415 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b9zw5"]
Oct 08 21:31:37 crc kubenswrapper[4669]: I1008 21:31:37.651194 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-b9zw5" podUID="6e22a8a1-79fa-400f-bfa6-7937f75d1ee1" containerName="registry-server" containerID="cri-o://2e25a7a79090a93435b1e10b3f4084be949ccd669c1a5479b61fa006742488ad" gracePeriod=2
Oct 08 21:31:38 crc kubenswrapper[4669]: I1008 21:31:38.175786 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b9zw5"
Oct 08 21:31:38 crc kubenswrapper[4669]: I1008 21:31:38.230062 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e22a8a1-79fa-400f-bfa6-7937f75d1ee1-utilities\") pod \"6e22a8a1-79fa-400f-bfa6-7937f75d1ee1\" (UID: \"6e22a8a1-79fa-400f-bfa6-7937f75d1ee1\") "
Oct 08 21:31:38 crc kubenswrapper[4669]: I1008 21:31:38.230160 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e22a8a1-79fa-400f-bfa6-7937f75d1ee1-catalog-content\") pod \"6e22a8a1-79fa-400f-bfa6-7937f75d1ee1\" (UID: \"6e22a8a1-79fa-400f-bfa6-7937f75d1ee1\") "
Oct 08 21:31:38 crc kubenswrapper[4669]: I1008 21:31:38.230245 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dsgv\" (UniqueName: \"kubernetes.io/projected/6e22a8a1-79fa-400f-bfa6-7937f75d1ee1-kube-api-access-2dsgv\") pod \"6e22a8a1-79fa-400f-bfa6-7937f75d1ee1\" (UID: \"6e22a8a1-79fa-400f-bfa6-7937f75d1ee1\") "
Oct 08 21:31:38 crc kubenswrapper[4669]: I1008 21:31:38.231663 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e22a8a1-79fa-400f-bfa6-7937f75d1ee1-utilities" (OuterVolumeSpecName: "utilities") pod "6e22a8a1-79fa-400f-bfa6-7937f75d1ee1" (UID: "6e22a8a1-79fa-400f-bfa6-7937f75d1ee1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 21:31:38 crc kubenswrapper[4669]: I1008 21:31:38.241015 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e22a8a1-79fa-400f-bfa6-7937f75d1ee1-kube-api-access-2dsgv" (OuterVolumeSpecName: "kube-api-access-2dsgv") pod "6e22a8a1-79fa-400f-bfa6-7937f75d1ee1" (UID: "6e22a8a1-79fa-400f-bfa6-7937f75d1ee1"). InnerVolumeSpecName "kube-api-access-2dsgv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 21:31:38 crc kubenswrapper[4669]: I1008 21:31:38.246751 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e22a8a1-79fa-400f-bfa6-7937f75d1ee1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6e22a8a1-79fa-400f-bfa6-7937f75d1ee1" (UID: "6e22a8a1-79fa-400f-bfa6-7937f75d1ee1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 21:31:38 crc kubenswrapper[4669]: I1008 21:31:38.331935 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e22a8a1-79fa-400f-bfa6-7937f75d1ee1-utilities\") on node \"crc\" DevicePath \"\""
Oct 08 21:31:38 crc kubenswrapper[4669]: I1008 21:31:38.332163 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e22a8a1-79fa-400f-bfa6-7937f75d1ee1-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 08 21:31:38 crc kubenswrapper[4669]: I1008 21:31:38.332175 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dsgv\" (UniqueName: \"kubernetes.io/projected/6e22a8a1-79fa-400f-bfa6-7937f75d1ee1-kube-api-access-2dsgv\") on node \"crc\" DevicePath \"\""
Oct 08 21:31:38 crc kubenswrapper[4669]: I1008 21:31:38.666788 4669 generic.go:334] "Generic (PLEG): container finished" podID="6e22a8a1-79fa-400f-bfa6-7937f75d1ee1" containerID="2e25a7a79090a93435b1e10b3f4084be949ccd669c1a5479b61fa006742488ad" exitCode=0
Oct 08 21:31:38 crc kubenswrapper[4669]: I1008 21:31:38.666836 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b9zw5" event={"ID":"6e22a8a1-79fa-400f-bfa6-7937f75d1ee1","Type":"ContainerDied","Data":"2e25a7a79090a93435b1e10b3f4084be949ccd669c1a5479b61fa006742488ad"}
Oct 08 21:31:38 crc kubenswrapper[4669]: I1008 21:31:38.666855 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b9zw5"
Oct 08 21:31:38 crc kubenswrapper[4669]: I1008 21:31:38.666866 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b9zw5" event={"ID":"6e22a8a1-79fa-400f-bfa6-7937f75d1ee1","Type":"ContainerDied","Data":"f262d03ec9c67387b7bedb26defae2cc0280634bb67848bcbfcc72d64031e992"}
Oct 08 21:31:38 crc kubenswrapper[4669]: I1008 21:31:38.666885 4669 scope.go:117] "RemoveContainer" containerID="2e25a7a79090a93435b1e10b3f4084be949ccd669c1a5479b61fa006742488ad"
Oct 08 21:31:38 crc kubenswrapper[4669]: I1008 21:31:38.716837 4669 scope.go:117] "RemoveContainer" containerID="70cac8db155ccc0347c6b8cf1f8b439a4d06ffd78338e11b435edbabcb21d89c"
Oct 08 21:31:38 crc kubenswrapper[4669]: I1008 21:31:38.723592 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b9zw5"]
Oct 08 21:31:38 crc kubenswrapper[4669]: I1008 21:31:38.735845 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-b9zw5"]
Oct 08 21:31:38 crc kubenswrapper[4669]: I1008 21:31:38.794779 4669 scope.go:117] "RemoveContainer" containerID="e2c19d51970998a6ee9360fc68c5d03ba95820501c74d43a2dca0469bd54eb4b"
Oct 08 21:31:38 crc kubenswrapper[4669]: I1008 21:31:38.850949 4669 scope.go:117] "RemoveContainer" containerID="2e25a7a79090a93435b1e10b3f4084be949ccd669c1a5479b61fa006742488ad"
Oct 08 21:31:38 crc kubenswrapper[4669]: E1008 21:31:38.851397 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e25a7a79090a93435b1e10b3f4084be949ccd669c1a5479b61fa006742488ad\": container with ID starting with 2e25a7a79090a93435b1e10b3f4084be949ccd669c1a5479b61fa006742488ad not found: ID does not exist" containerID="2e25a7a79090a93435b1e10b3f4084be949ccd669c1a5479b61fa006742488ad"
Oct 08 21:31:38 crc kubenswrapper[4669]: I1008 21:31:38.851430 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e25a7a79090a93435b1e10b3f4084be949ccd669c1a5479b61fa006742488ad"} err="failed to get container status \"2e25a7a79090a93435b1e10b3f4084be949ccd669c1a5479b61fa006742488ad\": rpc error: code = NotFound desc = could not find container \"2e25a7a79090a93435b1e10b3f4084be949ccd669c1a5479b61fa006742488ad\": container with ID starting with 2e25a7a79090a93435b1e10b3f4084be949ccd669c1a5479b61fa006742488ad not found: ID does not exist"
Oct 08 21:31:38 crc kubenswrapper[4669]: I1008 21:31:38.851451 4669 scope.go:117] "RemoveContainer" containerID="70cac8db155ccc0347c6b8cf1f8b439a4d06ffd78338e11b435edbabcb21d89c"
Oct 08 21:31:38 crc kubenswrapper[4669]: E1008 21:31:38.853007 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70cac8db155ccc0347c6b8cf1f8b439a4d06ffd78338e11b435edbabcb21d89c\": container with ID starting with 70cac8db155ccc0347c6b8cf1f8b439a4d06ffd78338e11b435edbabcb21d89c not found: ID does not exist" containerID="70cac8db155ccc0347c6b8cf1f8b439a4d06ffd78338e11b435edbabcb21d89c"
Oct 08 21:31:38 crc kubenswrapper[4669]: I1008 21:31:38.853034 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70cac8db155ccc0347c6b8cf1f8b439a4d06ffd78338e11b435edbabcb21d89c"} err="failed to get container status \"70cac8db155ccc0347c6b8cf1f8b439a4d06ffd78338e11b435edbabcb21d89c\": rpc error: code = NotFound desc = could not find container \"70cac8db155ccc0347c6b8cf1f8b439a4d06ffd78338e11b435edbabcb21d89c\": container with ID starting with 70cac8db155ccc0347c6b8cf1f8b439a4d06ffd78338e11b435edbabcb21d89c not found: ID does not exist"
Oct 08 21:31:38 crc kubenswrapper[4669]: I1008 21:31:38.853049 4669 scope.go:117] "RemoveContainer" containerID="e2c19d51970998a6ee9360fc68c5d03ba95820501c74d43a2dca0469bd54eb4b"
Oct 08 21:31:38 crc kubenswrapper[4669]: E1008 21:31:38.853279 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2c19d51970998a6ee9360fc68c5d03ba95820501c74d43a2dca0469bd54eb4b\": container with ID starting with e2c19d51970998a6ee9360fc68c5d03ba95820501c74d43a2dca0469bd54eb4b not found: ID does not exist" containerID="e2c19d51970998a6ee9360fc68c5d03ba95820501c74d43a2dca0469bd54eb4b"
Oct 08 21:31:38 crc kubenswrapper[4669]: I1008 21:31:38.853297 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2c19d51970998a6ee9360fc68c5d03ba95820501c74d43a2dca0469bd54eb4b"} err="failed to get container status \"e2c19d51970998a6ee9360fc68c5d03ba95820501c74d43a2dca0469bd54eb4b\": rpc error: code = NotFound desc = could not find container \"e2c19d51970998a6ee9360fc68c5d03ba95820501c74d43a2dca0469bd54eb4b\": container with ID starting with e2c19d51970998a6ee9360fc68c5d03ba95820501c74d43a2dca0469bd54eb4b not found: ID does not exist"
Oct 08 21:31:39 crc kubenswrapper[4669]: I1008 21:31:39.345225 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e22a8a1-79fa-400f-bfa6-7937f75d1ee1" path="/var/lib/kubelet/pods/6e22a8a1-79fa-400f-bfa6-7937f75d1ee1/volumes"
Oct 08 21:31:43 crc kubenswrapper[4669]: I1008 21:31:43.185708 4669 patch_prober.go:28] interesting pod/machine-config-daemon-hw2kf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 08 21:31:43 crc kubenswrapper[4669]: I1008 21:31:43.186380 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 08 21:32:13 crc kubenswrapper[4669]: I1008 21:32:13.185838 4669 patch_prober.go:28] interesting pod/machine-config-daemon-hw2kf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 08 21:32:13 crc kubenswrapper[4669]: I1008 21:32:13.186348 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 08 21:32:13 crc kubenswrapper[4669]: I1008 21:32:13.186386 4669 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf"
Oct 08 21:32:13 crc kubenswrapper[4669]: I1008 21:32:13.187083 4669 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"945c8aba532a5316a189d943437ad63c04b6117c58cb6f632af5eb4c02b10393"} pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 08 21:32:13 crc kubenswrapper[4669]: I1008 21:32:13.187154 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" containerID="cri-o://945c8aba532a5316a189d943437ad63c04b6117c58cb6f632af5eb4c02b10393" gracePeriod=600
Oct 08 21:32:13 crc kubenswrapper[4669]: E1008 21:32:13.310864 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469"
Oct 08 21:32:14 crc kubenswrapper[4669]: I1008 21:32:14.024362 4669 generic.go:334] "Generic (PLEG): container finished" podID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerID="945c8aba532a5316a189d943437ad63c04b6117c58cb6f632af5eb4c02b10393" exitCode=0
Oct 08 21:32:14 crc kubenswrapper[4669]: I1008 21:32:14.024727 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" event={"ID":"39c9bcf2-9580-4534-8c7e-886bd4aff469","Type":"ContainerDied","Data":"945c8aba532a5316a189d943437ad63c04b6117c58cb6f632af5eb4c02b10393"}
Oct 08 21:32:14 crc kubenswrapper[4669]: I1008 21:32:14.024761 4669 scope.go:117] "RemoveContainer" containerID="60e75bf09f786d92a4220856b0c1ccdb7a5f5c2d86e9092f16ff831633898359"
Oct 08 21:32:14 crc kubenswrapper[4669]: I1008 21:32:14.025367 4669 scope.go:117] "RemoveContainer" containerID="945c8aba532a5316a189d943437ad63c04b6117c58cb6f632af5eb4c02b10393"
Oct 08 21:32:14 crc kubenswrapper[4669]: E1008 21:32:14.025665 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469"
Oct 08 21:32:26 crc kubenswrapper[4669]: I1008 21:32:26.330835 4669 scope.go:117] "RemoveContainer" containerID="945c8aba532a5316a189d943437ad63c04b6117c58cb6f632af5eb4c02b10393"
Oct 08 21:32:26 crc kubenswrapper[4669]: E1008 21:32:26.331601 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469"
Oct 08 21:32:40 crc kubenswrapper[4669]: I1008 21:32:40.331295 4669 scope.go:117] "RemoveContainer" containerID="945c8aba532a5316a189d943437ad63c04b6117c58cb6f632af5eb4c02b10393"
Oct 08 21:32:40 crc kubenswrapper[4669]: E1008 21:32:40.332024 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469"
Oct 08 21:32:51 crc kubenswrapper[4669]: I1008 21:32:51.342639 4669 scope.go:117] "RemoveContainer" containerID="945c8aba532a5316a189d943437ad63c04b6117c58cb6f632af5eb4c02b10393"
Oct 08 21:32:51 crc kubenswrapper[4669]: E1008 21:32:51.343707 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469"
Oct 08 21:33:05 crc kubenswrapper[4669]: I1008 21:33:05.334560 4669 scope.go:117] "RemoveContainer" containerID="945c8aba532a5316a189d943437ad63c04b6117c58cb6f632af5eb4c02b10393"
Oct 08 21:33:05 crc kubenswrapper[4669]: E1008 21:33:05.335575 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469"
Oct 08 21:33:17 crc kubenswrapper[4669]: I1008 21:33:17.331214 4669 scope.go:117] "RemoveContainer" containerID="945c8aba532a5316a189d943437ad63c04b6117c58cb6f632af5eb4c02b10393"
Oct 08 21:33:17 crc kubenswrapper[4669]: E1008 21:33:17.332094 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469"
Oct 08 21:33:31 crc kubenswrapper[4669]: I1008 21:33:31.336134 4669 scope.go:117] "RemoveContainer" containerID="945c8aba532a5316a189d943437ad63c04b6117c58cb6f632af5eb4c02b10393"
Oct 08 21:33:31 crc kubenswrapper[4669]: E1008 21:33:31.337369 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469"
Oct 08 21:33:45 crc kubenswrapper[4669]: I1008 21:33:45.331357 4669 scope.go:117] "RemoveContainer" containerID="945c8aba532a5316a189d943437ad63c04b6117c58cb6f632af5eb4c02b10393"
Oct 08 21:33:45 crc kubenswrapper[4669]: E1008 21:33:45.332101 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469"
Oct 08 21:33:59 crc kubenswrapper[4669]: I1008 21:33:59.330997 4669 scope.go:117] "RemoveContainer" containerID="945c8aba532a5316a189d943437ad63c04b6117c58cb6f632af5eb4c02b10393"
Oct 08 21:33:59 crc kubenswrapper[4669]: E1008 21:33:59.331733 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469"
Oct 08 21:34:13 crc kubenswrapper[4669]: I1008 21:34:13.331722 4669 scope.go:117] "RemoveContainer" containerID="945c8aba532a5316a189d943437ad63c04b6117c58cb6f632af5eb4c02b10393"
Oct 08 21:34:13 crc kubenswrapper[4669]: E1008 21:34:13.332684 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469"
Oct 08 21:34:24 crc kubenswrapper[4669]: I1008 21:34:24.330702 4669 scope.go:117] "RemoveContainer" containerID="945c8aba532a5316a189d943437ad63c04b6117c58cb6f632af5eb4c02b10393"
Oct 08 21:34:24 crc kubenswrapper[4669]: E1008 21:34:24.332784 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469"
Oct 08 21:34:35 crc kubenswrapper[4669]: I1008 21:34:35.330560 4669 scope.go:117] "RemoveContainer" containerID="945c8aba532a5316a189d943437ad63c04b6117c58cb6f632af5eb4c02b10393"
Oct 08 21:34:35 crc kubenswrapper[4669]: E1008 21:34:35.331220 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469"
Oct 08 21:34:49 crc kubenswrapper[4669]: I1008 21:34:49.332925 4669 scope.go:117] "RemoveContainer" containerID="945c8aba532a5316a189d943437ad63c04b6117c58cb6f632af5eb4c02b10393"
Oct 08 21:34:49 crc kubenswrapper[4669]: E1008 21:34:49.333822 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469"
Oct 08 21:35:00 crc kubenswrapper[4669]: I1008 21:35:00.330825 4669 scope.go:117] "RemoveContainer" containerID="945c8aba532a5316a189d943437ad63c04b6117c58cb6f632af5eb4c02b10393"
Oct 
08 21:35:00 crc kubenswrapper[4669]: E1008 21:35:00.331733 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:35:11 crc kubenswrapper[4669]: I1008 21:35:11.346269 4669 scope.go:117] "RemoveContainer" containerID="945c8aba532a5316a189d943437ad63c04b6117c58cb6f632af5eb4c02b10393" Oct 08 21:35:11 crc kubenswrapper[4669]: E1008 21:35:11.347485 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:35:23 crc kubenswrapper[4669]: I1008 21:35:23.330647 4669 scope.go:117] "RemoveContainer" containerID="945c8aba532a5316a189d943437ad63c04b6117c58cb6f632af5eb4c02b10393" Oct 08 21:35:23 crc kubenswrapper[4669]: E1008 21:35:23.331410 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:35:36 crc kubenswrapper[4669]: I1008 21:35:36.331762 4669 scope.go:117] "RemoveContainer" 
containerID="945c8aba532a5316a189d943437ad63c04b6117c58cb6f632af5eb4c02b10393" Oct 08 21:35:36 crc kubenswrapper[4669]: E1008 21:35:36.333236 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:35:50 crc kubenswrapper[4669]: I1008 21:35:50.330777 4669 scope.go:117] "RemoveContainer" containerID="945c8aba532a5316a189d943437ad63c04b6117c58cb6f632af5eb4c02b10393" Oct 08 21:35:50 crc kubenswrapper[4669]: E1008 21:35:50.331892 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:36:01 crc kubenswrapper[4669]: I1008 21:36:01.341260 4669 scope.go:117] "RemoveContainer" containerID="945c8aba532a5316a189d943437ad63c04b6117c58cb6f632af5eb4c02b10393" Oct 08 21:36:01 crc kubenswrapper[4669]: E1008 21:36:01.342094 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:36:15 crc kubenswrapper[4669]: I1008 21:36:15.331391 4669 scope.go:117] 
"RemoveContainer" containerID="945c8aba532a5316a189d943437ad63c04b6117c58cb6f632af5eb4c02b10393" Oct 08 21:36:15 crc kubenswrapper[4669]: E1008 21:36:15.332169 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:36:28 crc kubenswrapper[4669]: I1008 21:36:28.331035 4669 scope.go:117] "RemoveContainer" containerID="945c8aba532a5316a189d943437ad63c04b6117c58cb6f632af5eb4c02b10393" Oct 08 21:36:28 crc kubenswrapper[4669]: E1008 21:36:28.331783 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:36:41 crc kubenswrapper[4669]: I1008 21:36:41.337022 4669 scope.go:117] "RemoveContainer" containerID="945c8aba532a5316a189d943437ad63c04b6117c58cb6f632af5eb4c02b10393" Oct 08 21:36:41 crc kubenswrapper[4669]: E1008 21:36:41.337741 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:36:54 crc kubenswrapper[4669]: I1008 21:36:54.330874 
4669 scope.go:117] "RemoveContainer" containerID="945c8aba532a5316a189d943437ad63c04b6117c58cb6f632af5eb4c02b10393" Oct 08 21:36:54 crc kubenswrapper[4669]: E1008 21:36:54.331867 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:37:07 crc kubenswrapper[4669]: I1008 21:37:07.331522 4669 scope.go:117] "RemoveContainer" containerID="945c8aba532a5316a189d943437ad63c04b6117c58cb6f632af5eb4c02b10393" Oct 08 21:37:07 crc kubenswrapper[4669]: E1008 21:37:07.332419 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:37:19 crc kubenswrapper[4669]: I1008 21:37:19.331071 4669 scope.go:117] "RemoveContainer" containerID="945c8aba532a5316a189d943437ad63c04b6117c58cb6f632af5eb4c02b10393" Oct 08 21:37:20 crc kubenswrapper[4669]: I1008 21:37:20.156124 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" event={"ID":"39c9bcf2-9580-4534-8c7e-886bd4aff469","Type":"ContainerStarted","Data":"04b13f640358f5e7909dc5cd3cf3c01285108fb23136ef56f11945221a6b0387"} Oct 08 21:37:37 crc kubenswrapper[4669]: I1008 21:37:37.126466 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qznqc"] Oct 08 21:37:37 crc 
kubenswrapper[4669]: E1008 21:37:37.127423 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e22a8a1-79fa-400f-bfa6-7937f75d1ee1" containerName="extract-utilities" Oct 08 21:37:37 crc kubenswrapper[4669]: I1008 21:37:37.127437 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e22a8a1-79fa-400f-bfa6-7937f75d1ee1" containerName="extract-utilities" Oct 08 21:37:37 crc kubenswrapper[4669]: E1008 21:37:37.127452 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e22a8a1-79fa-400f-bfa6-7937f75d1ee1" containerName="registry-server" Oct 08 21:37:37 crc kubenswrapper[4669]: I1008 21:37:37.127458 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e22a8a1-79fa-400f-bfa6-7937f75d1ee1" containerName="registry-server" Oct 08 21:37:37 crc kubenswrapper[4669]: E1008 21:37:37.127468 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e22a8a1-79fa-400f-bfa6-7937f75d1ee1" containerName="extract-content" Oct 08 21:37:37 crc kubenswrapper[4669]: I1008 21:37:37.127474 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e22a8a1-79fa-400f-bfa6-7937f75d1ee1" containerName="extract-content" Oct 08 21:37:37 crc kubenswrapper[4669]: I1008 21:37:37.127694 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e22a8a1-79fa-400f-bfa6-7937f75d1ee1" containerName="registry-server" Oct 08 21:37:37 crc kubenswrapper[4669]: I1008 21:37:37.129175 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qznqc" Oct 08 21:37:37 crc kubenswrapper[4669]: I1008 21:37:37.135094 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qznqc"] Oct 08 21:37:37 crc kubenswrapper[4669]: I1008 21:37:37.202101 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77xkq\" (UniqueName: \"kubernetes.io/projected/784e1172-64f2-4ee5-b467-5b5f20d0db4c-kube-api-access-77xkq\") pod \"certified-operators-qznqc\" (UID: \"784e1172-64f2-4ee5-b467-5b5f20d0db4c\") " pod="openshift-marketplace/certified-operators-qznqc" Oct 08 21:37:37 crc kubenswrapper[4669]: I1008 21:37:37.202229 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/784e1172-64f2-4ee5-b467-5b5f20d0db4c-utilities\") pod \"certified-operators-qznqc\" (UID: \"784e1172-64f2-4ee5-b467-5b5f20d0db4c\") " pod="openshift-marketplace/certified-operators-qznqc" Oct 08 21:37:37 crc kubenswrapper[4669]: I1008 21:37:37.202260 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/784e1172-64f2-4ee5-b467-5b5f20d0db4c-catalog-content\") pod \"certified-operators-qznqc\" (UID: \"784e1172-64f2-4ee5-b467-5b5f20d0db4c\") " pod="openshift-marketplace/certified-operators-qznqc" Oct 08 21:37:37 crc kubenswrapper[4669]: I1008 21:37:37.303021 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/784e1172-64f2-4ee5-b467-5b5f20d0db4c-utilities\") pod \"certified-operators-qznqc\" (UID: \"784e1172-64f2-4ee5-b467-5b5f20d0db4c\") " pod="openshift-marketplace/certified-operators-qznqc" Oct 08 21:37:37 crc kubenswrapper[4669]: I1008 21:37:37.303071 4669 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/784e1172-64f2-4ee5-b467-5b5f20d0db4c-catalog-content\") pod \"certified-operators-qznqc\" (UID: \"784e1172-64f2-4ee5-b467-5b5f20d0db4c\") " pod="openshift-marketplace/certified-operators-qznqc" Oct 08 21:37:37 crc kubenswrapper[4669]: I1008 21:37:37.303145 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77xkq\" (UniqueName: \"kubernetes.io/projected/784e1172-64f2-4ee5-b467-5b5f20d0db4c-kube-api-access-77xkq\") pod \"certified-operators-qznqc\" (UID: \"784e1172-64f2-4ee5-b467-5b5f20d0db4c\") " pod="openshift-marketplace/certified-operators-qznqc" Oct 08 21:37:37 crc kubenswrapper[4669]: I1008 21:37:37.303956 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/784e1172-64f2-4ee5-b467-5b5f20d0db4c-utilities\") pod \"certified-operators-qznqc\" (UID: \"784e1172-64f2-4ee5-b467-5b5f20d0db4c\") " pod="openshift-marketplace/certified-operators-qznqc" Oct 08 21:37:37 crc kubenswrapper[4669]: I1008 21:37:37.303991 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/784e1172-64f2-4ee5-b467-5b5f20d0db4c-catalog-content\") pod \"certified-operators-qznqc\" (UID: \"784e1172-64f2-4ee5-b467-5b5f20d0db4c\") " pod="openshift-marketplace/certified-operators-qznqc" Oct 08 21:37:37 crc kubenswrapper[4669]: I1008 21:37:37.327391 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77xkq\" (UniqueName: \"kubernetes.io/projected/784e1172-64f2-4ee5-b467-5b5f20d0db4c-kube-api-access-77xkq\") pod \"certified-operators-qznqc\" (UID: \"784e1172-64f2-4ee5-b467-5b5f20d0db4c\") " pod="openshift-marketplace/certified-operators-qznqc" Oct 08 21:37:37 crc kubenswrapper[4669]: I1008 21:37:37.447085 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qznqc" Oct 08 21:37:38 crc kubenswrapper[4669]: I1008 21:37:38.034401 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qznqc"] Oct 08 21:37:38 crc kubenswrapper[4669]: W1008 21:37:38.042937 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod784e1172_64f2_4ee5_b467_5b5f20d0db4c.slice/crio-bc9d2c3315e95b8ba6bbdc74ec9cb7bf3de1e876010d6adf96f72369da7d4c46 WatchSource:0}: Error finding container bc9d2c3315e95b8ba6bbdc74ec9cb7bf3de1e876010d6adf96f72369da7d4c46: Status 404 returned error can't find the container with id bc9d2c3315e95b8ba6bbdc74ec9cb7bf3de1e876010d6adf96f72369da7d4c46 Oct 08 21:37:38 crc kubenswrapper[4669]: I1008 21:37:38.337317 4669 generic.go:334] "Generic (PLEG): container finished" podID="784e1172-64f2-4ee5-b467-5b5f20d0db4c" containerID="c8131f080d36ed200d9b2707deeb925f5977650ca909ebda8bddc9b687cfe0c8" exitCode=0 Oct 08 21:37:38 crc kubenswrapper[4669]: I1008 21:37:38.337366 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qznqc" event={"ID":"784e1172-64f2-4ee5-b467-5b5f20d0db4c","Type":"ContainerDied","Data":"c8131f080d36ed200d9b2707deeb925f5977650ca909ebda8bddc9b687cfe0c8"} Oct 08 21:37:38 crc kubenswrapper[4669]: I1008 21:37:38.337875 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qznqc" event={"ID":"784e1172-64f2-4ee5-b467-5b5f20d0db4c","Type":"ContainerStarted","Data":"bc9d2c3315e95b8ba6bbdc74ec9cb7bf3de1e876010d6adf96f72369da7d4c46"} Oct 08 21:37:38 crc kubenswrapper[4669]: I1008 21:37:38.338775 4669 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 21:37:39 crc kubenswrapper[4669]: I1008 21:37:39.348093 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-qznqc" event={"ID":"784e1172-64f2-4ee5-b467-5b5f20d0db4c","Type":"ContainerStarted","Data":"aca32dc074cd854a0798aba993c125155db4de928820bba5e3aee86d432e2e49"} Oct 08 21:37:40 crc kubenswrapper[4669]: I1008 21:37:40.362990 4669 generic.go:334] "Generic (PLEG): container finished" podID="784e1172-64f2-4ee5-b467-5b5f20d0db4c" containerID="aca32dc074cd854a0798aba993c125155db4de928820bba5e3aee86d432e2e49" exitCode=0 Oct 08 21:37:40 crc kubenswrapper[4669]: I1008 21:37:40.363037 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qznqc" event={"ID":"784e1172-64f2-4ee5-b467-5b5f20d0db4c","Type":"ContainerDied","Data":"aca32dc074cd854a0798aba993c125155db4de928820bba5e3aee86d432e2e49"} Oct 08 21:37:41 crc kubenswrapper[4669]: I1008 21:37:41.375676 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qznqc" event={"ID":"784e1172-64f2-4ee5-b467-5b5f20d0db4c","Type":"ContainerStarted","Data":"5577bed44d102de6a4ff96d3bcfbe36f9a9275de48fe1b597f5dd9aa00d8d11d"} Oct 08 21:37:41 crc kubenswrapper[4669]: I1008 21:37:41.408333 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qznqc" podStartSLOduration=1.8383703329999999 podStartE2EDuration="4.408314943s" podCreationTimestamp="2025-10-08 21:37:37 +0000 UTC" firstStartedPulling="2025-10-08 21:37:38.338453523 +0000 UTC m=+3178.031264196" lastFinishedPulling="2025-10-08 21:37:40.908398113 +0000 UTC m=+3180.601208806" observedRunningTime="2025-10-08 21:37:41.403772786 +0000 UTC m=+3181.096583459" watchObservedRunningTime="2025-10-08 21:37:41.408314943 +0000 UTC m=+3181.101125616" Oct 08 21:37:41 crc kubenswrapper[4669]: I1008 21:37:41.917158 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9zfgl"] Oct 08 21:37:41 crc kubenswrapper[4669]: I1008 21:37:41.919398 4669 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9zfgl" Oct 08 21:37:41 crc kubenswrapper[4669]: I1008 21:37:41.933771 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9zfgl"] Oct 08 21:37:42 crc kubenswrapper[4669]: I1008 21:37:42.016224 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e340051-cefe-4df2-9924-101fa04e3d27-utilities\") pod \"redhat-operators-9zfgl\" (UID: \"6e340051-cefe-4df2-9924-101fa04e3d27\") " pod="openshift-marketplace/redhat-operators-9zfgl" Oct 08 21:37:42 crc kubenswrapper[4669]: I1008 21:37:42.016592 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e340051-cefe-4df2-9924-101fa04e3d27-catalog-content\") pod \"redhat-operators-9zfgl\" (UID: \"6e340051-cefe-4df2-9924-101fa04e3d27\") " pod="openshift-marketplace/redhat-operators-9zfgl" Oct 08 21:37:42 crc kubenswrapper[4669]: I1008 21:37:42.016799 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjpbk\" (UniqueName: \"kubernetes.io/projected/6e340051-cefe-4df2-9924-101fa04e3d27-kube-api-access-jjpbk\") pod \"redhat-operators-9zfgl\" (UID: \"6e340051-cefe-4df2-9924-101fa04e3d27\") " pod="openshift-marketplace/redhat-operators-9zfgl" Oct 08 21:37:42 crc kubenswrapper[4669]: I1008 21:37:42.118829 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjpbk\" (UniqueName: \"kubernetes.io/projected/6e340051-cefe-4df2-9924-101fa04e3d27-kube-api-access-jjpbk\") pod \"redhat-operators-9zfgl\" (UID: \"6e340051-cefe-4df2-9924-101fa04e3d27\") " pod="openshift-marketplace/redhat-operators-9zfgl" Oct 08 21:37:42 crc kubenswrapper[4669]: I1008 21:37:42.119182 4669 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e340051-cefe-4df2-9924-101fa04e3d27-utilities\") pod \"redhat-operators-9zfgl\" (UID: \"6e340051-cefe-4df2-9924-101fa04e3d27\") " pod="openshift-marketplace/redhat-operators-9zfgl" Oct 08 21:37:42 crc kubenswrapper[4669]: I1008 21:37:42.119283 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e340051-cefe-4df2-9924-101fa04e3d27-catalog-content\") pod \"redhat-operators-9zfgl\" (UID: \"6e340051-cefe-4df2-9924-101fa04e3d27\") " pod="openshift-marketplace/redhat-operators-9zfgl" Oct 08 21:37:42 crc kubenswrapper[4669]: I1008 21:37:42.119702 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e340051-cefe-4df2-9924-101fa04e3d27-utilities\") pod \"redhat-operators-9zfgl\" (UID: \"6e340051-cefe-4df2-9924-101fa04e3d27\") " pod="openshift-marketplace/redhat-operators-9zfgl" Oct 08 21:37:42 crc kubenswrapper[4669]: I1008 21:37:42.119749 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e340051-cefe-4df2-9924-101fa04e3d27-catalog-content\") pod \"redhat-operators-9zfgl\" (UID: \"6e340051-cefe-4df2-9924-101fa04e3d27\") " pod="openshift-marketplace/redhat-operators-9zfgl" Oct 08 21:37:42 crc kubenswrapper[4669]: I1008 21:37:42.141618 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjpbk\" (UniqueName: \"kubernetes.io/projected/6e340051-cefe-4df2-9924-101fa04e3d27-kube-api-access-jjpbk\") pod \"redhat-operators-9zfgl\" (UID: \"6e340051-cefe-4df2-9924-101fa04e3d27\") " pod="openshift-marketplace/redhat-operators-9zfgl" Oct 08 21:37:42 crc kubenswrapper[4669]: I1008 21:37:42.249790 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9zfgl" Oct 08 21:37:42 crc kubenswrapper[4669]: I1008 21:37:42.729674 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9zfgl"] Oct 08 21:37:42 crc kubenswrapper[4669]: W1008 21:37:42.739933 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e340051_cefe_4df2_9924_101fa04e3d27.slice/crio-ed02c72201612047ee08150dcea8c69a7e53d3fb84ff2f9aff0c54aec5bb09c8 WatchSource:0}: Error finding container ed02c72201612047ee08150dcea8c69a7e53d3fb84ff2f9aff0c54aec5bb09c8: Status 404 returned error can't find the container with id ed02c72201612047ee08150dcea8c69a7e53d3fb84ff2f9aff0c54aec5bb09c8 Oct 08 21:37:43 crc kubenswrapper[4669]: I1008 21:37:43.409351 4669 generic.go:334] "Generic (PLEG): container finished" podID="6e340051-cefe-4df2-9924-101fa04e3d27" containerID="42f0eccb01e716717da3cbabf155ddeffe05a36ae712ad3c3c59ee9ef3b5bb99" exitCode=0 Oct 08 21:37:43 crc kubenswrapper[4669]: I1008 21:37:43.409406 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9zfgl" event={"ID":"6e340051-cefe-4df2-9924-101fa04e3d27","Type":"ContainerDied","Data":"42f0eccb01e716717da3cbabf155ddeffe05a36ae712ad3c3c59ee9ef3b5bb99"} Oct 08 21:37:43 crc kubenswrapper[4669]: I1008 21:37:43.409443 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9zfgl" event={"ID":"6e340051-cefe-4df2-9924-101fa04e3d27","Type":"ContainerStarted","Data":"ed02c72201612047ee08150dcea8c69a7e53d3fb84ff2f9aff0c54aec5bb09c8"} Oct 08 21:37:44 crc kubenswrapper[4669]: I1008 21:37:44.431357 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9zfgl" 
event={"ID":"6e340051-cefe-4df2-9924-101fa04e3d27","Type":"ContainerStarted","Data":"bd98229eae02792161a561842c8f257783a0b9c9e711fea6c290ec5fe635e7fa"} Oct 08 21:37:47 crc kubenswrapper[4669]: I1008 21:37:47.447681 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qznqc" Oct 08 21:37:47 crc kubenswrapper[4669]: I1008 21:37:47.448086 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qznqc" Oct 08 21:37:47 crc kubenswrapper[4669]: I1008 21:37:47.492196 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qznqc" Oct 08 21:37:47 crc kubenswrapper[4669]: I1008 21:37:47.544774 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qznqc" Oct 08 21:37:48 crc kubenswrapper[4669]: I1008 21:37:48.468083 4669 generic.go:334] "Generic (PLEG): container finished" podID="6e340051-cefe-4df2-9924-101fa04e3d27" containerID="bd98229eae02792161a561842c8f257783a0b9c9e711fea6c290ec5fe635e7fa" exitCode=0 Oct 08 21:37:48 crc kubenswrapper[4669]: I1008 21:37:48.468159 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9zfgl" event={"ID":"6e340051-cefe-4df2-9924-101fa04e3d27","Type":"ContainerDied","Data":"bd98229eae02792161a561842c8f257783a0b9c9e711fea6c290ec5fe635e7fa"} Oct 08 21:37:49 crc kubenswrapper[4669]: I1008 21:37:49.478379 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9zfgl" event={"ID":"6e340051-cefe-4df2-9924-101fa04e3d27","Type":"ContainerStarted","Data":"12e5ca1c9d9066fb9e5c43562b9f3e669350df9fd0ea4a2d67fa84f9fb72b292"} Oct 08 21:37:51 crc kubenswrapper[4669]: I1008 21:37:51.104459 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9zfgl" 
podStartSLOduration=4.65612291 podStartE2EDuration="10.104440688s" podCreationTimestamp="2025-10-08 21:37:41 +0000 UTC" firstStartedPulling="2025-10-08 21:37:43.411786769 +0000 UTC m=+3183.104597442" lastFinishedPulling="2025-10-08 21:37:48.860104537 +0000 UTC m=+3188.552915220" observedRunningTime="2025-10-08 21:37:49.499172933 +0000 UTC m=+3189.191983646" watchObservedRunningTime="2025-10-08 21:37:51.104440688 +0000 UTC m=+3190.797251361" Oct 08 21:37:51 crc kubenswrapper[4669]: I1008 21:37:51.108978 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qznqc"] Oct 08 21:37:51 crc kubenswrapper[4669]: I1008 21:37:51.109201 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qznqc" podUID="784e1172-64f2-4ee5-b467-5b5f20d0db4c" containerName="registry-server" containerID="cri-o://5577bed44d102de6a4ff96d3bcfbe36f9a9275de48fe1b597f5dd9aa00d8d11d" gracePeriod=2 Oct 08 21:37:51 crc kubenswrapper[4669]: I1008 21:37:51.511608 4669 generic.go:334] "Generic (PLEG): container finished" podID="784e1172-64f2-4ee5-b467-5b5f20d0db4c" containerID="5577bed44d102de6a4ff96d3bcfbe36f9a9275de48fe1b597f5dd9aa00d8d11d" exitCode=0 Oct 08 21:37:51 crc kubenswrapper[4669]: I1008 21:37:51.511730 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qznqc" event={"ID":"784e1172-64f2-4ee5-b467-5b5f20d0db4c","Type":"ContainerDied","Data":"5577bed44d102de6a4ff96d3bcfbe36f9a9275de48fe1b597f5dd9aa00d8d11d"} Oct 08 21:37:51 crc kubenswrapper[4669]: I1008 21:37:51.655013 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qznqc" Oct 08 21:37:51 crc kubenswrapper[4669]: I1008 21:37:51.692737 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77xkq\" (UniqueName: \"kubernetes.io/projected/784e1172-64f2-4ee5-b467-5b5f20d0db4c-kube-api-access-77xkq\") pod \"784e1172-64f2-4ee5-b467-5b5f20d0db4c\" (UID: \"784e1172-64f2-4ee5-b467-5b5f20d0db4c\") " Oct 08 21:37:51 crc kubenswrapper[4669]: I1008 21:37:51.692834 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/784e1172-64f2-4ee5-b467-5b5f20d0db4c-catalog-content\") pod \"784e1172-64f2-4ee5-b467-5b5f20d0db4c\" (UID: \"784e1172-64f2-4ee5-b467-5b5f20d0db4c\") " Oct 08 21:37:51 crc kubenswrapper[4669]: I1008 21:37:51.692976 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/784e1172-64f2-4ee5-b467-5b5f20d0db4c-utilities\") pod \"784e1172-64f2-4ee5-b467-5b5f20d0db4c\" (UID: \"784e1172-64f2-4ee5-b467-5b5f20d0db4c\") " Oct 08 21:37:51 crc kubenswrapper[4669]: I1008 21:37:51.694241 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/784e1172-64f2-4ee5-b467-5b5f20d0db4c-utilities" (OuterVolumeSpecName: "utilities") pod "784e1172-64f2-4ee5-b467-5b5f20d0db4c" (UID: "784e1172-64f2-4ee5-b467-5b5f20d0db4c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:37:51 crc kubenswrapper[4669]: I1008 21:37:51.702795 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/784e1172-64f2-4ee5-b467-5b5f20d0db4c-kube-api-access-77xkq" (OuterVolumeSpecName: "kube-api-access-77xkq") pod "784e1172-64f2-4ee5-b467-5b5f20d0db4c" (UID: "784e1172-64f2-4ee5-b467-5b5f20d0db4c"). InnerVolumeSpecName "kube-api-access-77xkq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:37:51 crc kubenswrapper[4669]: I1008 21:37:51.760262 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/784e1172-64f2-4ee5-b467-5b5f20d0db4c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "784e1172-64f2-4ee5-b467-5b5f20d0db4c" (UID: "784e1172-64f2-4ee5-b467-5b5f20d0db4c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:37:51 crc kubenswrapper[4669]: I1008 21:37:51.794965 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77xkq\" (UniqueName: \"kubernetes.io/projected/784e1172-64f2-4ee5-b467-5b5f20d0db4c-kube-api-access-77xkq\") on node \"crc\" DevicePath \"\"" Oct 08 21:37:51 crc kubenswrapper[4669]: I1008 21:37:51.794995 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/784e1172-64f2-4ee5-b467-5b5f20d0db4c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 21:37:51 crc kubenswrapper[4669]: I1008 21:37:51.795003 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/784e1172-64f2-4ee5-b467-5b5f20d0db4c-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 21:37:52 crc kubenswrapper[4669]: I1008 21:37:52.250371 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9zfgl" Oct 08 21:37:52 crc kubenswrapper[4669]: I1008 21:37:52.250446 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9zfgl" Oct 08 21:37:52 crc kubenswrapper[4669]: I1008 21:37:52.523862 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qznqc" 
event={"ID":"784e1172-64f2-4ee5-b467-5b5f20d0db4c","Type":"ContainerDied","Data":"bc9d2c3315e95b8ba6bbdc74ec9cb7bf3de1e876010d6adf96f72369da7d4c46"} Oct 08 21:37:52 crc kubenswrapper[4669]: I1008 21:37:52.524211 4669 scope.go:117] "RemoveContainer" containerID="5577bed44d102de6a4ff96d3bcfbe36f9a9275de48fe1b597f5dd9aa00d8d11d" Oct 08 21:37:52 crc kubenswrapper[4669]: I1008 21:37:52.523949 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qznqc" Oct 08 21:37:52 crc kubenswrapper[4669]: I1008 21:37:52.559490 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qznqc"] Oct 08 21:37:52 crc kubenswrapper[4669]: I1008 21:37:52.563352 4669 scope.go:117] "RemoveContainer" containerID="aca32dc074cd854a0798aba993c125155db4de928820bba5e3aee86d432e2e49" Oct 08 21:37:52 crc kubenswrapper[4669]: I1008 21:37:52.572227 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qznqc"] Oct 08 21:37:52 crc kubenswrapper[4669]: I1008 21:37:52.597616 4669 scope.go:117] "RemoveContainer" containerID="c8131f080d36ed200d9b2707deeb925f5977650ca909ebda8bddc9b687cfe0c8" Oct 08 21:37:53 crc kubenswrapper[4669]: I1008 21:37:53.310695 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9zfgl" podUID="6e340051-cefe-4df2-9924-101fa04e3d27" containerName="registry-server" probeResult="failure" output=< Oct 08 21:37:53 crc kubenswrapper[4669]: timeout: failed to connect service ":50051" within 1s Oct 08 21:37:53 crc kubenswrapper[4669]: > Oct 08 21:37:53 crc kubenswrapper[4669]: I1008 21:37:53.341141 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="784e1172-64f2-4ee5-b467-5b5f20d0db4c" path="/var/lib/kubelet/pods/784e1172-64f2-4ee5-b467-5b5f20d0db4c/volumes" Oct 08 21:38:02 crc kubenswrapper[4669]: I1008 21:38:02.301939 4669 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9zfgl" Oct 08 21:38:02 crc kubenswrapper[4669]: I1008 21:38:02.365401 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9zfgl" Oct 08 21:38:03 crc kubenswrapper[4669]: I1008 21:38:03.265925 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9zfgl"] Oct 08 21:38:03 crc kubenswrapper[4669]: I1008 21:38:03.623611 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9zfgl" podUID="6e340051-cefe-4df2-9924-101fa04e3d27" containerName="registry-server" containerID="cri-o://12e5ca1c9d9066fb9e5c43562b9f3e669350df9fd0ea4a2d67fa84f9fb72b292" gracePeriod=2 Oct 08 21:38:04 crc kubenswrapper[4669]: I1008 21:38:04.087322 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9zfgl" Oct 08 21:38:04 crc kubenswrapper[4669]: I1008 21:38:04.258659 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjpbk\" (UniqueName: \"kubernetes.io/projected/6e340051-cefe-4df2-9924-101fa04e3d27-kube-api-access-jjpbk\") pod \"6e340051-cefe-4df2-9924-101fa04e3d27\" (UID: \"6e340051-cefe-4df2-9924-101fa04e3d27\") " Oct 08 21:38:04 crc kubenswrapper[4669]: I1008 21:38:04.258977 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e340051-cefe-4df2-9924-101fa04e3d27-utilities\") pod \"6e340051-cefe-4df2-9924-101fa04e3d27\" (UID: \"6e340051-cefe-4df2-9924-101fa04e3d27\") " Oct 08 21:38:04 crc kubenswrapper[4669]: I1008 21:38:04.259041 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e340051-cefe-4df2-9924-101fa04e3d27-catalog-content\") pod 
\"6e340051-cefe-4df2-9924-101fa04e3d27\" (UID: \"6e340051-cefe-4df2-9924-101fa04e3d27\") " Oct 08 21:38:04 crc kubenswrapper[4669]: I1008 21:38:04.260039 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e340051-cefe-4df2-9924-101fa04e3d27-utilities" (OuterVolumeSpecName: "utilities") pod "6e340051-cefe-4df2-9924-101fa04e3d27" (UID: "6e340051-cefe-4df2-9924-101fa04e3d27"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:38:04 crc kubenswrapper[4669]: I1008 21:38:04.266772 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e340051-cefe-4df2-9924-101fa04e3d27-kube-api-access-jjpbk" (OuterVolumeSpecName: "kube-api-access-jjpbk") pod "6e340051-cefe-4df2-9924-101fa04e3d27" (UID: "6e340051-cefe-4df2-9924-101fa04e3d27"). InnerVolumeSpecName "kube-api-access-jjpbk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:38:04 crc kubenswrapper[4669]: I1008 21:38:04.341981 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e340051-cefe-4df2-9924-101fa04e3d27-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6e340051-cefe-4df2-9924-101fa04e3d27" (UID: "6e340051-cefe-4df2-9924-101fa04e3d27"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:38:04 crc kubenswrapper[4669]: I1008 21:38:04.361505 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjpbk\" (UniqueName: \"kubernetes.io/projected/6e340051-cefe-4df2-9924-101fa04e3d27-kube-api-access-jjpbk\") on node \"crc\" DevicePath \"\"" Oct 08 21:38:04 crc kubenswrapper[4669]: I1008 21:38:04.361572 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e340051-cefe-4df2-9924-101fa04e3d27-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 21:38:04 crc kubenswrapper[4669]: I1008 21:38:04.361587 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e340051-cefe-4df2-9924-101fa04e3d27-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 21:38:04 crc kubenswrapper[4669]: I1008 21:38:04.635641 4669 generic.go:334] "Generic (PLEG): container finished" podID="6e340051-cefe-4df2-9924-101fa04e3d27" containerID="12e5ca1c9d9066fb9e5c43562b9f3e669350df9fd0ea4a2d67fa84f9fb72b292" exitCode=0 Oct 08 21:38:04 crc kubenswrapper[4669]: I1008 21:38:04.635955 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9zfgl" event={"ID":"6e340051-cefe-4df2-9924-101fa04e3d27","Type":"ContainerDied","Data":"12e5ca1c9d9066fb9e5c43562b9f3e669350df9fd0ea4a2d67fa84f9fb72b292"} Oct 08 21:38:04 crc kubenswrapper[4669]: I1008 21:38:04.635986 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9zfgl" event={"ID":"6e340051-cefe-4df2-9924-101fa04e3d27","Type":"ContainerDied","Data":"ed02c72201612047ee08150dcea8c69a7e53d3fb84ff2f9aff0c54aec5bb09c8"} Oct 08 21:38:04 crc kubenswrapper[4669]: I1008 21:38:04.636006 4669 scope.go:117] "RemoveContainer" containerID="12e5ca1c9d9066fb9e5c43562b9f3e669350df9fd0ea4a2d67fa84f9fb72b292" Oct 08 21:38:04 crc kubenswrapper[4669]: I1008 21:38:04.636149 
4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9zfgl" Oct 08 21:38:04 crc kubenswrapper[4669]: I1008 21:38:04.677418 4669 scope.go:117] "RemoveContainer" containerID="bd98229eae02792161a561842c8f257783a0b9c9e711fea6c290ec5fe635e7fa" Oct 08 21:38:04 crc kubenswrapper[4669]: I1008 21:38:04.683165 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9zfgl"] Oct 08 21:38:04 crc kubenswrapper[4669]: I1008 21:38:04.694987 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9zfgl"] Oct 08 21:38:04 crc kubenswrapper[4669]: I1008 21:38:04.705842 4669 scope.go:117] "RemoveContainer" containerID="42f0eccb01e716717da3cbabf155ddeffe05a36ae712ad3c3c59ee9ef3b5bb99" Oct 08 21:38:04 crc kubenswrapper[4669]: I1008 21:38:04.749760 4669 scope.go:117] "RemoveContainer" containerID="12e5ca1c9d9066fb9e5c43562b9f3e669350df9fd0ea4a2d67fa84f9fb72b292" Oct 08 21:38:04 crc kubenswrapper[4669]: E1008 21:38:04.750221 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12e5ca1c9d9066fb9e5c43562b9f3e669350df9fd0ea4a2d67fa84f9fb72b292\": container with ID starting with 12e5ca1c9d9066fb9e5c43562b9f3e669350df9fd0ea4a2d67fa84f9fb72b292 not found: ID does not exist" containerID="12e5ca1c9d9066fb9e5c43562b9f3e669350df9fd0ea4a2d67fa84f9fb72b292" Oct 08 21:38:04 crc kubenswrapper[4669]: I1008 21:38:04.750256 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12e5ca1c9d9066fb9e5c43562b9f3e669350df9fd0ea4a2d67fa84f9fb72b292"} err="failed to get container status \"12e5ca1c9d9066fb9e5c43562b9f3e669350df9fd0ea4a2d67fa84f9fb72b292\": rpc error: code = NotFound desc = could not find container \"12e5ca1c9d9066fb9e5c43562b9f3e669350df9fd0ea4a2d67fa84f9fb72b292\": container with ID starting with 
12e5ca1c9d9066fb9e5c43562b9f3e669350df9fd0ea4a2d67fa84f9fb72b292 not found: ID does not exist" Oct 08 21:38:04 crc kubenswrapper[4669]: I1008 21:38:04.750279 4669 scope.go:117] "RemoveContainer" containerID="bd98229eae02792161a561842c8f257783a0b9c9e711fea6c290ec5fe635e7fa" Oct 08 21:38:04 crc kubenswrapper[4669]: E1008 21:38:04.750860 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd98229eae02792161a561842c8f257783a0b9c9e711fea6c290ec5fe635e7fa\": container with ID starting with bd98229eae02792161a561842c8f257783a0b9c9e711fea6c290ec5fe635e7fa not found: ID does not exist" containerID="bd98229eae02792161a561842c8f257783a0b9c9e711fea6c290ec5fe635e7fa" Oct 08 21:38:04 crc kubenswrapper[4669]: I1008 21:38:04.750899 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd98229eae02792161a561842c8f257783a0b9c9e711fea6c290ec5fe635e7fa"} err="failed to get container status \"bd98229eae02792161a561842c8f257783a0b9c9e711fea6c290ec5fe635e7fa\": rpc error: code = NotFound desc = could not find container \"bd98229eae02792161a561842c8f257783a0b9c9e711fea6c290ec5fe635e7fa\": container with ID starting with bd98229eae02792161a561842c8f257783a0b9c9e711fea6c290ec5fe635e7fa not found: ID does not exist" Oct 08 21:38:04 crc kubenswrapper[4669]: I1008 21:38:04.750920 4669 scope.go:117] "RemoveContainer" containerID="42f0eccb01e716717da3cbabf155ddeffe05a36ae712ad3c3c59ee9ef3b5bb99" Oct 08 21:38:04 crc kubenswrapper[4669]: E1008 21:38:04.751176 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42f0eccb01e716717da3cbabf155ddeffe05a36ae712ad3c3c59ee9ef3b5bb99\": container with ID starting with 42f0eccb01e716717da3cbabf155ddeffe05a36ae712ad3c3c59ee9ef3b5bb99 not found: ID does not exist" containerID="42f0eccb01e716717da3cbabf155ddeffe05a36ae712ad3c3c59ee9ef3b5bb99" Oct 08 21:38:04 crc 
kubenswrapper[4669]: I1008 21:38:04.751204 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42f0eccb01e716717da3cbabf155ddeffe05a36ae712ad3c3c59ee9ef3b5bb99"} err="failed to get container status \"42f0eccb01e716717da3cbabf155ddeffe05a36ae712ad3c3c59ee9ef3b5bb99\": rpc error: code = NotFound desc = could not find container \"42f0eccb01e716717da3cbabf155ddeffe05a36ae712ad3c3c59ee9ef3b5bb99\": container with ID starting with 42f0eccb01e716717da3cbabf155ddeffe05a36ae712ad3c3c59ee9ef3b5bb99 not found: ID does not exist" Oct 08 21:38:05 crc kubenswrapper[4669]: I1008 21:38:05.346440 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e340051-cefe-4df2-9924-101fa04e3d27" path="/var/lib/kubelet/pods/6e340051-cefe-4df2-9924-101fa04e3d27/volumes" Oct 08 21:39:43 crc kubenswrapper[4669]: I1008 21:39:43.185223 4669 patch_prober.go:28] interesting pod/machine-config-daemon-hw2kf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 21:39:43 crc kubenswrapper[4669]: I1008 21:39:43.185984 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 21:40:13 crc kubenswrapper[4669]: I1008 21:40:13.185156 4669 patch_prober.go:28] interesting pod/machine-config-daemon-hw2kf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 21:40:13 crc kubenswrapper[4669]: I1008 21:40:13.185747 4669 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 21:40:43 crc kubenswrapper[4669]: I1008 21:40:43.185064 4669 patch_prober.go:28] interesting pod/machine-config-daemon-hw2kf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 21:40:43 crc kubenswrapper[4669]: I1008 21:40:43.185891 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 21:40:43 crc kubenswrapper[4669]: I1008 21:40:43.185955 4669 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" Oct 08 21:40:43 crc kubenswrapper[4669]: I1008 21:40:43.187159 4669 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"04b13f640358f5e7909dc5cd3cf3c01285108fb23136ef56f11945221a6b0387"} pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 21:40:43 crc kubenswrapper[4669]: I1008 21:40:43.187269 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" 
containerName="machine-config-daemon" containerID="cri-o://04b13f640358f5e7909dc5cd3cf3c01285108fb23136ef56f11945221a6b0387" gracePeriod=600 Oct 08 21:40:44 crc kubenswrapper[4669]: I1008 21:40:44.257119 4669 generic.go:334] "Generic (PLEG): container finished" podID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerID="04b13f640358f5e7909dc5cd3cf3c01285108fb23136ef56f11945221a6b0387" exitCode=0 Oct 08 21:40:44 crc kubenswrapper[4669]: I1008 21:40:44.257184 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" event={"ID":"39c9bcf2-9580-4534-8c7e-886bd4aff469","Type":"ContainerDied","Data":"04b13f640358f5e7909dc5cd3cf3c01285108fb23136ef56f11945221a6b0387"} Oct 08 21:40:44 crc kubenswrapper[4669]: I1008 21:40:44.257791 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" event={"ID":"39c9bcf2-9580-4534-8c7e-886bd4aff469","Type":"ContainerStarted","Data":"d99f2404f55700ee94ff2a27e207059d96fd41084a3b30b6aa74eb587d837286"} Oct 08 21:40:44 crc kubenswrapper[4669]: I1008 21:40:44.257818 4669 scope.go:117] "RemoveContainer" containerID="945c8aba532a5316a189d943437ad63c04b6117c58cb6f632af5eb4c02b10393" Oct 08 21:41:26 crc kubenswrapper[4669]: I1008 21:41:26.188734 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qqsvf"] Oct 08 21:41:26 crc kubenswrapper[4669]: E1008 21:41:26.189762 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e340051-cefe-4df2-9924-101fa04e3d27" containerName="extract-utilities" Oct 08 21:41:26 crc kubenswrapper[4669]: I1008 21:41:26.189779 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e340051-cefe-4df2-9924-101fa04e3d27" containerName="extract-utilities" Oct 08 21:41:26 crc kubenswrapper[4669]: E1008 21:41:26.189801 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e340051-cefe-4df2-9924-101fa04e3d27" 
containerName="extract-content" Oct 08 21:41:26 crc kubenswrapper[4669]: I1008 21:41:26.189809 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e340051-cefe-4df2-9924-101fa04e3d27" containerName="extract-content" Oct 08 21:41:26 crc kubenswrapper[4669]: E1008 21:41:26.189829 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="784e1172-64f2-4ee5-b467-5b5f20d0db4c" containerName="extract-utilities" Oct 08 21:41:26 crc kubenswrapper[4669]: I1008 21:41:26.189837 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="784e1172-64f2-4ee5-b467-5b5f20d0db4c" containerName="extract-utilities" Oct 08 21:41:26 crc kubenswrapper[4669]: E1008 21:41:26.189861 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e340051-cefe-4df2-9924-101fa04e3d27" containerName="registry-server" Oct 08 21:41:26 crc kubenswrapper[4669]: I1008 21:41:26.189870 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e340051-cefe-4df2-9924-101fa04e3d27" containerName="registry-server" Oct 08 21:41:26 crc kubenswrapper[4669]: E1008 21:41:26.189885 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="784e1172-64f2-4ee5-b467-5b5f20d0db4c" containerName="extract-content" Oct 08 21:41:26 crc kubenswrapper[4669]: I1008 21:41:26.189893 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="784e1172-64f2-4ee5-b467-5b5f20d0db4c" containerName="extract-content" Oct 08 21:41:26 crc kubenswrapper[4669]: E1008 21:41:26.189906 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="784e1172-64f2-4ee5-b467-5b5f20d0db4c" containerName="registry-server" Oct 08 21:41:26 crc kubenswrapper[4669]: I1008 21:41:26.189916 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="784e1172-64f2-4ee5-b467-5b5f20d0db4c" containerName="registry-server" Oct 08 21:41:26 crc kubenswrapper[4669]: I1008 21:41:26.190169 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e340051-cefe-4df2-9924-101fa04e3d27" 
containerName="registry-server" Oct 08 21:41:26 crc kubenswrapper[4669]: I1008 21:41:26.190184 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="784e1172-64f2-4ee5-b467-5b5f20d0db4c" containerName="registry-server" Oct 08 21:41:26 crc kubenswrapper[4669]: I1008 21:41:26.191799 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qqsvf" Oct 08 21:41:26 crc kubenswrapper[4669]: I1008 21:41:26.209730 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qqsvf"] Oct 08 21:41:26 crc kubenswrapper[4669]: I1008 21:41:26.323049 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3805932-16ff-49c5-b43a-ee24a36da57b-catalog-content\") pod \"redhat-marketplace-qqsvf\" (UID: \"f3805932-16ff-49c5-b43a-ee24a36da57b\") " pod="openshift-marketplace/redhat-marketplace-qqsvf" Oct 08 21:41:26 crc kubenswrapper[4669]: I1008 21:41:26.323113 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzrz7\" (UniqueName: \"kubernetes.io/projected/f3805932-16ff-49c5-b43a-ee24a36da57b-kube-api-access-xzrz7\") pod \"redhat-marketplace-qqsvf\" (UID: \"f3805932-16ff-49c5-b43a-ee24a36da57b\") " pod="openshift-marketplace/redhat-marketplace-qqsvf" Oct 08 21:41:26 crc kubenswrapper[4669]: I1008 21:41:26.323306 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3805932-16ff-49c5-b43a-ee24a36da57b-utilities\") pod \"redhat-marketplace-qqsvf\" (UID: \"f3805932-16ff-49c5-b43a-ee24a36da57b\") " pod="openshift-marketplace/redhat-marketplace-qqsvf" Oct 08 21:41:26 crc kubenswrapper[4669]: I1008 21:41:26.425212 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/f3805932-16ff-49c5-b43a-ee24a36da57b-utilities\") pod \"redhat-marketplace-qqsvf\" (UID: \"f3805932-16ff-49c5-b43a-ee24a36da57b\") " pod="openshift-marketplace/redhat-marketplace-qqsvf" Oct 08 21:41:26 crc kubenswrapper[4669]: I1008 21:41:26.425416 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3805932-16ff-49c5-b43a-ee24a36da57b-catalog-content\") pod \"redhat-marketplace-qqsvf\" (UID: \"f3805932-16ff-49c5-b43a-ee24a36da57b\") " pod="openshift-marketplace/redhat-marketplace-qqsvf" Oct 08 21:41:26 crc kubenswrapper[4669]: I1008 21:41:26.425469 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzrz7\" (UniqueName: \"kubernetes.io/projected/f3805932-16ff-49c5-b43a-ee24a36da57b-kube-api-access-xzrz7\") pod \"redhat-marketplace-qqsvf\" (UID: \"f3805932-16ff-49c5-b43a-ee24a36da57b\") " pod="openshift-marketplace/redhat-marketplace-qqsvf" Oct 08 21:41:26 crc kubenswrapper[4669]: I1008 21:41:26.426021 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3805932-16ff-49c5-b43a-ee24a36da57b-catalog-content\") pod \"redhat-marketplace-qqsvf\" (UID: \"f3805932-16ff-49c5-b43a-ee24a36da57b\") " pod="openshift-marketplace/redhat-marketplace-qqsvf" Oct 08 21:41:26 crc kubenswrapper[4669]: I1008 21:41:26.426093 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3805932-16ff-49c5-b43a-ee24a36da57b-utilities\") pod \"redhat-marketplace-qqsvf\" (UID: \"f3805932-16ff-49c5-b43a-ee24a36da57b\") " pod="openshift-marketplace/redhat-marketplace-qqsvf" Oct 08 21:41:26 crc kubenswrapper[4669]: I1008 21:41:26.444142 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzrz7\" (UniqueName: 
\"kubernetes.io/projected/f3805932-16ff-49c5-b43a-ee24a36da57b-kube-api-access-xzrz7\") pod \"redhat-marketplace-qqsvf\" (UID: \"f3805932-16ff-49c5-b43a-ee24a36da57b\") " pod="openshift-marketplace/redhat-marketplace-qqsvf" Oct 08 21:41:26 crc kubenswrapper[4669]: I1008 21:41:26.509417 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qqsvf" Oct 08 21:41:27 crc kubenswrapper[4669]: I1008 21:41:27.005493 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qqsvf"] Oct 08 21:41:27 crc kubenswrapper[4669]: I1008 21:41:27.687127 4669 generic.go:334] "Generic (PLEG): container finished" podID="f3805932-16ff-49c5-b43a-ee24a36da57b" containerID="4617d82fcfa6650efc8c910355ed1a2a782b3ed687918556cda741d2e69e32db" exitCode=0 Oct 08 21:41:27 crc kubenswrapper[4669]: I1008 21:41:27.687239 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qqsvf" event={"ID":"f3805932-16ff-49c5-b43a-ee24a36da57b","Type":"ContainerDied","Data":"4617d82fcfa6650efc8c910355ed1a2a782b3ed687918556cda741d2e69e32db"} Oct 08 21:41:27 crc kubenswrapper[4669]: I1008 21:41:27.687801 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qqsvf" event={"ID":"f3805932-16ff-49c5-b43a-ee24a36da57b","Type":"ContainerStarted","Data":"48d6c1f29ff56629efc38eda8e712e19623cd272fddec0e8ddf0dc013d54053e"} Oct 08 21:41:28 crc kubenswrapper[4669]: I1008 21:41:28.700385 4669 generic.go:334] "Generic (PLEG): container finished" podID="f3805932-16ff-49c5-b43a-ee24a36da57b" containerID="a46b797804394bf994a321a92fd895cebe364889717cf9cc2148e30fbc025d18" exitCode=0 Oct 08 21:41:28 crc kubenswrapper[4669]: I1008 21:41:28.700439 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qqsvf" 
event={"ID":"f3805932-16ff-49c5-b43a-ee24a36da57b","Type":"ContainerDied","Data":"a46b797804394bf994a321a92fd895cebe364889717cf9cc2148e30fbc025d18"} Oct 08 21:41:29 crc kubenswrapper[4669]: I1008 21:41:29.729175 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qqsvf" event={"ID":"f3805932-16ff-49c5-b43a-ee24a36da57b","Type":"ContainerStarted","Data":"17ac764065ba4c92f423c2fb506763ed055f824a42362bc2fb60d38be65f8052"} Oct 08 21:41:29 crc kubenswrapper[4669]: I1008 21:41:29.754294 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qqsvf" podStartSLOduration=2.104110962 podStartE2EDuration="3.75427696s" podCreationTimestamp="2025-10-08 21:41:26 +0000 UTC" firstStartedPulling="2025-10-08 21:41:27.696876609 +0000 UTC m=+3407.389687322" lastFinishedPulling="2025-10-08 21:41:29.347042637 +0000 UTC m=+3409.039853320" observedRunningTime="2025-10-08 21:41:29.746437591 +0000 UTC m=+3409.439248284" watchObservedRunningTime="2025-10-08 21:41:29.75427696 +0000 UTC m=+3409.447087633" Oct 08 21:41:36 crc kubenswrapper[4669]: I1008 21:41:36.510449 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qqsvf" Oct 08 21:41:36 crc kubenswrapper[4669]: I1008 21:41:36.511146 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qqsvf" Oct 08 21:41:36 crc kubenswrapper[4669]: I1008 21:41:36.562134 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qqsvf" Oct 08 21:41:36 crc kubenswrapper[4669]: I1008 21:41:36.852426 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qqsvf" Oct 08 21:41:36 crc kubenswrapper[4669]: I1008 21:41:36.904827 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-qqsvf"] Oct 08 21:41:38 crc kubenswrapper[4669]: I1008 21:41:38.812611 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qqsvf" podUID="f3805932-16ff-49c5-b43a-ee24a36da57b" containerName="registry-server" containerID="cri-o://17ac764065ba4c92f423c2fb506763ed055f824a42362bc2fb60d38be65f8052" gracePeriod=2 Oct 08 21:41:39 crc kubenswrapper[4669]: I1008 21:41:39.355070 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qqsvf" Oct 08 21:41:39 crc kubenswrapper[4669]: I1008 21:41:39.485505 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzrz7\" (UniqueName: \"kubernetes.io/projected/f3805932-16ff-49c5-b43a-ee24a36da57b-kube-api-access-xzrz7\") pod \"f3805932-16ff-49c5-b43a-ee24a36da57b\" (UID: \"f3805932-16ff-49c5-b43a-ee24a36da57b\") " Oct 08 21:41:39 crc kubenswrapper[4669]: I1008 21:41:39.485703 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3805932-16ff-49c5-b43a-ee24a36da57b-utilities\") pod \"f3805932-16ff-49c5-b43a-ee24a36da57b\" (UID: \"f3805932-16ff-49c5-b43a-ee24a36da57b\") " Oct 08 21:41:39 crc kubenswrapper[4669]: I1008 21:41:39.485852 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3805932-16ff-49c5-b43a-ee24a36da57b-catalog-content\") pod \"f3805932-16ff-49c5-b43a-ee24a36da57b\" (UID: \"f3805932-16ff-49c5-b43a-ee24a36da57b\") " Oct 08 21:41:39 crc kubenswrapper[4669]: I1008 21:41:39.486857 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3805932-16ff-49c5-b43a-ee24a36da57b-utilities" (OuterVolumeSpecName: "utilities") pod "f3805932-16ff-49c5-b43a-ee24a36da57b" (UID: 
"f3805932-16ff-49c5-b43a-ee24a36da57b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:41:39 crc kubenswrapper[4669]: I1008 21:41:39.498294 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3805932-16ff-49c5-b43a-ee24a36da57b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f3805932-16ff-49c5-b43a-ee24a36da57b" (UID: "f3805932-16ff-49c5-b43a-ee24a36da57b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:41:39 crc kubenswrapper[4669]: I1008 21:41:39.498352 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3805932-16ff-49c5-b43a-ee24a36da57b-kube-api-access-xzrz7" (OuterVolumeSpecName: "kube-api-access-xzrz7") pod "f3805932-16ff-49c5-b43a-ee24a36da57b" (UID: "f3805932-16ff-49c5-b43a-ee24a36da57b"). InnerVolumeSpecName "kube-api-access-xzrz7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:41:39 crc kubenswrapper[4669]: I1008 21:41:39.587695 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3805932-16ff-49c5-b43a-ee24a36da57b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 21:41:39 crc kubenswrapper[4669]: I1008 21:41:39.587735 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzrz7\" (UniqueName: \"kubernetes.io/projected/f3805932-16ff-49c5-b43a-ee24a36da57b-kube-api-access-xzrz7\") on node \"crc\" DevicePath \"\"" Oct 08 21:41:39 crc kubenswrapper[4669]: I1008 21:41:39.587750 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3805932-16ff-49c5-b43a-ee24a36da57b-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 21:41:39 crc kubenswrapper[4669]: I1008 21:41:39.825775 4669 generic.go:334] "Generic (PLEG): container finished" 
podID="f3805932-16ff-49c5-b43a-ee24a36da57b" containerID="17ac764065ba4c92f423c2fb506763ed055f824a42362bc2fb60d38be65f8052" exitCode=0 Oct 08 21:41:39 crc kubenswrapper[4669]: I1008 21:41:39.825834 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qqsvf" Oct 08 21:41:39 crc kubenswrapper[4669]: I1008 21:41:39.825851 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qqsvf" event={"ID":"f3805932-16ff-49c5-b43a-ee24a36da57b","Type":"ContainerDied","Data":"17ac764065ba4c92f423c2fb506763ed055f824a42362bc2fb60d38be65f8052"} Oct 08 21:41:39 crc kubenswrapper[4669]: I1008 21:41:39.826064 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qqsvf" event={"ID":"f3805932-16ff-49c5-b43a-ee24a36da57b","Type":"ContainerDied","Data":"48d6c1f29ff56629efc38eda8e712e19623cd272fddec0e8ddf0dc013d54053e"} Oct 08 21:41:39 crc kubenswrapper[4669]: I1008 21:41:39.826093 4669 scope.go:117] "RemoveContainer" containerID="17ac764065ba4c92f423c2fb506763ed055f824a42362bc2fb60d38be65f8052" Oct 08 21:41:39 crc kubenswrapper[4669]: I1008 21:41:39.842289 4669 scope.go:117] "RemoveContainer" containerID="a46b797804394bf994a321a92fd895cebe364889717cf9cc2148e30fbc025d18" Oct 08 21:41:39 crc kubenswrapper[4669]: I1008 21:41:39.859946 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qqsvf"] Oct 08 21:41:39 crc kubenswrapper[4669]: I1008 21:41:39.873687 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qqsvf"] Oct 08 21:41:39 crc kubenswrapper[4669]: I1008 21:41:39.880880 4669 scope.go:117] "RemoveContainer" containerID="4617d82fcfa6650efc8c910355ed1a2a782b3ed687918556cda741d2e69e32db" Oct 08 21:41:39 crc kubenswrapper[4669]: I1008 21:41:39.917771 4669 scope.go:117] "RemoveContainer" 
containerID="17ac764065ba4c92f423c2fb506763ed055f824a42362bc2fb60d38be65f8052" Oct 08 21:41:39 crc kubenswrapper[4669]: E1008 21:41:39.918175 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17ac764065ba4c92f423c2fb506763ed055f824a42362bc2fb60d38be65f8052\": container with ID starting with 17ac764065ba4c92f423c2fb506763ed055f824a42362bc2fb60d38be65f8052 not found: ID does not exist" containerID="17ac764065ba4c92f423c2fb506763ed055f824a42362bc2fb60d38be65f8052" Oct 08 21:41:39 crc kubenswrapper[4669]: I1008 21:41:39.918206 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17ac764065ba4c92f423c2fb506763ed055f824a42362bc2fb60d38be65f8052"} err="failed to get container status \"17ac764065ba4c92f423c2fb506763ed055f824a42362bc2fb60d38be65f8052\": rpc error: code = NotFound desc = could not find container \"17ac764065ba4c92f423c2fb506763ed055f824a42362bc2fb60d38be65f8052\": container with ID starting with 17ac764065ba4c92f423c2fb506763ed055f824a42362bc2fb60d38be65f8052 not found: ID does not exist" Oct 08 21:41:39 crc kubenswrapper[4669]: I1008 21:41:39.918227 4669 scope.go:117] "RemoveContainer" containerID="a46b797804394bf994a321a92fd895cebe364889717cf9cc2148e30fbc025d18" Oct 08 21:41:39 crc kubenswrapper[4669]: E1008 21:41:39.918489 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a46b797804394bf994a321a92fd895cebe364889717cf9cc2148e30fbc025d18\": container with ID starting with a46b797804394bf994a321a92fd895cebe364889717cf9cc2148e30fbc025d18 not found: ID does not exist" containerID="a46b797804394bf994a321a92fd895cebe364889717cf9cc2148e30fbc025d18" Oct 08 21:41:39 crc kubenswrapper[4669]: I1008 21:41:39.918516 4669 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a46b797804394bf994a321a92fd895cebe364889717cf9cc2148e30fbc025d18"} err="failed to get container status \"a46b797804394bf994a321a92fd895cebe364889717cf9cc2148e30fbc025d18\": rpc error: code = NotFound desc = could not find container \"a46b797804394bf994a321a92fd895cebe364889717cf9cc2148e30fbc025d18\": container with ID starting with a46b797804394bf994a321a92fd895cebe364889717cf9cc2148e30fbc025d18 not found: ID does not exist" Oct 08 21:41:39 crc kubenswrapper[4669]: I1008 21:41:39.918545 4669 scope.go:117] "RemoveContainer" containerID="4617d82fcfa6650efc8c910355ed1a2a782b3ed687918556cda741d2e69e32db" Oct 08 21:41:39 crc kubenswrapper[4669]: E1008 21:41:39.918760 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4617d82fcfa6650efc8c910355ed1a2a782b3ed687918556cda741d2e69e32db\": container with ID starting with 4617d82fcfa6650efc8c910355ed1a2a782b3ed687918556cda741d2e69e32db not found: ID does not exist" containerID="4617d82fcfa6650efc8c910355ed1a2a782b3ed687918556cda741d2e69e32db" Oct 08 21:41:39 crc kubenswrapper[4669]: I1008 21:41:39.918782 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4617d82fcfa6650efc8c910355ed1a2a782b3ed687918556cda741d2e69e32db"} err="failed to get container status \"4617d82fcfa6650efc8c910355ed1a2a782b3ed687918556cda741d2e69e32db\": rpc error: code = NotFound desc = could not find container \"4617d82fcfa6650efc8c910355ed1a2a782b3ed687918556cda741d2e69e32db\": container with ID starting with 4617d82fcfa6650efc8c910355ed1a2a782b3ed687918556cda741d2e69e32db not found: ID does not exist" Oct 08 21:41:41 crc kubenswrapper[4669]: I1008 21:41:41.347304 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3805932-16ff-49c5-b43a-ee24a36da57b" path="/var/lib/kubelet/pods/f3805932-16ff-49c5-b43a-ee24a36da57b/volumes" Oct 08 21:41:43 crc kubenswrapper[4669]: I1008 
21:41:43.355196 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-96vxr"] Oct 08 21:41:43 crc kubenswrapper[4669]: E1008 21:41:43.355893 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3805932-16ff-49c5-b43a-ee24a36da57b" containerName="registry-server" Oct 08 21:41:43 crc kubenswrapper[4669]: I1008 21:41:43.355912 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3805932-16ff-49c5-b43a-ee24a36da57b" containerName="registry-server" Oct 08 21:41:43 crc kubenswrapper[4669]: E1008 21:41:43.355932 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3805932-16ff-49c5-b43a-ee24a36da57b" containerName="extract-content" Oct 08 21:41:43 crc kubenswrapper[4669]: I1008 21:41:43.355943 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3805932-16ff-49c5-b43a-ee24a36da57b" containerName="extract-content" Oct 08 21:41:43 crc kubenswrapper[4669]: E1008 21:41:43.355968 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3805932-16ff-49c5-b43a-ee24a36da57b" containerName="extract-utilities" Oct 08 21:41:43 crc kubenswrapper[4669]: I1008 21:41:43.355977 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3805932-16ff-49c5-b43a-ee24a36da57b" containerName="extract-utilities" Oct 08 21:41:43 crc kubenswrapper[4669]: I1008 21:41:43.356236 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3805932-16ff-49c5-b43a-ee24a36da57b" containerName="registry-server" Oct 08 21:41:43 crc kubenswrapper[4669]: I1008 21:41:43.357880 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-96vxr"] Oct 08 21:41:43 crc kubenswrapper[4669]: I1008 21:41:43.357987 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-96vxr" Oct 08 21:41:43 crc kubenswrapper[4669]: I1008 21:41:43.558302 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/402defc1-2b91-4a90-bb4c-7b03eb3e8236-catalog-content\") pod \"community-operators-96vxr\" (UID: \"402defc1-2b91-4a90-bb4c-7b03eb3e8236\") " pod="openshift-marketplace/community-operators-96vxr" Oct 08 21:41:43 crc kubenswrapper[4669]: I1008 21:41:43.559040 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/402defc1-2b91-4a90-bb4c-7b03eb3e8236-utilities\") pod \"community-operators-96vxr\" (UID: \"402defc1-2b91-4a90-bb4c-7b03eb3e8236\") " pod="openshift-marketplace/community-operators-96vxr" Oct 08 21:41:43 crc kubenswrapper[4669]: I1008 21:41:43.559168 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhscr\" (UniqueName: \"kubernetes.io/projected/402defc1-2b91-4a90-bb4c-7b03eb3e8236-kube-api-access-dhscr\") pod \"community-operators-96vxr\" (UID: \"402defc1-2b91-4a90-bb4c-7b03eb3e8236\") " pod="openshift-marketplace/community-operators-96vxr" Oct 08 21:41:43 crc kubenswrapper[4669]: I1008 21:41:43.661066 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhscr\" (UniqueName: \"kubernetes.io/projected/402defc1-2b91-4a90-bb4c-7b03eb3e8236-kube-api-access-dhscr\") pod \"community-operators-96vxr\" (UID: \"402defc1-2b91-4a90-bb4c-7b03eb3e8236\") " pod="openshift-marketplace/community-operators-96vxr" Oct 08 21:41:43 crc kubenswrapper[4669]: I1008 21:41:43.661213 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/402defc1-2b91-4a90-bb4c-7b03eb3e8236-catalog-content\") pod 
\"community-operators-96vxr\" (UID: \"402defc1-2b91-4a90-bb4c-7b03eb3e8236\") " pod="openshift-marketplace/community-operators-96vxr" Oct 08 21:41:43 crc kubenswrapper[4669]: I1008 21:41:43.661250 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/402defc1-2b91-4a90-bb4c-7b03eb3e8236-utilities\") pod \"community-operators-96vxr\" (UID: \"402defc1-2b91-4a90-bb4c-7b03eb3e8236\") " pod="openshift-marketplace/community-operators-96vxr" Oct 08 21:41:43 crc kubenswrapper[4669]: I1008 21:41:43.661964 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/402defc1-2b91-4a90-bb4c-7b03eb3e8236-utilities\") pod \"community-operators-96vxr\" (UID: \"402defc1-2b91-4a90-bb4c-7b03eb3e8236\") " pod="openshift-marketplace/community-operators-96vxr" Oct 08 21:41:43 crc kubenswrapper[4669]: I1008 21:41:43.662037 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/402defc1-2b91-4a90-bb4c-7b03eb3e8236-catalog-content\") pod \"community-operators-96vxr\" (UID: \"402defc1-2b91-4a90-bb4c-7b03eb3e8236\") " pod="openshift-marketplace/community-operators-96vxr" Oct 08 21:41:43 crc kubenswrapper[4669]: I1008 21:41:43.683184 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhscr\" (UniqueName: \"kubernetes.io/projected/402defc1-2b91-4a90-bb4c-7b03eb3e8236-kube-api-access-dhscr\") pod \"community-operators-96vxr\" (UID: \"402defc1-2b91-4a90-bb4c-7b03eb3e8236\") " pod="openshift-marketplace/community-operators-96vxr" Oct 08 21:41:43 crc kubenswrapper[4669]: I1008 21:41:43.702413 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-96vxr" Oct 08 21:41:44 crc kubenswrapper[4669]: I1008 21:41:44.194414 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-96vxr"] Oct 08 21:41:44 crc kubenswrapper[4669]: I1008 21:41:44.887307 4669 generic.go:334] "Generic (PLEG): container finished" podID="402defc1-2b91-4a90-bb4c-7b03eb3e8236" containerID="6d6dd72a239a50a8a43e94d864b7bbae28de95599474eb2e1c786c36dbbc77c0" exitCode=0 Oct 08 21:41:44 crc kubenswrapper[4669]: I1008 21:41:44.887356 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-96vxr" event={"ID":"402defc1-2b91-4a90-bb4c-7b03eb3e8236","Type":"ContainerDied","Data":"6d6dd72a239a50a8a43e94d864b7bbae28de95599474eb2e1c786c36dbbc77c0"} Oct 08 21:41:44 crc kubenswrapper[4669]: I1008 21:41:44.887388 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-96vxr" event={"ID":"402defc1-2b91-4a90-bb4c-7b03eb3e8236","Type":"ContainerStarted","Data":"040f44592354b54bb6250633468e46f2727791481113a83441e538a4f4193cd3"} Oct 08 21:41:45 crc kubenswrapper[4669]: I1008 21:41:45.898123 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-96vxr" event={"ID":"402defc1-2b91-4a90-bb4c-7b03eb3e8236","Type":"ContainerStarted","Data":"0b51b54b45e1bc7bdd940ae0c34f767d0a7a30f982e0944512e20ffd86938222"} Oct 08 21:41:46 crc kubenswrapper[4669]: I1008 21:41:46.918006 4669 generic.go:334] "Generic (PLEG): container finished" podID="402defc1-2b91-4a90-bb4c-7b03eb3e8236" containerID="0b51b54b45e1bc7bdd940ae0c34f767d0a7a30f982e0944512e20ffd86938222" exitCode=0 Oct 08 21:41:46 crc kubenswrapper[4669]: I1008 21:41:46.918070 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-96vxr" 
event={"ID":"402defc1-2b91-4a90-bb4c-7b03eb3e8236","Type":"ContainerDied","Data":"0b51b54b45e1bc7bdd940ae0c34f767d0a7a30f982e0944512e20ffd86938222"} Oct 08 21:41:47 crc kubenswrapper[4669]: I1008 21:41:47.930291 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-96vxr" event={"ID":"402defc1-2b91-4a90-bb4c-7b03eb3e8236","Type":"ContainerStarted","Data":"ed6a9ec801a27ffe6aa97f2efd0f3bf3a27091416bcf51e53c95a093307bf8c0"} Oct 08 21:41:47 crc kubenswrapper[4669]: I1008 21:41:47.953228 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-96vxr" podStartSLOduration=2.338829052 podStartE2EDuration="4.953207657s" podCreationTimestamp="2025-10-08 21:41:43 +0000 UTC" firstStartedPulling="2025-10-08 21:41:44.890315601 +0000 UTC m=+3424.583126284" lastFinishedPulling="2025-10-08 21:41:47.504694216 +0000 UTC m=+3427.197504889" observedRunningTime="2025-10-08 21:41:47.946060416 +0000 UTC m=+3427.638871129" watchObservedRunningTime="2025-10-08 21:41:47.953207657 +0000 UTC m=+3427.646018330" Oct 08 21:41:53 crc kubenswrapper[4669]: I1008 21:41:53.703420 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-96vxr" Oct 08 21:41:53 crc kubenswrapper[4669]: I1008 21:41:53.704115 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-96vxr" Oct 08 21:41:53 crc kubenswrapper[4669]: I1008 21:41:53.774321 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-96vxr" Oct 08 21:41:54 crc kubenswrapper[4669]: I1008 21:41:54.054002 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-96vxr" Oct 08 21:41:54 crc kubenswrapper[4669]: I1008 21:41:54.094420 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-96vxr"] Oct 08 21:41:56 crc kubenswrapper[4669]: I1008 21:41:56.031784 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-96vxr" podUID="402defc1-2b91-4a90-bb4c-7b03eb3e8236" containerName="registry-server" containerID="cri-o://ed6a9ec801a27ffe6aa97f2efd0f3bf3a27091416bcf51e53c95a093307bf8c0" gracePeriod=2 Oct 08 21:41:57 crc kubenswrapper[4669]: I1008 21:41:57.045756 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-96vxr" Oct 08 21:41:57 crc kubenswrapper[4669]: I1008 21:41:57.049422 4669 generic.go:334] "Generic (PLEG): container finished" podID="402defc1-2b91-4a90-bb4c-7b03eb3e8236" containerID="ed6a9ec801a27ffe6aa97f2efd0f3bf3a27091416bcf51e53c95a093307bf8c0" exitCode=0 Oct 08 21:41:57 crc kubenswrapper[4669]: I1008 21:41:57.049476 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-96vxr" event={"ID":"402defc1-2b91-4a90-bb4c-7b03eb3e8236","Type":"ContainerDied","Data":"ed6a9ec801a27ffe6aa97f2efd0f3bf3a27091416bcf51e53c95a093307bf8c0"} Oct 08 21:41:57 crc kubenswrapper[4669]: I1008 21:41:57.049506 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-96vxr" event={"ID":"402defc1-2b91-4a90-bb4c-7b03eb3e8236","Type":"ContainerDied","Data":"040f44592354b54bb6250633468e46f2727791481113a83441e538a4f4193cd3"} Oct 08 21:41:57 crc kubenswrapper[4669]: I1008 21:41:57.049547 4669 scope.go:117] "RemoveContainer" containerID="ed6a9ec801a27ffe6aa97f2efd0f3bf3a27091416bcf51e53c95a093307bf8c0" Oct 08 21:41:57 crc kubenswrapper[4669]: I1008 21:41:57.070508 4669 scope.go:117] "RemoveContainer" containerID="0b51b54b45e1bc7bdd940ae0c34f767d0a7a30f982e0944512e20ffd86938222" Oct 08 21:41:57 crc kubenswrapper[4669]: I1008 21:41:57.125849 4669 scope.go:117] "RemoveContainer" 
containerID="6d6dd72a239a50a8a43e94d864b7bbae28de95599474eb2e1c786c36dbbc77c0" Oct 08 21:41:57 crc kubenswrapper[4669]: I1008 21:41:57.138906 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/402defc1-2b91-4a90-bb4c-7b03eb3e8236-utilities\") pod \"402defc1-2b91-4a90-bb4c-7b03eb3e8236\" (UID: \"402defc1-2b91-4a90-bb4c-7b03eb3e8236\") " Oct 08 21:41:57 crc kubenswrapper[4669]: I1008 21:41:57.140127 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/402defc1-2b91-4a90-bb4c-7b03eb3e8236-catalog-content\") pod \"402defc1-2b91-4a90-bb4c-7b03eb3e8236\" (UID: \"402defc1-2b91-4a90-bb4c-7b03eb3e8236\") " Oct 08 21:41:57 crc kubenswrapper[4669]: I1008 21:41:57.140274 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhscr\" (UniqueName: \"kubernetes.io/projected/402defc1-2b91-4a90-bb4c-7b03eb3e8236-kube-api-access-dhscr\") pod \"402defc1-2b91-4a90-bb4c-7b03eb3e8236\" (UID: \"402defc1-2b91-4a90-bb4c-7b03eb3e8236\") " Oct 08 21:41:57 crc kubenswrapper[4669]: I1008 21:41:57.141190 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/402defc1-2b91-4a90-bb4c-7b03eb3e8236-utilities" (OuterVolumeSpecName: "utilities") pod "402defc1-2b91-4a90-bb4c-7b03eb3e8236" (UID: "402defc1-2b91-4a90-bb4c-7b03eb3e8236"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:41:57 crc kubenswrapper[4669]: I1008 21:41:57.141362 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/402defc1-2b91-4a90-bb4c-7b03eb3e8236-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 21:41:57 crc kubenswrapper[4669]: I1008 21:41:57.146416 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/402defc1-2b91-4a90-bb4c-7b03eb3e8236-kube-api-access-dhscr" (OuterVolumeSpecName: "kube-api-access-dhscr") pod "402defc1-2b91-4a90-bb4c-7b03eb3e8236" (UID: "402defc1-2b91-4a90-bb4c-7b03eb3e8236"). InnerVolumeSpecName "kube-api-access-dhscr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:41:57 crc kubenswrapper[4669]: I1008 21:41:57.195419 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/402defc1-2b91-4a90-bb4c-7b03eb3e8236-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "402defc1-2b91-4a90-bb4c-7b03eb3e8236" (UID: "402defc1-2b91-4a90-bb4c-7b03eb3e8236"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:41:57 crc kubenswrapper[4669]: I1008 21:41:57.217732 4669 scope.go:117] "RemoveContainer" containerID="ed6a9ec801a27ffe6aa97f2efd0f3bf3a27091416bcf51e53c95a093307bf8c0" Oct 08 21:41:57 crc kubenswrapper[4669]: E1008 21:41:57.218134 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed6a9ec801a27ffe6aa97f2efd0f3bf3a27091416bcf51e53c95a093307bf8c0\": container with ID starting with ed6a9ec801a27ffe6aa97f2efd0f3bf3a27091416bcf51e53c95a093307bf8c0 not found: ID does not exist" containerID="ed6a9ec801a27ffe6aa97f2efd0f3bf3a27091416bcf51e53c95a093307bf8c0" Oct 08 21:41:57 crc kubenswrapper[4669]: I1008 21:41:57.218166 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed6a9ec801a27ffe6aa97f2efd0f3bf3a27091416bcf51e53c95a093307bf8c0"} err="failed to get container status \"ed6a9ec801a27ffe6aa97f2efd0f3bf3a27091416bcf51e53c95a093307bf8c0\": rpc error: code = NotFound desc = could not find container \"ed6a9ec801a27ffe6aa97f2efd0f3bf3a27091416bcf51e53c95a093307bf8c0\": container with ID starting with ed6a9ec801a27ffe6aa97f2efd0f3bf3a27091416bcf51e53c95a093307bf8c0 not found: ID does not exist" Oct 08 21:41:57 crc kubenswrapper[4669]: I1008 21:41:57.218188 4669 scope.go:117] "RemoveContainer" containerID="0b51b54b45e1bc7bdd940ae0c34f767d0a7a30f982e0944512e20ffd86938222" Oct 08 21:41:57 crc kubenswrapper[4669]: E1008 21:41:57.218517 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b51b54b45e1bc7bdd940ae0c34f767d0a7a30f982e0944512e20ffd86938222\": container with ID starting with 0b51b54b45e1bc7bdd940ae0c34f767d0a7a30f982e0944512e20ffd86938222 not found: ID does not exist" containerID="0b51b54b45e1bc7bdd940ae0c34f767d0a7a30f982e0944512e20ffd86938222" Oct 08 21:41:57 crc kubenswrapper[4669]: I1008 21:41:57.218563 
4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b51b54b45e1bc7bdd940ae0c34f767d0a7a30f982e0944512e20ffd86938222"} err="failed to get container status \"0b51b54b45e1bc7bdd940ae0c34f767d0a7a30f982e0944512e20ffd86938222\": rpc error: code = NotFound desc = could not find container \"0b51b54b45e1bc7bdd940ae0c34f767d0a7a30f982e0944512e20ffd86938222\": container with ID starting with 0b51b54b45e1bc7bdd940ae0c34f767d0a7a30f982e0944512e20ffd86938222 not found: ID does not exist" Oct 08 21:41:57 crc kubenswrapper[4669]: I1008 21:41:57.218578 4669 scope.go:117] "RemoveContainer" containerID="6d6dd72a239a50a8a43e94d864b7bbae28de95599474eb2e1c786c36dbbc77c0" Oct 08 21:41:57 crc kubenswrapper[4669]: E1008 21:41:57.218817 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d6dd72a239a50a8a43e94d864b7bbae28de95599474eb2e1c786c36dbbc77c0\": container with ID starting with 6d6dd72a239a50a8a43e94d864b7bbae28de95599474eb2e1c786c36dbbc77c0 not found: ID does not exist" containerID="6d6dd72a239a50a8a43e94d864b7bbae28de95599474eb2e1c786c36dbbc77c0" Oct 08 21:41:57 crc kubenswrapper[4669]: I1008 21:41:57.218837 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d6dd72a239a50a8a43e94d864b7bbae28de95599474eb2e1c786c36dbbc77c0"} err="failed to get container status \"6d6dd72a239a50a8a43e94d864b7bbae28de95599474eb2e1c786c36dbbc77c0\": rpc error: code = NotFound desc = could not find container \"6d6dd72a239a50a8a43e94d864b7bbae28de95599474eb2e1c786c36dbbc77c0\": container with ID starting with 6d6dd72a239a50a8a43e94d864b7bbae28de95599474eb2e1c786c36dbbc77c0 not found: ID does not exist" Oct 08 21:41:57 crc kubenswrapper[4669]: I1008 21:41:57.242029 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/402defc1-2b91-4a90-bb4c-7b03eb3e8236-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 21:41:57 crc kubenswrapper[4669]: I1008 21:41:57.242058 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhscr\" (UniqueName: \"kubernetes.io/projected/402defc1-2b91-4a90-bb4c-7b03eb3e8236-kube-api-access-dhscr\") on node \"crc\" DevicePath \"\"" Oct 08 21:41:58 crc kubenswrapper[4669]: I1008 21:41:58.058610 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-96vxr" Oct 08 21:41:58 crc kubenswrapper[4669]: I1008 21:41:58.090611 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-96vxr"] Oct 08 21:41:58 crc kubenswrapper[4669]: I1008 21:41:58.099861 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-96vxr"] Oct 08 21:41:59 crc kubenswrapper[4669]: I1008 21:41:59.350260 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="402defc1-2b91-4a90-bb4c-7b03eb3e8236" path="/var/lib/kubelet/pods/402defc1-2b91-4a90-bb4c-7b03eb3e8236/volumes" Oct 08 21:42:19 crc kubenswrapper[4669]: I1008 21:42:19.260982 4669 generic.go:334] "Generic (PLEG): container finished" podID="ad5f7082-536e-477e-a8a3-b5c4945b3b87" containerID="c52152722adfd5b12164a01e39f1796d9e063607aa931b5573ab7efa38e61396" exitCode=0 Oct 08 21:42:19 crc kubenswrapper[4669]: I1008 21:42:19.261112 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"ad5f7082-536e-477e-a8a3-b5c4945b3b87","Type":"ContainerDied","Data":"c52152722adfd5b12164a01e39f1796d9e063607aa931b5573ab7efa38e61396"} Oct 08 21:42:20 crc kubenswrapper[4669]: I1008 21:42:20.734696 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 08 21:42:20 crc kubenswrapper[4669]: I1008 21:42:20.931640 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad5f7082-536e-477e-a8a3-b5c4945b3b87-config-data\") pod \"ad5f7082-536e-477e-a8a3-b5c4945b3b87\" (UID: \"ad5f7082-536e-477e-a8a3-b5c4945b3b87\") " Oct 08 21:42:20 crc kubenswrapper[4669]: I1008 21:42:20.932047 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kp7gr\" (UniqueName: \"kubernetes.io/projected/ad5f7082-536e-477e-a8a3-b5c4945b3b87-kube-api-access-kp7gr\") pod \"ad5f7082-536e-477e-a8a3-b5c4945b3b87\" (UID: \"ad5f7082-536e-477e-a8a3-b5c4945b3b87\") " Oct 08 21:42:20 crc kubenswrapper[4669]: I1008 21:42:20.932076 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ad5f7082-536e-477e-a8a3-b5c4945b3b87-test-operator-ephemeral-temporary\") pod \"ad5f7082-536e-477e-a8a3-b5c4945b3b87\" (UID: \"ad5f7082-536e-477e-a8a3-b5c4945b3b87\") " Oct 08 21:42:20 crc kubenswrapper[4669]: I1008 21:42:20.932109 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ad5f7082-536e-477e-a8a3-b5c4945b3b87-test-operator-ephemeral-workdir\") pod \"ad5f7082-536e-477e-a8a3-b5c4945b3b87\" (UID: \"ad5f7082-536e-477e-a8a3-b5c4945b3b87\") " Oct 08 21:42:20 crc kubenswrapper[4669]: I1008 21:42:20.932141 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ad5f7082-536e-477e-a8a3-b5c4945b3b87-openstack-config-secret\") pod \"ad5f7082-536e-477e-a8a3-b5c4945b3b87\" (UID: \"ad5f7082-536e-477e-a8a3-b5c4945b3b87\") " Oct 08 21:42:20 crc kubenswrapper[4669]: I1008 21:42:20.932170 4669 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ad5f7082-536e-477e-a8a3-b5c4945b3b87-openstack-config\") pod \"ad5f7082-536e-477e-a8a3-b5c4945b3b87\" (UID: \"ad5f7082-536e-477e-a8a3-b5c4945b3b87\") " Oct 08 21:42:20 crc kubenswrapper[4669]: I1008 21:42:20.932233 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ad5f7082-536e-477e-a8a3-b5c4945b3b87-ca-certs\") pod \"ad5f7082-536e-477e-a8a3-b5c4945b3b87\" (UID: \"ad5f7082-536e-477e-a8a3-b5c4945b3b87\") " Oct 08 21:42:20 crc kubenswrapper[4669]: I1008 21:42:20.932297 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ad5f7082-536e-477e-a8a3-b5c4945b3b87\" (UID: \"ad5f7082-536e-477e-a8a3-b5c4945b3b87\") " Oct 08 21:42:20 crc kubenswrapper[4669]: I1008 21:42:20.932353 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ad5f7082-536e-477e-a8a3-b5c4945b3b87-ssh-key\") pod \"ad5f7082-536e-477e-a8a3-b5c4945b3b87\" (UID: \"ad5f7082-536e-477e-a8a3-b5c4945b3b87\") " Oct 08 21:42:20 crc kubenswrapper[4669]: I1008 21:42:20.932760 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad5f7082-536e-477e-a8a3-b5c4945b3b87-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "ad5f7082-536e-477e-a8a3-b5c4945b3b87" (UID: "ad5f7082-536e-477e-a8a3-b5c4945b3b87"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:42:20 crc kubenswrapper[4669]: I1008 21:42:20.933061 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad5f7082-536e-477e-a8a3-b5c4945b3b87-config-data" (OuterVolumeSpecName: "config-data") pod "ad5f7082-536e-477e-a8a3-b5c4945b3b87" (UID: "ad5f7082-536e-477e-a8a3-b5c4945b3b87"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:42:20 crc kubenswrapper[4669]: I1008 21:42:20.937729 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad5f7082-536e-477e-a8a3-b5c4945b3b87-kube-api-access-kp7gr" (OuterVolumeSpecName: "kube-api-access-kp7gr") pod "ad5f7082-536e-477e-a8a3-b5c4945b3b87" (UID: "ad5f7082-536e-477e-a8a3-b5c4945b3b87"). InnerVolumeSpecName "kube-api-access-kp7gr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:42:20 crc kubenswrapper[4669]: I1008 21:42:20.938613 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad5f7082-536e-477e-a8a3-b5c4945b3b87-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "ad5f7082-536e-477e-a8a3-b5c4945b3b87" (UID: "ad5f7082-536e-477e-a8a3-b5c4945b3b87"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:42:20 crc kubenswrapper[4669]: I1008 21:42:20.939154 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "test-operator-logs") pod "ad5f7082-536e-477e-a8a3-b5c4945b3b87" (UID: "ad5f7082-536e-477e-a8a3-b5c4945b3b87"). InnerVolumeSpecName "local-storage10-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 08 21:42:20 crc kubenswrapper[4669]: I1008 21:42:20.960365 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad5f7082-536e-477e-a8a3-b5c4945b3b87-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "ad5f7082-536e-477e-a8a3-b5c4945b3b87" (UID: "ad5f7082-536e-477e-a8a3-b5c4945b3b87"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:42:20 crc kubenswrapper[4669]: I1008 21:42:20.962683 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad5f7082-536e-477e-a8a3-b5c4945b3b87-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "ad5f7082-536e-477e-a8a3-b5c4945b3b87" (UID: "ad5f7082-536e-477e-a8a3-b5c4945b3b87"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:42:20 crc kubenswrapper[4669]: I1008 21:42:20.963029 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad5f7082-536e-477e-a8a3-b5c4945b3b87-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ad5f7082-536e-477e-a8a3-b5c4945b3b87" (UID: "ad5f7082-536e-477e-a8a3-b5c4945b3b87"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:42:20 crc kubenswrapper[4669]: I1008 21:42:20.993082 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad5f7082-536e-477e-a8a3-b5c4945b3b87-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "ad5f7082-536e-477e-a8a3-b5c4945b3b87" (UID: "ad5f7082-536e-477e-a8a3-b5c4945b3b87"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:42:21 crc kubenswrapper[4669]: I1008 21:42:21.035232 4669 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ad5f7082-536e-477e-a8a3-b5c4945b3b87-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Oct 08 21:42:21 crc kubenswrapper[4669]: I1008 21:42:21.035268 4669 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ad5f7082-536e-477e-a8a3-b5c4945b3b87-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 08 21:42:21 crc kubenswrapper[4669]: I1008 21:42:21.035284 4669 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ad5f7082-536e-477e-a8a3-b5c4945b3b87-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 08 21:42:21 crc kubenswrapper[4669]: I1008 21:42:21.035296 4669 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ad5f7082-536e-477e-a8a3-b5c4945b3b87-ca-certs\") on node \"crc\" DevicePath \"\"" Oct 08 21:42:21 crc kubenswrapper[4669]: I1008 21:42:21.035329 4669 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Oct 08 21:42:21 crc kubenswrapper[4669]: I1008 21:42:21.035339 4669 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ad5f7082-536e-477e-a8a3-b5c4945b3b87-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 08 21:42:21 crc kubenswrapper[4669]: I1008 21:42:21.035348 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad5f7082-536e-477e-a8a3-b5c4945b3b87-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 21:42:21 crc kubenswrapper[4669]: I1008 21:42:21.035358 4669 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kp7gr\" (UniqueName: \"kubernetes.io/projected/ad5f7082-536e-477e-a8a3-b5c4945b3b87-kube-api-access-kp7gr\") on node \"crc\" DevicePath \"\"" Oct 08 21:42:21 crc kubenswrapper[4669]: I1008 21:42:21.035367 4669 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ad5f7082-536e-477e-a8a3-b5c4945b3b87-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Oct 08 21:42:21 crc kubenswrapper[4669]: I1008 21:42:21.055383 4669 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Oct 08 21:42:21 crc kubenswrapper[4669]: I1008 21:42:21.137809 4669 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Oct 08 21:42:21 crc kubenswrapper[4669]: I1008 21:42:21.282939 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"ad5f7082-536e-477e-a8a3-b5c4945b3b87","Type":"ContainerDied","Data":"943e2bf7a282d4a702eb7af90a696d2278dc5dd951a724a48c2b1932ca722fd8"} Oct 08 21:42:21 crc kubenswrapper[4669]: I1008 21:42:21.282979 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="943e2bf7a282d4a702eb7af90a696d2278dc5dd951a724a48c2b1932ca722fd8" Oct 08 21:42:21 crc kubenswrapper[4669]: I1008 21:42:21.283364 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 08 21:42:26 crc kubenswrapper[4669]: I1008 21:42:26.162853 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 08 21:42:26 crc kubenswrapper[4669]: E1008 21:42:26.163787 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad5f7082-536e-477e-a8a3-b5c4945b3b87" containerName="tempest-tests-tempest-tests-runner" Oct 08 21:42:26 crc kubenswrapper[4669]: I1008 21:42:26.163799 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad5f7082-536e-477e-a8a3-b5c4945b3b87" containerName="tempest-tests-tempest-tests-runner" Oct 08 21:42:26 crc kubenswrapper[4669]: E1008 21:42:26.163829 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="402defc1-2b91-4a90-bb4c-7b03eb3e8236" containerName="extract-utilities" Oct 08 21:42:26 crc kubenswrapper[4669]: I1008 21:42:26.163836 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="402defc1-2b91-4a90-bb4c-7b03eb3e8236" containerName="extract-utilities" Oct 08 21:42:26 crc kubenswrapper[4669]: E1008 21:42:26.163862 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="402defc1-2b91-4a90-bb4c-7b03eb3e8236" containerName="registry-server" Oct 08 21:42:26 crc kubenswrapper[4669]: I1008 21:42:26.163868 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="402defc1-2b91-4a90-bb4c-7b03eb3e8236" containerName="registry-server" Oct 08 21:42:26 crc kubenswrapper[4669]: E1008 21:42:26.163885 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="402defc1-2b91-4a90-bb4c-7b03eb3e8236" containerName="extract-content" Oct 08 21:42:26 crc kubenswrapper[4669]: I1008 21:42:26.163890 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="402defc1-2b91-4a90-bb4c-7b03eb3e8236" containerName="extract-content" Oct 08 21:42:26 crc kubenswrapper[4669]: I1008 21:42:26.164065 4669 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="402defc1-2b91-4a90-bb4c-7b03eb3e8236" containerName="registry-server" Oct 08 21:42:26 crc kubenswrapper[4669]: I1008 21:42:26.164084 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad5f7082-536e-477e-a8a3-b5c4945b3b87" containerName="tempest-tests-tempest-tests-runner" Oct 08 21:42:26 crc kubenswrapper[4669]: I1008 21:42:26.164761 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 08 21:42:26 crc kubenswrapper[4669]: I1008 21:42:26.167477 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-crlhk" Oct 08 21:42:26 crc kubenswrapper[4669]: I1008 21:42:26.183587 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 08 21:42:26 crc kubenswrapper[4669]: I1008 21:42:26.329824 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz8sl\" (UniqueName: \"kubernetes.io/projected/919e000b-619f-4e18-b6f6-4473d23718c9-kube-api-access-xz8sl\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"919e000b-619f-4e18-b6f6-4473d23718c9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 08 21:42:26 crc kubenswrapper[4669]: I1008 21:42:26.330054 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"919e000b-619f-4e18-b6f6-4473d23718c9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 08 21:42:26 crc kubenswrapper[4669]: I1008 21:42:26.432037 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"919e000b-619f-4e18-b6f6-4473d23718c9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 08 21:42:26 crc kubenswrapper[4669]: I1008 21:42:26.432219 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz8sl\" (UniqueName: \"kubernetes.io/projected/919e000b-619f-4e18-b6f6-4473d23718c9-kube-api-access-xz8sl\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"919e000b-619f-4e18-b6f6-4473d23718c9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 08 21:42:26 crc kubenswrapper[4669]: I1008 21:42:26.432759 4669 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"919e000b-619f-4e18-b6f6-4473d23718c9\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 08 21:42:26 crc kubenswrapper[4669]: I1008 21:42:26.460771 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz8sl\" (UniqueName: \"kubernetes.io/projected/919e000b-619f-4e18-b6f6-4473d23718c9-kube-api-access-xz8sl\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"919e000b-619f-4e18-b6f6-4473d23718c9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 08 21:42:26 crc kubenswrapper[4669]: I1008 21:42:26.461449 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"919e000b-619f-4e18-b6f6-4473d23718c9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 08 21:42:26 
crc kubenswrapper[4669]: I1008 21:42:26.494842 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 08 21:42:27 crc kubenswrapper[4669]: I1008 21:42:27.162611 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 08 21:42:27 crc kubenswrapper[4669]: I1008 21:42:27.359332 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"919e000b-619f-4e18-b6f6-4473d23718c9","Type":"ContainerStarted","Data":"fac5b18328efdd621eac005f61bc03d830d0e8211da1ad1bed9f6892106ed4ac"} Oct 08 21:42:28 crc kubenswrapper[4669]: I1008 21:42:28.380757 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"919e000b-619f-4e18-b6f6-4473d23718c9","Type":"ContainerStarted","Data":"423ed7f9ea6e377c0b9b483a1f4d360e0af865062677be47ba2fea6ae8492eb9"} Oct 08 21:42:28 crc kubenswrapper[4669]: I1008 21:42:28.405419 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.453688782 podStartE2EDuration="2.405400743s" podCreationTimestamp="2025-10-08 21:42:26 +0000 UTC" firstStartedPulling="2025-10-08 21:42:27.169188748 +0000 UTC m=+3466.861999431" lastFinishedPulling="2025-10-08 21:42:28.120900699 +0000 UTC m=+3467.813711392" observedRunningTime="2025-10-08 21:42:28.397778259 +0000 UTC m=+3468.090588942" watchObservedRunningTime="2025-10-08 21:42:28.405400743 +0000 UTC m=+3468.098211416" Oct 08 21:42:43 crc kubenswrapper[4669]: I1008 21:42:43.184937 4669 patch_prober.go:28] interesting pod/machine-config-daemon-hw2kf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial 
tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 21:42:43 crc kubenswrapper[4669]: I1008 21:42:43.185472 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 21:42:45 crc kubenswrapper[4669]: I1008 21:42:45.157781 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pmrzj/must-gather-jfxtf"] Oct 08 21:42:45 crc kubenswrapper[4669]: I1008 21:42:45.159703 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pmrzj/must-gather-jfxtf" Oct 08 21:42:45 crc kubenswrapper[4669]: I1008 21:42:45.162497 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-pmrzj"/"openshift-service-ca.crt" Oct 08 21:42:45 crc kubenswrapper[4669]: I1008 21:42:45.162552 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-pmrzj"/"default-dockercfg-ns9t4" Oct 08 21:42:45 crc kubenswrapper[4669]: I1008 21:42:45.163438 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-pmrzj"/"kube-root-ca.crt" Oct 08 21:42:45 crc kubenswrapper[4669]: I1008 21:42:45.169303 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pmrzj/must-gather-jfxtf"] Oct 08 21:42:45 crc kubenswrapper[4669]: I1008 21:42:45.317134 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/345c331b-742a-4722-9868-d2cc6cf5e7cb-must-gather-output\") pod \"must-gather-jfxtf\" (UID: \"345c331b-742a-4722-9868-d2cc6cf5e7cb\") " pod="openshift-must-gather-pmrzj/must-gather-jfxtf" Oct 08 21:42:45 crc kubenswrapper[4669]: 
I1008 21:42:45.317484 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ghdd\" (UniqueName: \"kubernetes.io/projected/345c331b-742a-4722-9868-d2cc6cf5e7cb-kube-api-access-2ghdd\") pod \"must-gather-jfxtf\" (UID: \"345c331b-742a-4722-9868-d2cc6cf5e7cb\") " pod="openshift-must-gather-pmrzj/must-gather-jfxtf" Oct 08 21:42:45 crc kubenswrapper[4669]: I1008 21:42:45.420469 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/345c331b-742a-4722-9868-d2cc6cf5e7cb-must-gather-output\") pod \"must-gather-jfxtf\" (UID: \"345c331b-742a-4722-9868-d2cc6cf5e7cb\") " pod="openshift-must-gather-pmrzj/must-gather-jfxtf" Oct 08 21:42:45 crc kubenswrapper[4669]: I1008 21:42:45.420613 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ghdd\" (UniqueName: \"kubernetes.io/projected/345c331b-742a-4722-9868-d2cc6cf5e7cb-kube-api-access-2ghdd\") pod \"must-gather-jfxtf\" (UID: \"345c331b-742a-4722-9868-d2cc6cf5e7cb\") " pod="openshift-must-gather-pmrzj/must-gather-jfxtf" Oct 08 21:42:45 crc kubenswrapper[4669]: I1008 21:42:45.421456 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/345c331b-742a-4722-9868-d2cc6cf5e7cb-must-gather-output\") pod \"must-gather-jfxtf\" (UID: \"345c331b-742a-4722-9868-d2cc6cf5e7cb\") " pod="openshift-must-gather-pmrzj/must-gather-jfxtf" Oct 08 21:42:45 crc kubenswrapper[4669]: I1008 21:42:45.448468 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ghdd\" (UniqueName: \"kubernetes.io/projected/345c331b-742a-4722-9868-d2cc6cf5e7cb-kube-api-access-2ghdd\") pod \"must-gather-jfxtf\" (UID: \"345c331b-742a-4722-9868-d2cc6cf5e7cb\") " pod="openshift-must-gather-pmrzj/must-gather-jfxtf" Oct 08 21:42:45 crc kubenswrapper[4669]: I1008 
21:42:45.482203 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pmrzj/must-gather-jfxtf" Oct 08 21:42:45 crc kubenswrapper[4669]: I1008 21:42:45.947412 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pmrzj/must-gather-jfxtf"] Oct 08 21:42:45 crc kubenswrapper[4669]: I1008 21:42:45.951430 4669 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 21:42:46 crc kubenswrapper[4669]: I1008 21:42:46.582744 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pmrzj/must-gather-jfxtf" event={"ID":"345c331b-742a-4722-9868-d2cc6cf5e7cb","Type":"ContainerStarted","Data":"a1bb61d6cf97c130fdbf0db7270f916b80e99f06bd17c8150f2b005b1aee7907"} Oct 08 21:42:50 crc kubenswrapper[4669]: I1008 21:42:50.618959 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pmrzj/must-gather-jfxtf" event={"ID":"345c331b-742a-4722-9868-d2cc6cf5e7cb","Type":"ContainerStarted","Data":"2211d8810c8b6fe7d02036d6ee73e0f9792ee4caa0dd63d183dc3de302d28f25"} Oct 08 21:42:50 crc kubenswrapper[4669]: I1008 21:42:50.619767 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pmrzj/must-gather-jfxtf" event={"ID":"345c331b-742a-4722-9868-d2cc6cf5e7cb","Type":"ContainerStarted","Data":"595c2359c028040ae502e802137f1eed665d80bb2cb0b7d64cbdeb6d524c0860"} Oct 08 21:42:50 crc kubenswrapper[4669]: I1008 21:42:50.640300 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-pmrzj/must-gather-jfxtf" podStartSLOduration=2.011834348 podStartE2EDuration="5.640281568s" podCreationTimestamp="2025-10-08 21:42:45 +0000 UTC" firstStartedPulling="2025-10-08 21:42:45.950965436 +0000 UTC m=+3485.643776109" lastFinishedPulling="2025-10-08 21:42:49.579412656 +0000 UTC m=+3489.272223329" observedRunningTime="2025-10-08 21:42:50.63328851 +0000 UTC m=+3490.326099223" 
watchObservedRunningTime="2025-10-08 21:42:50.640281568 +0000 UTC m=+3490.333092241" Oct 08 21:42:53 crc kubenswrapper[4669]: I1008 21:42:53.435787 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pmrzj/crc-debug-qs64n"] Oct 08 21:42:53 crc kubenswrapper[4669]: I1008 21:42:53.438160 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pmrzj/crc-debug-qs64n" Oct 08 21:42:53 crc kubenswrapper[4669]: I1008 21:42:53.577041 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/60300f81-bfdb-4b02-bf2c-52064e8f633b-host\") pod \"crc-debug-qs64n\" (UID: \"60300f81-bfdb-4b02-bf2c-52064e8f633b\") " pod="openshift-must-gather-pmrzj/crc-debug-qs64n" Oct 08 21:42:53 crc kubenswrapper[4669]: I1008 21:42:53.577437 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kknnv\" (UniqueName: \"kubernetes.io/projected/60300f81-bfdb-4b02-bf2c-52064e8f633b-kube-api-access-kknnv\") pod \"crc-debug-qs64n\" (UID: \"60300f81-bfdb-4b02-bf2c-52064e8f633b\") " pod="openshift-must-gather-pmrzj/crc-debug-qs64n" Oct 08 21:42:53 crc kubenswrapper[4669]: I1008 21:42:53.678690 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kknnv\" (UniqueName: \"kubernetes.io/projected/60300f81-bfdb-4b02-bf2c-52064e8f633b-kube-api-access-kknnv\") pod \"crc-debug-qs64n\" (UID: \"60300f81-bfdb-4b02-bf2c-52064e8f633b\") " pod="openshift-must-gather-pmrzj/crc-debug-qs64n" Oct 08 21:42:53 crc kubenswrapper[4669]: I1008 21:42:53.678848 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/60300f81-bfdb-4b02-bf2c-52064e8f633b-host\") pod \"crc-debug-qs64n\" (UID: \"60300f81-bfdb-4b02-bf2c-52064e8f633b\") " pod="openshift-must-gather-pmrzj/crc-debug-qs64n" Oct 08 
21:42:53 crc kubenswrapper[4669]: I1008 21:42:53.679014 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/60300f81-bfdb-4b02-bf2c-52064e8f633b-host\") pod \"crc-debug-qs64n\" (UID: \"60300f81-bfdb-4b02-bf2c-52064e8f633b\") " pod="openshift-must-gather-pmrzj/crc-debug-qs64n" Oct 08 21:42:53 crc kubenswrapper[4669]: I1008 21:42:53.706426 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kknnv\" (UniqueName: \"kubernetes.io/projected/60300f81-bfdb-4b02-bf2c-52064e8f633b-kube-api-access-kknnv\") pod \"crc-debug-qs64n\" (UID: \"60300f81-bfdb-4b02-bf2c-52064e8f633b\") " pod="openshift-must-gather-pmrzj/crc-debug-qs64n" Oct 08 21:42:53 crc kubenswrapper[4669]: I1008 21:42:53.755703 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pmrzj/crc-debug-qs64n" Oct 08 21:42:53 crc kubenswrapper[4669]: W1008 21:42:53.798508 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60300f81_bfdb_4b02_bf2c_52064e8f633b.slice/crio-20cad7ff088640bdf2a194908f28ee057ae772ab8c8d1781a4682082a0c6aeaa WatchSource:0}: Error finding container 20cad7ff088640bdf2a194908f28ee057ae772ab8c8d1781a4682082a0c6aeaa: Status 404 returned error can't find the container with id 20cad7ff088640bdf2a194908f28ee057ae772ab8c8d1781a4682082a0c6aeaa Oct 08 21:42:54 crc kubenswrapper[4669]: I1008 21:42:54.655045 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pmrzj/crc-debug-qs64n" event={"ID":"60300f81-bfdb-4b02-bf2c-52064e8f633b","Type":"ContainerStarted","Data":"20cad7ff088640bdf2a194908f28ee057ae772ab8c8d1781a4682082a0c6aeaa"} Oct 08 21:43:04 crc kubenswrapper[4669]: I1008 21:43:04.749306 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pmrzj/crc-debug-qs64n" 
event={"ID":"60300f81-bfdb-4b02-bf2c-52064e8f633b","Type":"ContainerStarted","Data":"477941b0d8261389ff81f0e0d66c2bf18ae4a7d587717345cec54ea4c66277b5"} Oct 08 21:43:04 crc kubenswrapper[4669]: I1008 21:43:04.763583 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-pmrzj/crc-debug-qs64n" podStartSLOduration=1.614449827 podStartE2EDuration="11.763566881s" podCreationTimestamp="2025-10-08 21:42:53 +0000 UTC" firstStartedPulling="2025-10-08 21:42:53.80237341 +0000 UTC m=+3493.495184073" lastFinishedPulling="2025-10-08 21:43:03.951490454 +0000 UTC m=+3503.644301127" observedRunningTime="2025-10-08 21:43:04.760796053 +0000 UTC m=+3504.453606736" watchObservedRunningTime="2025-10-08 21:43:04.763566881 +0000 UTC m=+3504.456377564" Oct 08 21:43:13 crc kubenswrapper[4669]: I1008 21:43:13.185394 4669 patch_prober.go:28] interesting pod/machine-config-daemon-hw2kf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 21:43:13 crc kubenswrapper[4669]: I1008 21:43:13.185945 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 21:43:42 crc kubenswrapper[4669]: I1008 21:43:42.093316 4669 generic.go:334] "Generic (PLEG): container finished" podID="60300f81-bfdb-4b02-bf2c-52064e8f633b" containerID="477941b0d8261389ff81f0e0d66c2bf18ae4a7d587717345cec54ea4c66277b5" exitCode=0 Oct 08 21:43:42 crc kubenswrapper[4669]: I1008 21:43:42.093783 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pmrzj/crc-debug-qs64n" 
event={"ID":"60300f81-bfdb-4b02-bf2c-52064e8f633b","Type":"ContainerDied","Data":"477941b0d8261389ff81f0e0d66c2bf18ae4a7d587717345cec54ea4c66277b5"} Oct 08 21:43:43 crc kubenswrapper[4669]: I1008 21:43:43.185152 4669 patch_prober.go:28] interesting pod/machine-config-daemon-hw2kf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 21:43:43 crc kubenswrapper[4669]: I1008 21:43:43.185567 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 21:43:43 crc kubenswrapper[4669]: I1008 21:43:43.185631 4669 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" Oct 08 21:43:43 crc kubenswrapper[4669]: I1008 21:43:43.186524 4669 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d99f2404f55700ee94ff2a27e207059d96fd41084a3b30b6aa74eb587d837286"} pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 21:43:43 crc kubenswrapper[4669]: I1008 21:43:43.186626 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" containerID="cri-o://d99f2404f55700ee94ff2a27e207059d96fd41084a3b30b6aa74eb587d837286" gracePeriod=600 Oct 08 21:43:43 crc kubenswrapper[4669]: I1008 21:43:43.199572 4669 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pmrzj/crc-debug-qs64n" Oct 08 21:43:43 crc kubenswrapper[4669]: I1008 21:43:43.257909 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-pmrzj/crc-debug-qs64n"] Oct 08 21:43:43 crc kubenswrapper[4669]: I1008 21:43:43.272269 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-pmrzj/crc-debug-qs64n"] Oct 08 21:43:43 crc kubenswrapper[4669]: E1008 21:43:43.309530 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:43:43 crc kubenswrapper[4669]: I1008 21:43:43.333992 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/60300f81-bfdb-4b02-bf2c-52064e8f633b-host\") pod \"60300f81-bfdb-4b02-bf2c-52064e8f633b\" (UID: \"60300f81-bfdb-4b02-bf2c-52064e8f633b\") " Oct 08 21:43:43 crc kubenswrapper[4669]: I1008 21:43:43.334085 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kknnv\" (UniqueName: \"kubernetes.io/projected/60300f81-bfdb-4b02-bf2c-52064e8f633b-kube-api-access-kknnv\") pod \"60300f81-bfdb-4b02-bf2c-52064e8f633b\" (UID: \"60300f81-bfdb-4b02-bf2c-52064e8f633b\") " Oct 08 21:43:43 crc kubenswrapper[4669]: I1008 21:43:43.334136 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/60300f81-bfdb-4b02-bf2c-52064e8f633b-host" (OuterVolumeSpecName: "host") pod "60300f81-bfdb-4b02-bf2c-52064e8f633b" (UID: "60300f81-bfdb-4b02-bf2c-52064e8f633b"). 
InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 21:43:43 crc kubenswrapper[4669]: I1008 21:43:43.335060 4669 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/60300f81-bfdb-4b02-bf2c-52064e8f633b-host\") on node \"crc\" DevicePath \"\"" Oct 08 21:43:43 crc kubenswrapper[4669]: I1008 21:43:43.341808 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60300f81-bfdb-4b02-bf2c-52064e8f633b-kube-api-access-kknnv" (OuterVolumeSpecName: "kube-api-access-kknnv") pod "60300f81-bfdb-4b02-bf2c-52064e8f633b" (UID: "60300f81-bfdb-4b02-bf2c-52064e8f633b"). InnerVolumeSpecName "kube-api-access-kknnv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:43:43 crc kubenswrapper[4669]: I1008 21:43:43.343220 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60300f81-bfdb-4b02-bf2c-52064e8f633b" path="/var/lib/kubelet/pods/60300f81-bfdb-4b02-bf2c-52064e8f633b/volumes" Oct 08 21:43:43 crc kubenswrapper[4669]: I1008 21:43:43.436258 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kknnv\" (UniqueName: \"kubernetes.io/projected/60300f81-bfdb-4b02-bf2c-52064e8f633b-kube-api-access-kknnv\") on node \"crc\" DevicePath \"\"" Oct 08 21:43:44 crc kubenswrapper[4669]: I1008 21:43:44.119383 4669 generic.go:334] "Generic (PLEG): container finished" podID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerID="d99f2404f55700ee94ff2a27e207059d96fd41084a3b30b6aa74eb587d837286" exitCode=0 Oct 08 21:43:44 crc kubenswrapper[4669]: I1008 21:43:44.119444 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" event={"ID":"39c9bcf2-9580-4534-8c7e-886bd4aff469","Type":"ContainerDied","Data":"d99f2404f55700ee94ff2a27e207059d96fd41084a3b30b6aa74eb587d837286"} Oct 08 21:43:44 crc kubenswrapper[4669]: I1008 21:43:44.119768 4669 scope.go:117] 
"RemoveContainer" containerID="04b13f640358f5e7909dc5cd3cf3c01285108fb23136ef56f11945221a6b0387" Oct 08 21:43:44 crc kubenswrapper[4669]: I1008 21:43:44.120451 4669 scope.go:117] "RemoveContainer" containerID="d99f2404f55700ee94ff2a27e207059d96fd41084a3b30b6aa74eb587d837286" Oct 08 21:43:44 crc kubenswrapper[4669]: E1008 21:43:44.120773 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:43:44 crc kubenswrapper[4669]: I1008 21:43:44.122828 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pmrzj/crc-debug-qs64n" Oct 08 21:43:44 crc kubenswrapper[4669]: I1008 21:43:44.157978 4669 scope.go:117] "RemoveContainer" containerID="477941b0d8261389ff81f0e0d66c2bf18ae4a7d587717345cec54ea4c66277b5" Oct 08 21:43:44 crc kubenswrapper[4669]: I1008 21:43:44.437598 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pmrzj/crc-debug-b8bh2"] Oct 08 21:43:44 crc kubenswrapper[4669]: E1008 21:43:44.438887 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60300f81-bfdb-4b02-bf2c-52064e8f633b" containerName="container-00" Oct 08 21:43:44 crc kubenswrapper[4669]: I1008 21:43:44.438981 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="60300f81-bfdb-4b02-bf2c-52064e8f633b" containerName="container-00" Oct 08 21:43:44 crc kubenswrapper[4669]: I1008 21:43:44.439229 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="60300f81-bfdb-4b02-bf2c-52064e8f633b" containerName="container-00" Oct 08 21:43:44 crc kubenswrapper[4669]: I1008 21:43:44.439871 4669 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-must-gather-pmrzj/crc-debug-b8bh2" Oct 08 21:43:44 crc kubenswrapper[4669]: I1008 21:43:44.554606 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c96ee9c4-b03d-4361-8c4d-ddc70aa3f179-host\") pod \"crc-debug-b8bh2\" (UID: \"c96ee9c4-b03d-4361-8c4d-ddc70aa3f179\") " pod="openshift-must-gather-pmrzj/crc-debug-b8bh2" Oct 08 21:43:44 crc kubenswrapper[4669]: I1008 21:43:44.554683 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c429f\" (UniqueName: \"kubernetes.io/projected/c96ee9c4-b03d-4361-8c4d-ddc70aa3f179-kube-api-access-c429f\") pod \"crc-debug-b8bh2\" (UID: \"c96ee9c4-b03d-4361-8c4d-ddc70aa3f179\") " pod="openshift-must-gather-pmrzj/crc-debug-b8bh2" Oct 08 21:43:44 crc kubenswrapper[4669]: I1008 21:43:44.656787 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c96ee9c4-b03d-4361-8c4d-ddc70aa3f179-host\") pod \"crc-debug-b8bh2\" (UID: \"c96ee9c4-b03d-4361-8c4d-ddc70aa3f179\") " pod="openshift-must-gather-pmrzj/crc-debug-b8bh2" Oct 08 21:43:44 crc kubenswrapper[4669]: I1008 21:43:44.657118 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c429f\" (UniqueName: \"kubernetes.io/projected/c96ee9c4-b03d-4361-8c4d-ddc70aa3f179-kube-api-access-c429f\") pod \"crc-debug-b8bh2\" (UID: \"c96ee9c4-b03d-4361-8c4d-ddc70aa3f179\") " pod="openshift-must-gather-pmrzj/crc-debug-b8bh2" Oct 08 21:43:44 crc kubenswrapper[4669]: I1008 21:43:44.656932 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c96ee9c4-b03d-4361-8c4d-ddc70aa3f179-host\") pod \"crc-debug-b8bh2\" (UID: \"c96ee9c4-b03d-4361-8c4d-ddc70aa3f179\") " pod="openshift-must-gather-pmrzj/crc-debug-b8bh2" Oct 08 
21:43:44 crc kubenswrapper[4669]: I1008 21:43:44.676870 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c429f\" (UniqueName: \"kubernetes.io/projected/c96ee9c4-b03d-4361-8c4d-ddc70aa3f179-kube-api-access-c429f\") pod \"crc-debug-b8bh2\" (UID: \"c96ee9c4-b03d-4361-8c4d-ddc70aa3f179\") " pod="openshift-must-gather-pmrzj/crc-debug-b8bh2" Oct 08 21:43:44 crc kubenswrapper[4669]: I1008 21:43:44.757277 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pmrzj/crc-debug-b8bh2" Oct 08 21:43:44 crc kubenswrapper[4669]: W1008 21:43:44.786623 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc96ee9c4_b03d_4361_8c4d_ddc70aa3f179.slice/crio-23f053a3d9ce89f3f8ba7688e7c996146a3990658348cd04bb71f85866fb7840 WatchSource:0}: Error finding container 23f053a3d9ce89f3f8ba7688e7c996146a3990658348cd04bb71f85866fb7840: Status 404 returned error can't find the container with id 23f053a3d9ce89f3f8ba7688e7c996146a3990658348cd04bb71f85866fb7840 Oct 08 21:43:45 crc kubenswrapper[4669]: I1008 21:43:45.136690 4669 generic.go:334] "Generic (PLEG): container finished" podID="c96ee9c4-b03d-4361-8c4d-ddc70aa3f179" containerID="459727e7cd3e7eac5fcf8bd1cb7d104c40abe70929c3a3336ea4dfb6c377b195" exitCode=0 Oct 08 21:43:45 crc kubenswrapper[4669]: I1008 21:43:45.136742 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pmrzj/crc-debug-b8bh2" event={"ID":"c96ee9c4-b03d-4361-8c4d-ddc70aa3f179","Type":"ContainerDied","Data":"459727e7cd3e7eac5fcf8bd1cb7d104c40abe70929c3a3336ea4dfb6c377b195"} Oct 08 21:43:45 crc kubenswrapper[4669]: I1008 21:43:45.137179 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pmrzj/crc-debug-b8bh2" 
event={"ID":"c96ee9c4-b03d-4361-8c4d-ddc70aa3f179","Type":"ContainerStarted","Data":"23f053a3d9ce89f3f8ba7688e7c996146a3990658348cd04bb71f85866fb7840"} Oct 08 21:43:45 crc kubenswrapper[4669]: I1008 21:43:45.660481 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-pmrzj/crc-debug-b8bh2"] Oct 08 21:43:45 crc kubenswrapper[4669]: I1008 21:43:45.668709 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-pmrzj/crc-debug-b8bh2"] Oct 08 21:43:46 crc kubenswrapper[4669]: I1008 21:43:46.277575 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pmrzj/crc-debug-b8bh2" Oct 08 21:43:46 crc kubenswrapper[4669]: I1008 21:43:46.387967 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c96ee9c4-b03d-4361-8c4d-ddc70aa3f179-host\") pod \"c96ee9c4-b03d-4361-8c4d-ddc70aa3f179\" (UID: \"c96ee9c4-b03d-4361-8c4d-ddc70aa3f179\") " Oct 08 21:43:46 crc kubenswrapper[4669]: I1008 21:43:46.388091 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c96ee9c4-b03d-4361-8c4d-ddc70aa3f179-host" (OuterVolumeSpecName: "host") pod "c96ee9c4-b03d-4361-8c4d-ddc70aa3f179" (UID: "c96ee9c4-b03d-4361-8c4d-ddc70aa3f179"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 21:43:46 crc kubenswrapper[4669]: I1008 21:43:46.388495 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c429f\" (UniqueName: \"kubernetes.io/projected/c96ee9c4-b03d-4361-8c4d-ddc70aa3f179-kube-api-access-c429f\") pod \"c96ee9c4-b03d-4361-8c4d-ddc70aa3f179\" (UID: \"c96ee9c4-b03d-4361-8c4d-ddc70aa3f179\") " Oct 08 21:43:46 crc kubenswrapper[4669]: I1008 21:43:46.389999 4669 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c96ee9c4-b03d-4361-8c4d-ddc70aa3f179-host\") on node \"crc\" DevicePath \"\"" Oct 08 21:43:46 crc kubenswrapper[4669]: I1008 21:43:46.394627 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c96ee9c4-b03d-4361-8c4d-ddc70aa3f179-kube-api-access-c429f" (OuterVolumeSpecName: "kube-api-access-c429f") pod "c96ee9c4-b03d-4361-8c4d-ddc70aa3f179" (UID: "c96ee9c4-b03d-4361-8c4d-ddc70aa3f179"). InnerVolumeSpecName "kube-api-access-c429f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:43:46 crc kubenswrapper[4669]: I1008 21:43:46.492813 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c429f\" (UniqueName: \"kubernetes.io/projected/c96ee9c4-b03d-4361-8c4d-ddc70aa3f179-kube-api-access-c429f\") on node \"crc\" DevicePath \"\"" Oct 08 21:43:46 crc kubenswrapper[4669]: I1008 21:43:46.885961 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pmrzj/crc-debug-dp8wd"] Oct 08 21:43:46 crc kubenswrapper[4669]: E1008 21:43:46.886403 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c96ee9c4-b03d-4361-8c4d-ddc70aa3f179" containerName="container-00" Oct 08 21:43:46 crc kubenswrapper[4669]: I1008 21:43:46.886416 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="c96ee9c4-b03d-4361-8c4d-ddc70aa3f179" containerName="container-00" Oct 08 21:43:46 crc kubenswrapper[4669]: I1008 21:43:46.886661 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="c96ee9c4-b03d-4361-8c4d-ddc70aa3f179" containerName="container-00" Oct 08 21:43:46 crc kubenswrapper[4669]: I1008 21:43:46.887763 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pmrzj/crc-debug-dp8wd" Oct 08 21:43:47 crc kubenswrapper[4669]: I1008 21:43:47.001124 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5jgj\" (UniqueName: \"kubernetes.io/projected/3eecb272-61d7-4c17-940b-aa95f6c37dff-kube-api-access-n5jgj\") pod \"crc-debug-dp8wd\" (UID: \"3eecb272-61d7-4c17-940b-aa95f6c37dff\") " pod="openshift-must-gather-pmrzj/crc-debug-dp8wd" Oct 08 21:43:47 crc kubenswrapper[4669]: I1008 21:43:47.001179 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3eecb272-61d7-4c17-940b-aa95f6c37dff-host\") pod \"crc-debug-dp8wd\" (UID: \"3eecb272-61d7-4c17-940b-aa95f6c37dff\") " pod="openshift-must-gather-pmrzj/crc-debug-dp8wd" Oct 08 21:43:47 crc kubenswrapper[4669]: I1008 21:43:47.103901 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5jgj\" (UniqueName: \"kubernetes.io/projected/3eecb272-61d7-4c17-940b-aa95f6c37dff-kube-api-access-n5jgj\") pod \"crc-debug-dp8wd\" (UID: \"3eecb272-61d7-4c17-940b-aa95f6c37dff\") " pod="openshift-must-gather-pmrzj/crc-debug-dp8wd" Oct 08 21:43:47 crc kubenswrapper[4669]: I1008 21:43:47.103959 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3eecb272-61d7-4c17-940b-aa95f6c37dff-host\") pod \"crc-debug-dp8wd\" (UID: \"3eecb272-61d7-4c17-940b-aa95f6c37dff\") " pod="openshift-must-gather-pmrzj/crc-debug-dp8wd" Oct 08 21:43:47 crc kubenswrapper[4669]: I1008 21:43:47.104076 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3eecb272-61d7-4c17-940b-aa95f6c37dff-host\") pod \"crc-debug-dp8wd\" (UID: \"3eecb272-61d7-4c17-940b-aa95f6c37dff\") " pod="openshift-must-gather-pmrzj/crc-debug-dp8wd" Oct 08 21:43:47 crc 
kubenswrapper[4669]: I1008 21:43:47.130303 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5jgj\" (UniqueName: \"kubernetes.io/projected/3eecb272-61d7-4c17-940b-aa95f6c37dff-kube-api-access-n5jgj\") pod \"crc-debug-dp8wd\" (UID: \"3eecb272-61d7-4c17-940b-aa95f6c37dff\") " pod="openshift-must-gather-pmrzj/crc-debug-dp8wd" Oct 08 21:43:47 crc kubenswrapper[4669]: I1008 21:43:47.196862 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23f053a3d9ce89f3f8ba7688e7c996146a3990658348cd04bb71f85866fb7840" Oct 08 21:43:47 crc kubenswrapper[4669]: I1008 21:43:47.196946 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pmrzj/crc-debug-b8bh2" Oct 08 21:43:47 crc kubenswrapper[4669]: I1008 21:43:47.215880 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pmrzj/crc-debug-dp8wd" Oct 08 21:43:47 crc kubenswrapper[4669]: W1008 21:43:47.255608 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3eecb272_61d7_4c17_940b_aa95f6c37dff.slice/crio-02fb6d34538adf075c0dd07533cd978122ec01256a462f420aaee573c3465590 WatchSource:0}: Error finding container 02fb6d34538adf075c0dd07533cd978122ec01256a462f420aaee573c3465590: Status 404 returned error can't find the container with id 02fb6d34538adf075c0dd07533cd978122ec01256a462f420aaee573c3465590 Oct 08 21:43:47 crc kubenswrapper[4669]: I1008 21:43:47.344394 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c96ee9c4-b03d-4361-8c4d-ddc70aa3f179" path="/var/lib/kubelet/pods/c96ee9c4-b03d-4361-8c4d-ddc70aa3f179/volumes" Oct 08 21:43:48 crc kubenswrapper[4669]: I1008 21:43:48.206970 4669 generic.go:334] "Generic (PLEG): container finished" podID="3eecb272-61d7-4c17-940b-aa95f6c37dff" 
containerID="eb0de8e9869973ffacf3cd1e17baa5c0972134b8aa625c981717b4cb3d1597ce" exitCode=0 Oct 08 21:43:48 crc kubenswrapper[4669]: I1008 21:43:48.207093 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pmrzj/crc-debug-dp8wd" event={"ID":"3eecb272-61d7-4c17-940b-aa95f6c37dff","Type":"ContainerDied","Data":"eb0de8e9869973ffacf3cd1e17baa5c0972134b8aa625c981717b4cb3d1597ce"} Oct 08 21:43:48 crc kubenswrapper[4669]: I1008 21:43:48.207304 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pmrzj/crc-debug-dp8wd" event={"ID":"3eecb272-61d7-4c17-940b-aa95f6c37dff","Type":"ContainerStarted","Data":"02fb6d34538adf075c0dd07533cd978122ec01256a462f420aaee573c3465590"} Oct 08 21:43:48 crc kubenswrapper[4669]: I1008 21:43:48.300965 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-pmrzj/crc-debug-dp8wd"] Oct 08 21:43:48 crc kubenswrapper[4669]: I1008 21:43:48.307350 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-pmrzj/crc-debug-dp8wd"] Oct 08 21:43:49 crc kubenswrapper[4669]: I1008 21:43:49.312308 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pmrzj/crc-debug-dp8wd" Oct 08 21:43:49 crc kubenswrapper[4669]: I1008 21:43:49.376017 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5fbf49878d-2r5dt_6311b31e-a85f-4bc0-9c1a-254c1650ef17/barbican-api/0.log" Oct 08 21:43:49 crc kubenswrapper[4669]: I1008 21:43:49.459035 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5jgj\" (UniqueName: \"kubernetes.io/projected/3eecb272-61d7-4c17-940b-aa95f6c37dff-kube-api-access-n5jgj\") pod \"3eecb272-61d7-4c17-940b-aa95f6c37dff\" (UID: \"3eecb272-61d7-4c17-940b-aa95f6c37dff\") " Oct 08 21:43:49 crc kubenswrapper[4669]: I1008 21:43:49.459114 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3eecb272-61d7-4c17-940b-aa95f6c37dff-host\") pod \"3eecb272-61d7-4c17-940b-aa95f6c37dff\" (UID: \"3eecb272-61d7-4c17-940b-aa95f6c37dff\") " Oct 08 21:43:49 crc kubenswrapper[4669]: I1008 21:43:49.459236 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3eecb272-61d7-4c17-940b-aa95f6c37dff-host" (OuterVolumeSpecName: "host") pod "3eecb272-61d7-4c17-940b-aa95f6c37dff" (UID: "3eecb272-61d7-4c17-940b-aa95f6c37dff"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 21:43:49 crc kubenswrapper[4669]: I1008 21:43:49.459567 4669 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3eecb272-61d7-4c17-940b-aa95f6c37dff-host\") on node \"crc\" DevicePath \"\"" Oct 08 21:43:49 crc kubenswrapper[4669]: I1008 21:43:49.471175 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3eecb272-61d7-4c17-940b-aa95f6c37dff-kube-api-access-n5jgj" (OuterVolumeSpecName: "kube-api-access-n5jgj") pod "3eecb272-61d7-4c17-940b-aa95f6c37dff" (UID: "3eecb272-61d7-4c17-940b-aa95f6c37dff"). InnerVolumeSpecName "kube-api-access-n5jgj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:43:49 crc kubenswrapper[4669]: I1008 21:43:49.547686 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-69d49dbff8-2tq27_b1994a50-3452-488f-b364-1b1377cfd62d/barbican-keystone-listener/0.log" Oct 08 21:43:49 crc kubenswrapper[4669]: I1008 21:43:49.561577 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5jgj\" (UniqueName: \"kubernetes.io/projected/3eecb272-61d7-4c17-940b-aa95f6c37dff-kube-api-access-n5jgj\") on node \"crc\" DevicePath \"\"" Oct 08 21:43:49 crc kubenswrapper[4669]: I1008 21:43:49.563694 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5fbf49878d-2r5dt_6311b31e-a85f-4bc0-9c1a-254c1650ef17/barbican-api-log/0.log" Oct 08 21:43:49 crc kubenswrapper[4669]: I1008 21:43:49.637505 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-69d49dbff8-2tq27_b1994a50-3452-488f-b364-1b1377cfd62d/barbican-keystone-listener-log/0.log" Oct 08 21:43:49 crc kubenswrapper[4669]: I1008 21:43:49.743784 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-worker-8596cf69cc-5lmdk_ae3a1ea5-4d35-4e2a-b87b-e393bf16b90c/barbican-worker/0.log" Oct 08 21:43:49 crc kubenswrapper[4669]: I1008 21:43:49.778123 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-8596cf69cc-5lmdk_ae3a1ea5-4d35-4e2a-b87b-e393bf16b90c/barbican-worker-log/0.log" Oct 08 21:43:49 crc kubenswrapper[4669]: I1008 21:43:49.955203 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-s69cr_d54b9af7-032e-4b63-ada5-0cebab9e052d/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 21:43:49 crc kubenswrapper[4669]: I1008 21:43:49.986153 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5e6cd0ba-8231-4bc6-bab5-83f4b8740c01/ceilometer-central-agent/0.log" Oct 08 21:43:50 crc kubenswrapper[4669]: I1008 21:43:50.104986 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5e6cd0ba-8231-4bc6-bab5-83f4b8740c01/ceilometer-notification-agent/0.log" Oct 08 21:43:50 crc kubenswrapper[4669]: I1008 21:43:50.161244 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5e6cd0ba-8231-4bc6-bab5-83f4b8740c01/sg-core/0.log" Oct 08 21:43:50 crc kubenswrapper[4669]: I1008 21:43:50.193182 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5e6cd0ba-8231-4bc6-bab5-83f4b8740c01/proxy-httpd/0.log" Oct 08 21:43:50 crc kubenswrapper[4669]: I1008 21:43:50.222327 4669 scope.go:117] "RemoveContainer" containerID="eb0de8e9869973ffacf3cd1e17baa5c0972134b8aa625c981717b4cb3d1597ce" Oct 08 21:43:50 crc kubenswrapper[4669]: I1008 21:43:50.222382 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pmrzj/crc-debug-dp8wd" Oct 08 21:43:50 crc kubenswrapper[4669]: I1008 21:43:50.361151 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_42082ba5-0485-486f-8b1c-17cf7c0fc405/cinder-api/0.log" Oct 08 21:43:50 crc kubenswrapper[4669]: I1008 21:43:50.387673 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_42082ba5-0485-486f-8b1c-17cf7c0fc405/cinder-api-log/0.log" Oct 08 21:43:50 crc kubenswrapper[4669]: I1008 21:43:50.462238 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_3873e946-b682-46b0-9b31-c34217bed686/cinder-scheduler/0.log" Oct 08 21:43:50 crc kubenswrapper[4669]: I1008 21:43:50.594739 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_3873e946-b682-46b0-9b31-c34217bed686/probe/0.log" Oct 08 21:43:50 crc kubenswrapper[4669]: I1008 21:43:50.655007 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-6gzjx_e84616ce-4d73-4f8f-85b3-cca04e509792/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 21:43:50 crc kubenswrapper[4669]: I1008 21:43:50.777171 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-6xbff_1d09eff3-6572-462c-91db-4a1f5f167eae/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 21:43:50 crc kubenswrapper[4669]: I1008 21:43:50.835138 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-tgph4_e78c98eb-ee71-4877-92f0-edfa9eaca8e5/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 21:43:50 crc kubenswrapper[4669]: I1008 21:43:50.968954 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-czgxg_2626e058-7115-4198-91ab-19e6f98dfc89/init/0.log" Oct 08 
21:43:51 crc kubenswrapper[4669]: I1008 21:43:51.149260 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-czgxg_2626e058-7115-4198-91ab-19e6f98dfc89/init/0.log" Oct 08 21:43:51 crc kubenswrapper[4669]: I1008 21:43:51.155292 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-czgxg_2626e058-7115-4198-91ab-19e6f98dfc89/dnsmasq-dns/0.log" Oct 08 21:43:51 crc kubenswrapper[4669]: I1008 21:43:51.189553 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-2d5kw_8f08a947-ff60-4018-80f6-0098a257eddf/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 21:43:51 crc kubenswrapper[4669]: I1008 21:43:51.333049 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_8b41c8f6-5c23-40d7-9e16-d21a6faf4c0e/glance-httpd/0.log" Oct 08 21:43:51 crc kubenswrapper[4669]: I1008 21:43:51.341060 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3eecb272-61d7-4c17-940b-aa95f6c37dff" path="/var/lib/kubelet/pods/3eecb272-61d7-4c17-940b-aa95f6c37dff/volumes" Oct 08 21:43:51 crc kubenswrapper[4669]: I1008 21:43:51.348452 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_8b41c8f6-5c23-40d7-9e16-d21a6faf4c0e/glance-log/0.log" Oct 08 21:43:51 crc kubenswrapper[4669]: I1008 21:43:51.658694 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_7c63c33c-d91e-49b9-8b85-50960824149b/glance-httpd/0.log" Oct 08 21:43:51 crc kubenswrapper[4669]: I1008 21:43:51.687595 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_7c63c33c-d91e-49b9-8b85-50960824149b/glance-log/0.log" Oct 08 21:43:51 crc kubenswrapper[4669]: I1008 21:43:51.851855 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_horizon-5fc57f5668-z5dzm_43e0f642-1a58-481b-8347-b4d29176ddc5/horizon/0.log" Oct 08 21:43:51 crc kubenswrapper[4669]: I1008 21:43:51.970312 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-72m9r_296bfc36-ce85-4db3-a692-acf7edf869b1/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 21:43:52 crc kubenswrapper[4669]: I1008 21:43:52.173543 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-hk4fz_30bc939f-3290-4c46-8d00-120c0bf33951/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 21:43:52 crc kubenswrapper[4669]: I1008 21:43:52.218203 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5fc57f5668-z5dzm_43e0f642-1a58-481b-8347-b4d29176ddc5/horizon-log/0.log" Oct 08 21:43:52 crc kubenswrapper[4669]: I1008 21:43:52.452749 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_8af63aa0-e5df-488c-b5e4-4677c9d0f2de/kube-state-metrics/0.log" Oct 08 21:43:52 crc kubenswrapper[4669]: I1008 21:43:52.477219 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-6877c668b7-ws6lk_d35cc358-f26a-4e29-a61e-cf7e82c331a7/keystone-api/0.log" Oct 08 21:43:52 crc kubenswrapper[4669]: I1008 21:43:52.645920 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-ph9v2_f0494a3d-36c9-4d26-8f15-c1780af52f46/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 21:43:53 crc kubenswrapper[4669]: I1008 21:43:53.038361 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-954pg_2c0c5d80-cf44-45bb-847d-839dc3fd8887/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 21:43:53 crc kubenswrapper[4669]: I1008 21:43:53.086661 4669 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_neutron-6f968c55b5-frgnz_1e273869-5ed1-48c3-af8f-d4d2df61c9e7/neutron-httpd/0.log" Oct 08 21:43:53 crc kubenswrapper[4669]: I1008 21:43:53.097218 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6f968c55b5-frgnz_1e273869-5ed1-48c3-af8f-d4d2df61c9e7/neutron-api/0.log" Oct 08 21:43:53 crc kubenswrapper[4669]: I1008 21:43:53.668498 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_26677aac-fbac-4ec9-972c-e22c276549f2/nova-api-log/0.log" Oct 08 21:43:53 crc kubenswrapper[4669]: I1008 21:43:53.728990 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_eb3e439c-741b-4d40-a85e-1f6da93a485c/nova-cell0-conductor-conductor/0.log" Oct 08 21:43:53 crc kubenswrapper[4669]: I1008 21:43:53.874245 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_7c4f1ade-7c5a-45f6-8b72-33a192186209/nova-cell1-conductor-conductor/0.log" Oct 08 21:43:53 crc kubenswrapper[4669]: I1008 21:43:53.984629 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_26677aac-fbac-4ec9-972c-e22c276549f2/nova-api-api/0.log" Oct 08 21:43:54 crc kubenswrapper[4669]: I1008 21:43:54.017838 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_915f256e-a280-4726-8987-df1df9f8e4b5/nova-cell1-novncproxy-novncproxy/0.log" Oct 08 21:43:54 crc kubenswrapper[4669]: I1008 21:43:54.137982 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-tj7fq_9500e0b9-017f-4e4c-b72c-cbe0c98f7660/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 21:43:54 crc kubenswrapper[4669]: I1008 21:43:54.411520 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_0f57d146-c85b-4beb-a614-1ea878e175b4/nova-metadata-log/0.log" Oct 08 21:43:54 crc kubenswrapper[4669]: I1008 
21:43:54.672062 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_89cdc120-36f5-4203-b064-4300a8249a64/nova-scheduler-scheduler/0.log" Oct 08 21:43:54 crc kubenswrapper[4669]: I1008 21:43:54.687723 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_53fa562b-eb62-487e-8b82-3da0799fae19/mysql-bootstrap/0.log" Oct 08 21:43:54 crc kubenswrapper[4669]: I1008 21:43:54.830225 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_53fa562b-eb62-487e-8b82-3da0799fae19/mysql-bootstrap/0.log" Oct 08 21:43:54 crc kubenswrapper[4669]: I1008 21:43:54.934109 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_53fa562b-eb62-487e-8b82-3da0799fae19/galera/0.log" Oct 08 21:43:55 crc kubenswrapper[4669]: I1008 21:43:55.099900 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_33dac706-1170-45a3-8151-a6ee9bce8005/mysql-bootstrap/0.log" Oct 08 21:43:55 crc kubenswrapper[4669]: I1008 21:43:55.219103 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_33dac706-1170-45a3-8151-a6ee9bce8005/mysql-bootstrap/0.log" Oct 08 21:43:55 crc kubenswrapper[4669]: I1008 21:43:55.271679 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_33dac706-1170-45a3-8151-a6ee9bce8005/galera/0.log" Oct 08 21:43:55 crc kubenswrapper[4669]: I1008 21:43:55.461305 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_2aa3bf86-2604-4d46-bc73-13b5d049b01c/openstackclient/0.log" Oct 08 21:43:55 crc kubenswrapper[4669]: I1008 21:43:55.491059 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-2mvd2_2e96f6f2-b5f3-49e7-8d84-15d5535963a2/ovn-controller/0.log" Oct 08 21:43:55 crc kubenswrapper[4669]: I1008 21:43:55.652534 4669 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_nova-metadata-0_0f57d146-c85b-4beb-a614-1ea878e175b4/nova-metadata-metadata/0.log" Oct 08 21:43:55 crc kubenswrapper[4669]: I1008 21:43:55.707948 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-l4snx_2bc4b231-fe5a-4712-855f-3d56029e240b/openstack-network-exporter/0.log" Oct 08 21:43:55 crc kubenswrapper[4669]: I1008 21:43:55.864035 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-wnkk4_d1b7a290-4c6b-48ff-b23b-7fe2ba609dcc/ovsdb-server-init/0.log" Oct 08 21:43:56 crc kubenswrapper[4669]: I1008 21:43:56.025784 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-wnkk4_d1b7a290-4c6b-48ff-b23b-7fe2ba609dcc/ovsdb-server-init/0.log" Oct 08 21:43:56 crc kubenswrapper[4669]: I1008 21:43:56.102050 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-wnkk4_d1b7a290-4c6b-48ff-b23b-7fe2ba609dcc/ovsdb-server/0.log" Oct 08 21:43:56 crc kubenswrapper[4669]: I1008 21:43:56.109041 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-wnkk4_d1b7a290-4c6b-48ff-b23b-7fe2ba609dcc/ovs-vswitchd/0.log" Oct 08 21:43:56 crc kubenswrapper[4669]: I1008 21:43:56.234260 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-nlnn5_eae3dd15-c997-43e0-8362-8a9210634436/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 21:43:56 crc kubenswrapper[4669]: I1008 21:43:56.345342 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_685f13e1-1e56-46fb-b0b4-d850050411d7/openstack-network-exporter/0.log" Oct 08 21:43:56 crc kubenswrapper[4669]: I1008 21:43:56.361774 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_685f13e1-1e56-46fb-b0b4-d850050411d7/ovn-northd/0.log" Oct 08 21:43:56 crc kubenswrapper[4669]: I1008 
21:43:56.550009 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_70b1241c-d97a-4af6-9c95-dadad197012e/ovsdbserver-nb/0.log" Oct 08 21:43:56 crc kubenswrapper[4669]: I1008 21:43:56.561491 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_70b1241c-d97a-4af6-9c95-dadad197012e/openstack-network-exporter/0.log" Oct 08 21:43:56 crc kubenswrapper[4669]: I1008 21:43:56.775878 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_07bcdc9f-6970-49af-8620-63f8ed43845b/ovsdbserver-sb/0.log" Oct 08 21:43:56 crc kubenswrapper[4669]: I1008 21:43:56.782080 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_07bcdc9f-6970-49af-8620-63f8ed43845b/openstack-network-exporter/0.log" Oct 08 21:43:56 crc kubenswrapper[4669]: I1008 21:43:56.899947 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6f47799b5d-n27h7_9076c9d7-726e-4b80-80af-a78887da72d1/placement-api/0.log" Oct 08 21:43:57 crc kubenswrapper[4669]: I1008 21:43:57.086654 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_8e35f189-cd14-4892-a4d6-25a23a2ae04c/setup-container/0.log" Oct 08 21:43:57 crc kubenswrapper[4669]: I1008 21:43:57.109089 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6f47799b5d-n27h7_9076c9d7-726e-4b80-80af-a78887da72d1/placement-log/0.log" Oct 08 21:43:57 crc kubenswrapper[4669]: I1008 21:43:57.331840 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_0bfeeb02-715e-4358-802c-ce7ed6721a30/setup-container/0.log" Oct 08 21:43:57 crc kubenswrapper[4669]: I1008 21:43:57.348207 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_8e35f189-cd14-4892-a4d6-25a23a2ae04c/setup-container/0.log" Oct 08 21:43:57 crc kubenswrapper[4669]: I1008 21:43:57.440125 4669 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_8e35f189-cd14-4892-a4d6-25a23a2ae04c/rabbitmq/0.log" Oct 08 21:43:57 crc kubenswrapper[4669]: I1008 21:43:57.630412 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_0bfeeb02-715e-4358-802c-ce7ed6721a30/rabbitmq/0.log" Oct 08 21:43:57 crc kubenswrapper[4669]: I1008 21:43:57.635427 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_0bfeeb02-715e-4358-802c-ce7ed6721a30/setup-container/0.log" Oct 08 21:43:57 crc kubenswrapper[4669]: I1008 21:43:57.642917 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-sxvrk_ffbc24f2-90eb-4f42-b2aa-0290921dbb79/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 21:43:57 crc kubenswrapper[4669]: I1008 21:43:57.871451 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-b5rp4_405955bf-c08c-4afd-9720-41adf4bebd19/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 21:43:57 crc kubenswrapper[4669]: I1008 21:43:57.924121 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-n4czl_0bff2321-3d96-47bf-815e-7ab3cea9563a/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 21:43:58 crc kubenswrapper[4669]: I1008 21:43:58.053016 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-cl595_2557fef7-913e-4188-93e7-4a60c4b4c918/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 21:43:58 crc kubenswrapper[4669]: I1008 21:43:58.171837 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-m6tgw_f598594a-891b-4339-99d1-e10f7c3844af/ssh-known-hosts-edpm-deployment/0.log" Oct 08 21:43:58 crc kubenswrapper[4669]: I1008 21:43:58.336764 4669 
scope.go:117] "RemoveContainer" containerID="d99f2404f55700ee94ff2a27e207059d96fd41084a3b30b6aa74eb587d837286" Oct 08 21:43:58 crc kubenswrapper[4669]: E1008 21:43:58.337335 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:43:58 crc kubenswrapper[4669]: I1008 21:43:58.356222 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5f9445f759-bx7xs_895cb1ba-c212-4908-82e9-f5042a50686f/proxy-server/0.log" Oct 08 21:43:58 crc kubenswrapper[4669]: I1008 21:43:58.540970 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-58zf2_f1096bd8-53d4-4403-9abc-dd7a2c91c1e6/swift-ring-rebalance/0.log" Oct 08 21:43:58 crc kubenswrapper[4669]: I1008 21:43:58.553385 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5f9445f759-bx7xs_895cb1ba-c212-4908-82e9-f5042a50686f/proxy-httpd/0.log" Oct 08 21:43:58 crc kubenswrapper[4669]: I1008 21:43:58.637498 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_efef408f-7f0a-4eb1-a9f8-288a9606bf84/account-auditor/0.log" Oct 08 21:43:58 crc kubenswrapper[4669]: I1008 21:43:58.767925 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_efef408f-7f0a-4eb1-a9f8-288a9606bf84/account-reaper/0.log" Oct 08 21:43:58 crc kubenswrapper[4669]: I1008 21:43:58.792462 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_efef408f-7f0a-4eb1-a9f8-288a9606bf84/account-replicator/0.log" Oct 08 21:43:58 crc kubenswrapper[4669]: I1008 21:43:58.840685 4669 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_efef408f-7f0a-4eb1-a9f8-288a9606bf84/container-auditor/0.log" Oct 08 21:43:58 crc kubenswrapper[4669]: I1008 21:43:58.862686 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_efef408f-7f0a-4eb1-a9f8-288a9606bf84/account-server/0.log" Oct 08 21:43:58 crc kubenswrapper[4669]: I1008 21:43:58.965970 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_efef408f-7f0a-4eb1-a9f8-288a9606bf84/container-replicator/0.log" Oct 08 21:43:58 crc kubenswrapper[4669]: I1008 21:43:58.975282 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_efef408f-7f0a-4eb1-a9f8-288a9606bf84/container-server/0.log" Oct 08 21:43:59 crc kubenswrapper[4669]: I1008 21:43:59.041101 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_efef408f-7f0a-4eb1-a9f8-288a9606bf84/container-updater/0.log" Oct 08 21:43:59 crc kubenswrapper[4669]: I1008 21:43:59.108060 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_efef408f-7f0a-4eb1-a9f8-288a9606bf84/object-auditor/0.log" Oct 08 21:43:59 crc kubenswrapper[4669]: I1008 21:43:59.174421 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_efef408f-7f0a-4eb1-a9f8-288a9606bf84/object-expirer/0.log" Oct 08 21:43:59 crc kubenswrapper[4669]: I1008 21:43:59.225628 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_efef408f-7f0a-4eb1-a9f8-288a9606bf84/object-replicator/0.log" Oct 08 21:43:59 crc kubenswrapper[4669]: I1008 21:43:59.234329 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_efef408f-7f0a-4eb1-a9f8-288a9606bf84/object-server/0.log" Oct 08 21:43:59 crc kubenswrapper[4669]: I1008 21:43:59.313598 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_efef408f-7f0a-4eb1-a9f8-288a9606bf84/object-updater/0.log" Oct 08 21:43:59 crc kubenswrapper[4669]: I1008 21:43:59.387382 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_efef408f-7f0a-4eb1-a9f8-288a9606bf84/rsync/0.log" Oct 08 21:43:59 crc kubenswrapper[4669]: I1008 21:43:59.472837 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_efef408f-7f0a-4eb1-a9f8-288a9606bf84/swift-recon-cron/0.log" Oct 08 21:43:59 crc kubenswrapper[4669]: I1008 21:43:59.577240 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb_bb78e48d-ddf1-494e-883f-d9987d2f0f0a/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 21:43:59 crc kubenswrapper[4669]: I1008 21:43:59.712001 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_ad5f7082-536e-477e-a8a3-b5c4945b3b87/tempest-tests-tempest-tests-runner/0.log" Oct 08 21:43:59 crc kubenswrapper[4669]: I1008 21:43:59.804246 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_919e000b-619f-4e18-b6f6-4473d23718c9/test-operator-logs-container/0.log" Oct 08 21:43:59 crc kubenswrapper[4669]: I1008 21:43:59.956503 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-6xtnr_d7d06c6d-e606-429a-b7e6-e6e2609b3b4e/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 21:44:08 crc kubenswrapper[4669]: I1008 21:44:08.857135 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_cb42938c-1da6-4d93-b7be-ff78d294ebf1/memcached/0.log" Oct 08 21:44:10 crc kubenswrapper[4669]: I1008 21:44:10.331216 4669 scope.go:117] "RemoveContainer" containerID="d99f2404f55700ee94ff2a27e207059d96fd41084a3b30b6aa74eb587d837286" Oct 08 21:44:10 crc 
kubenswrapper[4669]: E1008 21:44:10.331604 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:44:22 crc kubenswrapper[4669]: I1008 21:44:22.303932 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7a759e5b31ccacf77cec3bae7360d28b9da09ce26b4a078cf08f610eefvtpmh_a6266c66-d403-46e9-ad99-54beedd6adf4/util/0.log" Oct 08 21:44:22 crc kubenswrapper[4669]: I1008 21:44:22.506577 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7a759e5b31ccacf77cec3bae7360d28b9da09ce26b4a078cf08f610eefvtpmh_a6266c66-d403-46e9-ad99-54beedd6adf4/util/0.log" Oct 08 21:44:22 crc kubenswrapper[4669]: I1008 21:44:22.517516 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7a759e5b31ccacf77cec3bae7360d28b9da09ce26b4a078cf08f610eefvtpmh_a6266c66-d403-46e9-ad99-54beedd6adf4/pull/0.log" Oct 08 21:44:22 crc kubenswrapper[4669]: I1008 21:44:22.521101 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7a759e5b31ccacf77cec3bae7360d28b9da09ce26b4a078cf08f610eefvtpmh_a6266c66-d403-46e9-ad99-54beedd6adf4/pull/0.log" Oct 08 21:44:22 crc kubenswrapper[4669]: I1008 21:44:22.666972 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7a759e5b31ccacf77cec3bae7360d28b9da09ce26b4a078cf08f610eefvtpmh_a6266c66-d403-46e9-ad99-54beedd6adf4/util/0.log" Oct 08 21:44:22 crc kubenswrapper[4669]: I1008 21:44:22.682944 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_7a759e5b31ccacf77cec3bae7360d28b9da09ce26b4a078cf08f610eefvtpmh_a6266c66-d403-46e9-ad99-54beedd6adf4/pull/0.log" Oct 08 21:44:22 crc kubenswrapper[4669]: I1008 21:44:22.696440 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7a759e5b31ccacf77cec3bae7360d28b9da09ce26b4a078cf08f610eefvtpmh_a6266c66-d403-46e9-ad99-54beedd6adf4/extract/0.log" Oct 08 21:44:22 crc kubenswrapper[4669]: I1008 21:44:22.880327 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-64f84fcdbb-rtmfb_5317b6da-a1f3-4a2c-85e4-3ef8fcb018d5/kube-rbac-proxy/0.log" Oct 08 21:44:22 crc kubenswrapper[4669]: I1008 21:44:22.889149 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-59cdc64769-6sdlr_b7c50f2e-fa5d-4b03-be3a-cfd5ecc63b45/kube-rbac-proxy/0.log" Oct 08 21:44:22 crc kubenswrapper[4669]: I1008 21:44:22.930787 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-64f84fcdbb-rtmfb_5317b6da-a1f3-4a2c-85e4-3ef8fcb018d5/manager/0.log" Oct 08 21:44:23 crc kubenswrapper[4669]: I1008 21:44:23.093072 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-687df44cdb-m9v76_5f0a0ada-acd3-452b-bd50-b5d634b906c4/kube-rbac-proxy/0.log" Oct 08 21:44:23 crc kubenswrapper[4669]: I1008 21:44:23.097987 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-59cdc64769-6sdlr_b7c50f2e-fa5d-4b03-be3a-cfd5ecc63b45/manager/0.log" Oct 08 21:44:23 crc kubenswrapper[4669]: I1008 21:44:23.128559 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-687df44cdb-m9v76_5f0a0ada-acd3-452b-bd50-b5d634b906c4/manager/0.log" Oct 08 21:44:23 crc kubenswrapper[4669]: 
I1008 21:44:23.247959 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7bb46cd7d-pjtc2_dc0d4c88-6c32-4498-8025-de3c8b59eaea/kube-rbac-proxy/0.log" Oct 08 21:44:23 crc kubenswrapper[4669]: I1008 21:44:23.330248 4669 scope.go:117] "RemoveContainer" containerID="d99f2404f55700ee94ff2a27e207059d96fd41084a3b30b6aa74eb587d837286" Oct 08 21:44:23 crc kubenswrapper[4669]: E1008 21:44:23.330592 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:44:23 crc kubenswrapper[4669]: I1008 21:44:23.330698 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7bb46cd7d-pjtc2_dc0d4c88-6c32-4498-8025-de3c8b59eaea/manager/0.log" Oct 08 21:44:23 crc kubenswrapper[4669]: I1008 21:44:23.458279 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-6d9967f8dd-w2mfj_7185d03c-648b-488d-b1d0-842f8b72e0ff/manager/0.log" Oct 08 21:44:23 crc kubenswrapper[4669]: I1008 21:44:23.461985 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-6d9967f8dd-w2mfj_7185d03c-648b-488d-b1d0-842f8b72e0ff/kube-rbac-proxy/0.log" Oct 08 21:44:23 crc kubenswrapper[4669]: I1008 21:44:23.544115 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d74794d9b-mpbbj_0a7f72b2-7ef9-492e-ada0-a7e64d8fcb7a/kube-rbac-proxy/0.log" Oct 08 21:44:23 crc kubenswrapper[4669]: I1008 21:44:23.646311 4669 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d74794d9b-mpbbj_0a7f72b2-7ef9-492e-ada0-a7e64d8fcb7a/manager/0.log" Oct 08 21:44:23 crc kubenswrapper[4669]: I1008 21:44:23.737280 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-585fc5b659-c2w55_01d6ae8d-9f65-4a30-80fb-135e4eba5a10/kube-rbac-proxy/0.log" Oct 08 21:44:23 crc kubenswrapper[4669]: I1008 21:44:23.864279 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-585fc5b659-c2w55_01d6ae8d-9f65-4a30-80fb-135e4eba5a10/manager/0.log" Oct 08 21:44:23 crc kubenswrapper[4669]: I1008 21:44:23.881027 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-74cb5cbc49-hwmh2_f92c3530-f73f-45fe-84f5-bea451e1aaba/kube-rbac-proxy/0.log" Oct 08 21:44:23 crc kubenswrapper[4669]: I1008 21:44:23.964326 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-74cb5cbc49-hwmh2_f92c3530-f73f-45fe-84f5-bea451e1aaba/manager/0.log" Oct 08 21:44:24 crc kubenswrapper[4669]: I1008 21:44:24.058127 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-ddb98f99b-6mn5l_b43bc083-637e-4c93-a024-a47cecaade29/kube-rbac-proxy/0.log" Oct 08 21:44:24 crc kubenswrapper[4669]: I1008 21:44:24.158164 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-ddb98f99b-6mn5l_b43bc083-637e-4c93-a024-a47cecaade29/manager/0.log" Oct 08 21:44:24 crc kubenswrapper[4669]: I1008 21:44:24.237210 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-59578bc799-frtrx_4ee291fa-b998-4bfc-a689-fb66e345bcaa/kube-rbac-proxy/0.log" Oct 08 21:44:24 crc kubenswrapper[4669]: I1008 
21:44:24.262668 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-59578bc799-frtrx_4ee291fa-b998-4bfc-a689-fb66e345bcaa/manager/0.log" Oct 08 21:44:24 crc kubenswrapper[4669]: I1008 21:44:24.353985 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5777b4f897-5rs85_1af3f5fc-da55-4c9e-ac87-30363f6cf741/kube-rbac-proxy/0.log" Oct 08 21:44:24 crc kubenswrapper[4669]: I1008 21:44:24.418706 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5777b4f897-5rs85_1af3f5fc-da55-4c9e-ac87-30363f6cf741/manager/0.log" Oct 08 21:44:24 crc kubenswrapper[4669]: I1008 21:44:24.544719 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-797d478b46-vl7tc_a2b9cab5-b86d-4d9b-aa7c-d035d04d40a4/kube-rbac-proxy/0.log" Oct 08 21:44:24 crc kubenswrapper[4669]: I1008 21:44:24.607688 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-797d478b46-vl7tc_a2b9cab5-b86d-4d9b-aa7c-d035d04d40a4/manager/0.log" Oct 08 21:44:24 crc kubenswrapper[4669]: I1008 21:44:24.654100 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-57bb74c7bf-pb4g2_5fceefe8-bf0f-4f2d-9e14-2208f38b73d7/kube-rbac-proxy/0.log" Oct 08 21:44:24 crc kubenswrapper[4669]: I1008 21:44:24.830878 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-57bb74c7bf-pb4g2_5fceefe8-bf0f-4f2d-9e14-2208f38b73d7/manager/0.log" Oct 08 21:44:24 crc kubenswrapper[4669]: I1008 21:44:24.844818 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6d7c7ddf95-9vh5w_d4ee300d-b78b-4052-83a9-4ab8ca569886/manager/0.log" Oct 08 
21:44:24 crc kubenswrapper[4669]: I1008 21:44:24.903990 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6d7c7ddf95-9vh5w_d4ee300d-b78b-4052-83a9-4ab8ca569886/kube-rbac-proxy/0.log" Oct 08 21:44:25 crc kubenswrapper[4669]: I1008 21:44:25.001168 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6cc7fb757ddwpl8_d44ffb7b-c761-48b2-be7c-a5af13e2a59b/kube-rbac-proxy/0.log" Oct 08 21:44:25 crc kubenswrapper[4669]: I1008 21:44:25.042389 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6cc7fb757ddwpl8_d44ffb7b-c761-48b2-be7c-a5af13e2a59b/manager/0.log" Oct 08 21:44:25 crc kubenswrapper[4669]: I1008 21:44:25.203970 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6dd9d44468-66k2c_eb2409f1-4af4-49a3-a453-29e8f447360e/kube-rbac-proxy/0.log" Oct 08 21:44:25 crc kubenswrapper[4669]: I1008 21:44:25.322423 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-5bb56b84bf-j6scr_6f26e0f8-20e5-4f7f-8e25-c17579e57b28/kube-rbac-proxy/0.log" Oct 08 21:44:25 crc kubenswrapper[4669]: I1008 21:44:25.559022 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-hd2hf_40ec54d4-3186-4a11-a533-b7edc48914b3/registry-server/0.log" Oct 08 21:44:25 crc kubenswrapper[4669]: I1008 21:44:25.648576 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-5bb56b84bf-j6scr_6f26e0f8-20e5-4f7f-8e25-c17579e57b28/operator/0.log" Oct 08 21:44:25 crc kubenswrapper[4669]: I1008 21:44:25.778223 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-6f96f8c84-gc5qn_92d7bcb6-b09e-4605-87a9-9cdaedb40c74/kube-rbac-proxy/0.log" Oct 08 21:44:25 crc kubenswrapper[4669]: I1008 21:44:25.974723 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-6f96f8c84-gc5qn_92d7bcb6-b09e-4605-87a9-9cdaedb40c74/manager/0.log" Oct 08 21:44:26 crc kubenswrapper[4669]: I1008 21:44:26.051793 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-664664cb68-bjfjf_d4193662-5c4e-40fa-ac9e-495509e75c4a/kube-rbac-proxy/0.log" Oct 08 21:44:26 crc kubenswrapper[4669]: I1008 21:44:26.165210 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-664664cb68-bjfjf_d4193662-5c4e-40fa-ac9e-495509e75c4a/manager/0.log" Oct 08 21:44:26 crc kubenswrapper[4669]: I1008 21:44:26.362959 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-h7k72_3dce5b40-fa36-4a03-bea2-a19e6267ecec/operator/0.log" Oct 08 21:44:26 crc kubenswrapper[4669]: I1008 21:44:26.400561 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f4d5dfdc6-m6ldz_46a0519d-251f-48e0-9d65-4ca08c627195/kube-rbac-proxy/0.log" Oct 08 21:44:26 crc kubenswrapper[4669]: I1008 21:44:26.450763 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6dd9d44468-66k2c_eb2409f1-4af4-49a3-a453-29e8f447360e/manager/0.log" Oct 08 21:44:26 crc kubenswrapper[4669]: I1008 21:44:26.520501 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f4d5dfdc6-m6ldz_46a0519d-251f-48e0-9d65-4ca08c627195/manager/0.log" Oct 08 21:44:26 crc kubenswrapper[4669]: I1008 21:44:26.615437 4669 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-775776c574-h2f5k_58991c0e-b29b-4851-b5b0-8327380e1320/kube-rbac-proxy/0.log" Oct 08 21:44:26 crc kubenswrapper[4669]: I1008 21:44:26.646949 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-775776c574-h2f5k_58991c0e-b29b-4851-b5b0-8327380e1320/manager/0.log" Oct 08 21:44:26 crc kubenswrapper[4669]: I1008 21:44:26.700109 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-74665f6cdc-tw98s_19754c8c-8fbf-4b8e-b673-462e22ec11d1/manager/0.log" Oct 08 21:44:26 crc kubenswrapper[4669]: I1008 21:44:26.725340 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-74665f6cdc-tw98s_19754c8c-8fbf-4b8e-b673-462e22ec11d1/kube-rbac-proxy/0.log" Oct 08 21:44:26 crc kubenswrapper[4669]: I1008 21:44:26.839783 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5dd4499c96-qfbh5_76773a34-db21-4354-a16e-e70ea0d6d63d/kube-rbac-proxy/0.log" Oct 08 21:44:26 crc kubenswrapper[4669]: I1008 21:44:26.888786 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5dd4499c96-qfbh5_76773a34-db21-4354-a16e-e70ea0d6d63d/manager/0.log" Oct 08 21:44:34 crc kubenswrapper[4669]: I1008 21:44:34.331010 4669 scope.go:117] "RemoveContainer" containerID="d99f2404f55700ee94ff2a27e207059d96fd41084a3b30b6aa74eb587d837286" Oct 08 21:44:34 crc kubenswrapper[4669]: E1008 21:44:34.332783 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:44:42 crc kubenswrapper[4669]: I1008 21:44:42.077192 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-9vmlr_d0da4600-abf4-4f3e-8299-a269b29ca44a/control-plane-machine-set-operator/0.log" Oct 08 21:44:42 crc kubenswrapper[4669]: I1008 21:44:42.246398 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-rtm6r_e45ff91e-f07d-489b-b0a0-8e815bdf41c3/machine-api-operator/0.log" Oct 08 21:44:42 crc kubenswrapper[4669]: I1008 21:44:42.246810 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-rtm6r_e45ff91e-f07d-489b-b0a0-8e815bdf41c3/kube-rbac-proxy/0.log" Oct 08 21:44:45 crc kubenswrapper[4669]: I1008 21:44:45.331222 4669 scope.go:117] "RemoveContainer" containerID="d99f2404f55700ee94ff2a27e207059d96fd41084a3b30b6aa74eb587d837286" Oct 08 21:44:45 crc kubenswrapper[4669]: E1008 21:44:45.332021 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:44:53 crc kubenswrapper[4669]: I1008 21:44:53.659769 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-btc2c_7a8b2c5d-38a2-4866-a1e0-8df9b4659c15/cert-manager-controller/0.log" Oct 08 21:44:53 crc kubenswrapper[4669]: I1008 21:44:53.767268 4669 log.go:25] "Finished 
parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-7g9zf_c726284c-5ca2-4d63-b96e-56c1aa537986/cert-manager-cainjector/0.log" Oct 08 21:44:53 crc kubenswrapper[4669]: I1008 21:44:53.813007 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-9flwc_314fd865-0086-4bd8-8a90-0c992557a6af/cert-manager-webhook/0.log" Oct 08 21:44:58 crc kubenswrapper[4669]: I1008 21:44:58.331354 4669 scope.go:117] "RemoveContainer" containerID="d99f2404f55700ee94ff2a27e207059d96fd41084a3b30b6aa74eb587d837286" Oct 08 21:44:58 crc kubenswrapper[4669]: E1008 21:44:58.332062 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:45:00 crc kubenswrapper[4669]: I1008 21:45:00.179295 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332665-kxqss"] Oct 08 21:45:00 crc kubenswrapper[4669]: E1008 21:45:00.180152 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3eecb272-61d7-4c17-940b-aa95f6c37dff" containerName="container-00" Oct 08 21:45:00 crc kubenswrapper[4669]: I1008 21:45:00.180167 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="3eecb272-61d7-4c17-940b-aa95f6c37dff" containerName="container-00" Oct 08 21:45:00 crc kubenswrapper[4669]: I1008 21:45:00.180327 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="3eecb272-61d7-4c17-940b-aa95f6c37dff" containerName="container-00" Oct 08 21:45:00 crc kubenswrapper[4669]: I1008 21:45:00.180962 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332665-kxqss" Oct 08 21:45:00 crc kubenswrapper[4669]: I1008 21:45:00.183414 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 08 21:45:00 crc kubenswrapper[4669]: I1008 21:45:00.184155 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 08 21:45:00 crc kubenswrapper[4669]: I1008 21:45:00.189482 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332665-kxqss"] Oct 08 21:45:00 crc kubenswrapper[4669]: I1008 21:45:00.358594 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ac555100-e1f1-435a-964b-7b072aa32045-config-volume\") pod \"collect-profiles-29332665-kxqss\" (UID: \"ac555100-e1f1-435a-964b-7b072aa32045\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332665-kxqss" Oct 08 21:45:00 crc kubenswrapper[4669]: I1008 21:45:00.358827 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ac555100-e1f1-435a-964b-7b072aa32045-secret-volume\") pod \"collect-profiles-29332665-kxqss\" (UID: \"ac555100-e1f1-435a-964b-7b072aa32045\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332665-kxqss" Oct 08 21:45:00 crc kubenswrapper[4669]: I1008 21:45:00.358858 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59lp6\" (UniqueName: \"kubernetes.io/projected/ac555100-e1f1-435a-964b-7b072aa32045-kube-api-access-59lp6\") pod \"collect-profiles-29332665-kxqss\" (UID: \"ac555100-e1f1-435a-964b-7b072aa32045\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29332665-kxqss" Oct 08 21:45:00 crc kubenswrapper[4669]: I1008 21:45:00.460680 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ac555100-e1f1-435a-964b-7b072aa32045-config-volume\") pod \"collect-profiles-29332665-kxqss\" (UID: \"ac555100-e1f1-435a-964b-7b072aa32045\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332665-kxqss" Oct 08 21:45:00 crc kubenswrapper[4669]: I1008 21:45:00.460734 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ac555100-e1f1-435a-964b-7b072aa32045-secret-volume\") pod \"collect-profiles-29332665-kxqss\" (UID: \"ac555100-e1f1-435a-964b-7b072aa32045\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332665-kxqss" Oct 08 21:45:00 crc kubenswrapper[4669]: I1008 21:45:00.460764 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59lp6\" (UniqueName: \"kubernetes.io/projected/ac555100-e1f1-435a-964b-7b072aa32045-kube-api-access-59lp6\") pod \"collect-profiles-29332665-kxqss\" (UID: \"ac555100-e1f1-435a-964b-7b072aa32045\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332665-kxqss" Oct 08 21:45:00 crc kubenswrapper[4669]: I1008 21:45:00.461682 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ac555100-e1f1-435a-964b-7b072aa32045-config-volume\") pod \"collect-profiles-29332665-kxqss\" (UID: \"ac555100-e1f1-435a-964b-7b072aa32045\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332665-kxqss" Oct 08 21:45:00 crc kubenswrapper[4669]: I1008 21:45:00.471406 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/ac555100-e1f1-435a-964b-7b072aa32045-secret-volume\") pod \"collect-profiles-29332665-kxqss\" (UID: \"ac555100-e1f1-435a-964b-7b072aa32045\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332665-kxqss" Oct 08 21:45:00 crc kubenswrapper[4669]: I1008 21:45:00.481753 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59lp6\" (UniqueName: \"kubernetes.io/projected/ac555100-e1f1-435a-964b-7b072aa32045-kube-api-access-59lp6\") pod \"collect-profiles-29332665-kxqss\" (UID: \"ac555100-e1f1-435a-964b-7b072aa32045\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332665-kxqss" Oct 08 21:45:00 crc kubenswrapper[4669]: I1008 21:45:00.514038 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332665-kxqss" Oct 08 21:45:00 crc kubenswrapper[4669]: I1008 21:45:00.971373 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332665-kxqss"] Oct 08 21:45:01 crc kubenswrapper[4669]: I1008 21:45:01.920539 4669 generic.go:334] "Generic (PLEG): container finished" podID="ac555100-e1f1-435a-964b-7b072aa32045" containerID="a216dccbba7ec52298faa9a49fd4195addd4d239e396ee7cb2bf57d7e21addf4" exitCode=0 Oct 08 21:45:01 crc kubenswrapper[4669]: I1008 21:45:01.920650 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332665-kxqss" event={"ID":"ac555100-e1f1-435a-964b-7b072aa32045","Type":"ContainerDied","Data":"a216dccbba7ec52298faa9a49fd4195addd4d239e396ee7cb2bf57d7e21addf4"} Oct 08 21:45:01 crc kubenswrapper[4669]: I1008 21:45:01.921027 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332665-kxqss" 
event={"ID":"ac555100-e1f1-435a-964b-7b072aa32045","Type":"ContainerStarted","Data":"f804b6ae73fb48680509f8a977d3a26993f74eeb8f81a6170fdccf9ad903b649"} Oct 08 21:45:03 crc kubenswrapper[4669]: I1008 21:45:03.351364 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332665-kxqss" Oct 08 21:45:03 crc kubenswrapper[4669]: I1008 21:45:03.527027 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59lp6\" (UniqueName: \"kubernetes.io/projected/ac555100-e1f1-435a-964b-7b072aa32045-kube-api-access-59lp6\") pod \"ac555100-e1f1-435a-964b-7b072aa32045\" (UID: \"ac555100-e1f1-435a-964b-7b072aa32045\") " Oct 08 21:45:03 crc kubenswrapper[4669]: I1008 21:45:03.527240 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ac555100-e1f1-435a-964b-7b072aa32045-secret-volume\") pod \"ac555100-e1f1-435a-964b-7b072aa32045\" (UID: \"ac555100-e1f1-435a-964b-7b072aa32045\") " Oct 08 21:45:03 crc kubenswrapper[4669]: I1008 21:45:03.527347 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ac555100-e1f1-435a-964b-7b072aa32045-config-volume\") pod \"ac555100-e1f1-435a-964b-7b072aa32045\" (UID: \"ac555100-e1f1-435a-964b-7b072aa32045\") " Oct 08 21:45:03 crc kubenswrapper[4669]: I1008 21:45:03.528821 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac555100-e1f1-435a-964b-7b072aa32045-config-volume" (OuterVolumeSpecName: "config-volume") pod "ac555100-e1f1-435a-964b-7b072aa32045" (UID: "ac555100-e1f1-435a-964b-7b072aa32045"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:45:03 crc kubenswrapper[4669]: I1008 21:45:03.536220 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac555100-e1f1-435a-964b-7b072aa32045-kube-api-access-59lp6" (OuterVolumeSpecName: "kube-api-access-59lp6") pod "ac555100-e1f1-435a-964b-7b072aa32045" (UID: "ac555100-e1f1-435a-964b-7b072aa32045"). InnerVolumeSpecName "kube-api-access-59lp6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:45:03 crc kubenswrapper[4669]: I1008 21:45:03.536816 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac555100-e1f1-435a-964b-7b072aa32045-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ac555100-e1f1-435a-964b-7b072aa32045" (UID: "ac555100-e1f1-435a-964b-7b072aa32045"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:45:03 crc kubenswrapper[4669]: I1008 21:45:03.630411 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59lp6\" (UniqueName: \"kubernetes.io/projected/ac555100-e1f1-435a-964b-7b072aa32045-kube-api-access-59lp6\") on node \"crc\" DevicePath \"\"" Oct 08 21:45:03 crc kubenswrapper[4669]: I1008 21:45:03.630780 4669 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ac555100-e1f1-435a-964b-7b072aa32045-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 08 21:45:03 crc kubenswrapper[4669]: I1008 21:45:03.630789 4669 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ac555100-e1f1-435a-964b-7b072aa32045-config-volume\") on node \"crc\" DevicePath \"\"" Oct 08 21:45:03 crc kubenswrapper[4669]: I1008 21:45:03.954687 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332665-kxqss" 
event={"ID":"ac555100-e1f1-435a-964b-7b072aa32045","Type":"ContainerDied","Data":"f804b6ae73fb48680509f8a977d3a26993f74eeb8f81a6170fdccf9ad903b649"} Oct 08 21:45:03 crc kubenswrapper[4669]: I1008 21:45:03.954745 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f804b6ae73fb48680509f8a977d3a26993f74eeb8f81a6170fdccf9ad903b649" Oct 08 21:45:03 crc kubenswrapper[4669]: I1008 21:45:03.954853 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332665-kxqss" Oct 08 21:45:04 crc kubenswrapper[4669]: I1008 21:45:04.431155 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332620-8mh8h"] Oct 08 21:45:04 crc kubenswrapper[4669]: I1008 21:45:04.441793 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332620-8mh8h"] Oct 08 21:45:05 crc kubenswrapper[4669]: I1008 21:45:05.341918 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7067c4a6-69cd-456c-aa78-81f45c8cdf7e" path="/var/lib/kubelet/pods/7067c4a6-69cd-456c-aa78-81f45c8cdf7e/volumes" Oct 08 21:45:05 crc kubenswrapper[4669]: I1008 21:45:05.386258 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-4ddnj_e4bf4b08-2e16-4d0a-8cc0-47cff74f5e53/nmstate-console-plugin/0.log" Oct 08 21:45:05 crc kubenswrapper[4669]: I1008 21:45:05.563514 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-dr78d_393d89be-66bc-40a8-99e2-a145ec3eebe8/nmstate-handler/0.log" Oct 08 21:45:05 crc kubenswrapper[4669]: I1008 21:45:05.669265 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-nqf8s_e975e2d7-d104-4d94-a624-e2bd680e2e23/kube-rbac-proxy/0.log" Oct 08 21:45:05 crc kubenswrapper[4669]: I1008 
21:45:05.695353 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-nqf8s_e975e2d7-d104-4d94-a624-e2bd680e2e23/nmstate-metrics/0.log" Oct 08 21:45:05 crc kubenswrapper[4669]: I1008 21:45:05.797522 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-bmpqk_e3e2cb2d-8c68-46d0-a639-fd839c30a680/nmstate-operator/0.log" Oct 08 21:45:05 crc kubenswrapper[4669]: I1008 21:45:05.914477 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-57cwq_d67c051f-00e6-45c7-aa07-401695d5798f/nmstate-webhook/0.log" Oct 08 21:45:13 crc kubenswrapper[4669]: I1008 21:45:13.330677 4669 scope.go:117] "RemoveContainer" containerID="d99f2404f55700ee94ff2a27e207059d96fd41084a3b30b6aa74eb587d837286" Oct 08 21:45:13 crc kubenswrapper[4669]: E1008 21:45:13.331373 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:45:18 crc kubenswrapper[4669]: I1008 21:45:18.834064 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-jlbrt_80c87dc9-d3cb-455c-b0bc-c9ae5a0cead6/kube-rbac-proxy/0.log" Oct 08 21:45:18 crc kubenswrapper[4669]: I1008 21:45:18.898924 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-jlbrt_80c87dc9-d3cb-455c-b0bc-c9ae5a0cead6/controller/0.log" Oct 08 21:45:19 crc kubenswrapper[4669]: I1008 21:45:19.038384 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-jq9jv_1de0da7b-1591-4af2-bac9-241428020fe9/cp-frr-files/0.log" Oct 08 21:45:19 crc kubenswrapper[4669]: I1008 21:45:19.175262 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jq9jv_1de0da7b-1591-4af2-bac9-241428020fe9/cp-reloader/0.log" Oct 08 21:45:19 crc kubenswrapper[4669]: I1008 21:45:19.189038 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jq9jv_1de0da7b-1591-4af2-bac9-241428020fe9/cp-frr-files/0.log" Oct 08 21:45:19 crc kubenswrapper[4669]: I1008 21:45:19.228431 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jq9jv_1de0da7b-1591-4af2-bac9-241428020fe9/cp-metrics/0.log" Oct 08 21:45:19 crc kubenswrapper[4669]: I1008 21:45:19.238182 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jq9jv_1de0da7b-1591-4af2-bac9-241428020fe9/cp-reloader/0.log" Oct 08 21:45:19 crc kubenswrapper[4669]: I1008 21:45:19.386132 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jq9jv_1de0da7b-1591-4af2-bac9-241428020fe9/cp-frr-files/0.log" Oct 08 21:45:19 crc kubenswrapper[4669]: I1008 21:45:19.415619 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jq9jv_1de0da7b-1591-4af2-bac9-241428020fe9/cp-metrics/0.log" Oct 08 21:45:19 crc kubenswrapper[4669]: I1008 21:45:19.416194 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jq9jv_1de0da7b-1591-4af2-bac9-241428020fe9/cp-reloader/0.log" Oct 08 21:45:19 crc kubenswrapper[4669]: I1008 21:45:19.425214 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jq9jv_1de0da7b-1591-4af2-bac9-241428020fe9/cp-metrics/0.log" Oct 08 21:45:19 crc kubenswrapper[4669]: I1008 21:45:19.600360 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-jq9jv_1de0da7b-1591-4af2-bac9-241428020fe9/cp-metrics/0.log" Oct 08 21:45:19 crc kubenswrapper[4669]: I1008 21:45:19.605556 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jq9jv_1de0da7b-1591-4af2-bac9-241428020fe9/cp-frr-files/0.log" Oct 08 21:45:19 crc kubenswrapper[4669]: I1008 21:45:19.615890 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jq9jv_1de0da7b-1591-4af2-bac9-241428020fe9/controller/0.log" Oct 08 21:45:19 crc kubenswrapper[4669]: I1008 21:45:19.644608 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jq9jv_1de0da7b-1591-4af2-bac9-241428020fe9/cp-reloader/0.log" Oct 08 21:45:19 crc kubenswrapper[4669]: I1008 21:45:19.776320 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jq9jv_1de0da7b-1591-4af2-bac9-241428020fe9/kube-rbac-proxy/0.log" Oct 08 21:45:19 crc kubenswrapper[4669]: I1008 21:45:19.779985 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jq9jv_1de0da7b-1591-4af2-bac9-241428020fe9/frr-metrics/0.log" Oct 08 21:45:19 crc kubenswrapper[4669]: I1008 21:45:19.858876 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jq9jv_1de0da7b-1591-4af2-bac9-241428020fe9/kube-rbac-proxy-frr/0.log" Oct 08 21:45:19 crc kubenswrapper[4669]: I1008 21:45:19.944819 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jq9jv_1de0da7b-1591-4af2-bac9-241428020fe9/reloader/0.log" Oct 08 21:45:20 crc kubenswrapper[4669]: I1008 21:45:20.023009 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-gwhr7_c74dd47d-12ee-4626-a02f-e8dc07f26791/frr-k8s-webhook-server/0.log" Oct 08 21:45:20 crc kubenswrapper[4669]: I1008 21:45:20.277334 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6f6598d79f-8tfml_0304e30c-72d1-4544-9f05-fb7acc1c3c61/manager/0.log" Oct 08 21:45:20 crc kubenswrapper[4669]: I1008 21:45:20.401155 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5b6f8f8d99-l255w_dbe12456-371e-4274-911b-264b27260e4e/webhook-server/0.log" Oct 08 21:45:20 crc kubenswrapper[4669]: I1008 21:45:20.456095 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-x4jbl_91c83ac7-3c70-4fca-85b1-9ee4d9dd4568/kube-rbac-proxy/0.log" Oct 08 21:45:21 crc kubenswrapper[4669]: I1008 21:45:21.069999 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-x4jbl_91c83ac7-3c70-4fca-85b1-9ee4d9dd4568/speaker/0.log" Oct 08 21:45:21 crc kubenswrapper[4669]: I1008 21:45:21.200888 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jq9jv_1de0da7b-1591-4af2-bac9-241428020fe9/frr/0.log" Oct 08 21:45:25 crc kubenswrapper[4669]: I1008 21:45:25.332204 4669 scope.go:117] "RemoveContainer" containerID="d99f2404f55700ee94ff2a27e207059d96fd41084a3b30b6aa74eb587d837286" Oct 08 21:45:25 crc kubenswrapper[4669]: E1008 21:45:25.352341 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:45:33 crc kubenswrapper[4669]: I1008 21:45:33.583380 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2n94nt_a10fc4a6-e10c-481f-8547-cf5d9669d34d/util/0.log" Oct 08 21:45:33 crc kubenswrapper[4669]: I1008 
21:45:33.763434 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2n94nt_a10fc4a6-e10c-481f-8547-cf5d9669d34d/pull/0.log" Oct 08 21:45:33 crc kubenswrapper[4669]: I1008 21:45:33.792849 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2n94nt_a10fc4a6-e10c-481f-8547-cf5d9669d34d/util/0.log" Oct 08 21:45:33 crc kubenswrapper[4669]: I1008 21:45:33.813008 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2n94nt_a10fc4a6-e10c-481f-8547-cf5d9669d34d/pull/0.log" Oct 08 21:45:33 crc kubenswrapper[4669]: I1008 21:45:33.987278 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2n94nt_a10fc4a6-e10c-481f-8547-cf5d9669d34d/util/0.log" Oct 08 21:45:33 crc kubenswrapper[4669]: I1008 21:45:33.994099 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2n94nt_a10fc4a6-e10c-481f-8547-cf5d9669d34d/extract/0.log" Oct 08 21:45:34 crc kubenswrapper[4669]: I1008 21:45:34.005403 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2n94nt_a10fc4a6-e10c-481f-8547-cf5d9669d34d/pull/0.log" Oct 08 21:45:34 crc kubenswrapper[4669]: I1008 21:45:34.171222 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4w5jl_ae6c5190-e7d0-4384-910d-620da4746aa0/extract-utilities/0.log" Oct 08 21:45:34 crc kubenswrapper[4669]: I1008 21:45:34.378907 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-4w5jl_ae6c5190-e7d0-4384-910d-620da4746aa0/extract-utilities/0.log" Oct 08 21:45:34 crc kubenswrapper[4669]: I1008 21:45:34.418930 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4w5jl_ae6c5190-e7d0-4384-910d-620da4746aa0/extract-content/0.log" Oct 08 21:45:34 crc kubenswrapper[4669]: I1008 21:45:34.435212 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4w5jl_ae6c5190-e7d0-4384-910d-620da4746aa0/extract-content/0.log" Oct 08 21:45:34 crc kubenswrapper[4669]: I1008 21:45:34.619721 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4w5jl_ae6c5190-e7d0-4384-910d-620da4746aa0/extract-utilities/0.log" Oct 08 21:45:34 crc kubenswrapper[4669]: I1008 21:45:34.636099 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4w5jl_ae6c5190-e7d0-4384-910d-620da4746aa0/extract-content/0.log" Oct 08 21:45:34 crc kubenswrapper[4669]: I1008 21:45:34.794706 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jxfn2_250ce6ef-e5b8-4912-8037-85a0c520ff7a/extract-utilities/0.log" Oct 08 21:45:34 crc kubenswrapper[4669]: I1008 21:45:34.823588 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4w5jl_ae6c5190-e7d0-4384-910d-620da4746aa0/registry-server/0.log" Oct 08 21:45:34 crc kubenswrapper[4669]: I1008 21:45:34.995494 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jxfn2_250ce6ef-e5b8-4912-8037-85a0c520ff7a/extract-content/0.log" Oct 08 21:45:35 crc kubenswrapper[4669]: I1008 21:45:35.011040 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-jxfn2_250ce6ef-e5b8-4912-8037-85a0c520ff7a/extract-content/0.log" Oct 08 21:45:35 crc kubenswrapper[4669]: I1008 21:45:35.021777 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jxfn2_250ce6ef-e5b8-4912-8037-85a0c520ff7a/extract-utilities/0.log" Oct 08 21:45:35 crc kubenswrapper[4669]: I1008 21:45:35.177233 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jxfn2_250ce6ef-e5b8-4912-8037-85a0c520ff7a/extract-utilities/0.log" Oct 08 21:45:35 crc kubenswrapper[4669]: I1008 21:45:35.220196 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jxfn2_250ce6ef-e5b8-4912-8037-85a0c520ff7a/extract-content/0.log" Oct 08 21:45:35 crc kubenswrapper[4669]: I1008 21:45:35.425386 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2hbpl_29a7cdfe-bc26-4687-9359-e829d21b4137/util/0.log" Oct 08 21:45:35 crc kubenswrapper[4669]: I1008 21:45:35.689279 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2hbpl_29a7cdfe-bc26-4687-9359-e829d21b4137/util/0.log" Oct 08 21:45:35 crc kubenswrapper[4669]: I1008 21:45:35.692146 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2hbpl_29a7cdfe-bc26-4687-9359-e829d21b4137/pull/0.log" Oct 08 21:45:35 crc kubenswrapper[4669]: I1008 21:45:35.705091 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2hbpl_29a7cdfe-bc26-4687-9359-e829d21b4137/pull/0.log" Oct 08 21:45:35 crc kubenswrapper[4669]: I1008 21:45:35.920018 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-jxfn2_250ce6ef-e5b8-4912-8037-85a0c520ff7a/registry-server/0.log" Oct 08 21:45:35 crc kubenswrapper[4669]: I1008 21:45:35.963628 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2hbpl_29a7cdfe-bc26-4687-9359-e829d21b4137/pull/0.log" Oct 08 21:45:35 crc kubenswrapper[4669]: I1008 21:45:35.984832 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2hbpl_29a7cdfe-bc26-4687-9359-e829d21b4137/extract/0.log" Oct 08 21:45:35 crc kubenswrapper[4669]: I1008 21:45:35.987675 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2hbpl_29a7cdfe-bc26-4687-9359-e829d21b4137/util/0.log" Oct 08 21:45:36 crc kubenswrapper[4669]: I1008 21:45:36.305376 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-kjxjv_c10f6dc4-f608-4960-91ab-cfdebb79b8ff/marketplace-operator/0.log" Oct 08 21:45:36 crc kubenswrapper[4669]: I1008 21:45:36.311903 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-558fm_065b510c-2a5d-4110-80b9-69865b532686/extract-utilities/0.log" Oct 08 21:45:36 crc kubenswrapper[4669]: I1008 21:45:36.330421 4669 scope.go:117] "RemoveContainer" containerID="d99f2404f55700ee94ff2a27e207059d96fd41084a3b30b6aa74eb587d837286" Oct 08 21:45:36 crc kubenswrapper[4669]: E1008 21:45:36.330661 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:45:36 crc kubenswrapper[4669]: I1008 21:45:36.516996 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-558fm_065b510c-2a5d-4110-80b9-69865b532686/extract-utilities/0.log" Oct 08 21:45:36 crc kubenswrapper[4669]: I1008 21:45:36.527591 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-558fm_065b510c-2a5d-4110-80b9-69865b532686/extract-content/0.log" Oct 08 21:45:36 crc kubenswrapper[4669]: I1008 21:45:36.537985 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-558fm_065b510c-2a5d-4110-80b9-69865b532686/extract-content/0.log" Oct 08 21:45:36 crc kubenswrapper[4669]: I1008 21:45:36.745771 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-558fm_065b510c-2a5d-4110-80b9-69865b532686/extract-content/0.log" Oct 08 21:45:36 crc kubenswrapper[4669]: I1008 21:45:36.746280 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-558fm_065b510c-2a5d-4110-80b9-69865b532686/extract-utilities/0.log" Oct 08 21:45:36 crc kubenswrapper[4669]: I1008 21:45:36.843489 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-558fm_065b510c-2a5d-4110-80b9-69865b532686/registry-server/0.log" Oct 08 21:45:36 crc kubenswrapper[4669]: I1008 21:45:36.920069 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4rszc_bca52d11-f041-40bc-b352-1e820a410996/extract-utilities/0.log" Oct 08 21:45:37 crc kubenswrapper[4669]: I1008 21:45:37.078938 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4rszc_bca52d11-f041-40bc-b352-1e820a410996/extract-utilities/0.log" Oct 08 
21:45:37 crc kubenswrapper[4669]: I1008 21:45:37.098477 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4rszc_bca52d11-f041-40bc-b352-1e820a410996/extract-content/0.log" Oct 08 21:45:37 crc kubenswrapper[4669]: I1008 21:45:37.107052 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4rszc_bca52d11-f041-40bc-b352-1e820a410996/extract-content/0.log" Oct 08 21:45:37 crc kubenswrapper[4669]: I1008 21:45:37.260638 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4rszc_bca52d11-f041-40bc-b352-1e820a410996/extract-utilities/0.log" Oct 08 21:45:37 crc kubenswrapper[4669]: I1008 21:45:37.282855 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4rszc_bca52d11-f041-40bc-b352-1e820a410996/extract-content/0.log" Oct 08 21:45:37 crc kubenswrapper[4669]: I1008 21:45:37.760024 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4rszc_bca52d11-f041-40bc-b352-1e820a410996/registry-server/0.log" Oct 08 21:45:50 crc kubenswrapper[4669]: I1008 21:45:50.331168 4669 scope.go:117] "RemoveContainer" containerID="d99f2404f55700ee94ff2a27e207059d96fd41084a3b30b6aa74eb587d837286" Oct 08 21:45:50 crc kubenswrapper[4669]: E1008 21:45:50.332269 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:46:01 crc kubenswrapper[4669]: I1008 21:46:01.336837 4669 scope.go:117] "RemoveContainer" 
containerID="d99f2404f55700ee94ff2a27e207059d96fd41084a3b30b6aa74eb587d837286" Oct 08 21:46:01 crc kubenswrapper[4669]: E1008 21:46:01.337656 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:46:05 crc kubenswrapper[4669]: I1008 21:46:05.014866 4669 scope.go:117] "RemoveContainer" containerID="dbc338d1f8d0b38e3f5c5b7f90457ffa425bdeddacac637f15042e29928ab373" Oct 08 21:46:14 crc kubenswrapper[4669]: I1008 21:46:14.332350 4669 scope.go:117] "RemoveContainer" containerID="d99f2404f55700ee94ff2a27e207059d96fd41084a3b30b6aa74eb587d837286" Oct 08 21:46:14 crc kubenswrapper[4669]: E1008 21:46:14.333965 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:46:23 crc kubenswrapper[4669]: I1008 21:46:23.445658 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-5f9445f759-bx7xs" podUID="895cb1ba-c212-4908-82e9-f5042a50686f" containerName="proxy-server" probeResult="failure" output="HTTP probe failed with statuscode: 502" Oct 08 21:46:26 crc kubenswrapper[4669]: I1008 21:46:26.331774 4669 scope.go:117] "RemoveContainer" containerID="d99f2404f55700ee94ff2a27e207059d96fd41084a3b30b6aa74eb587d837286" Oct 08 21:46:26 crc kubenswrapper[4669]: E1008 21:46:26.334499 4669 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:46:41 crc kubenswrapper[4669]: I1008 21:46:41.337162 4669 scope.go:117] "RemoveContainer" containerID="d99f2404f55700ee94ff2a27e207059d96fd41084a3b30b6aa74eb587d837286" Oct 08 21:46:41 crc kubenswrapper[4669]: E1008 21:46:41.337984 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:46:56 crc kubenswrapper[4669]: I1008 21:46:56.331403 4669 scope.go:117] "RemoveContainer" containerID="d99f2404f55700ee94ff2a27e207059d96fd41084a3b30b6aa74eb587d837286" Oct 08 21:46:56 crc kubenswrapper[4669]: E1008 21:46:56.332197 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:47:07 crc kubenswrapper[4669]: I1008 21:47:07.332394 4669 scope.go:117] "RemoveContainer" containerID="d99f2404f55700ee94ff2a27e207059d96fd41084a3b30b6aa74eb587d837286" Oct 08 21:47:07 crc kubenswrapper[4669]: E1008 21:47:07.333947 4669 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:47:12 crc kubenswrapper[4669]: I1008 21:47:12.370699 4669 generic.go:334] "Generic (PLEG): container finished" podID="345c331b-742a-4722-9868-d2cc6cf5e7cb" containerID="595c2359c028040ae502e802137f1eed665d80bb2cb0b7d64cbdeb6d524c0860" exitCode=0 Oct 08 21:47:12 crc kubenswrapper[4669]: I1008 21:47:12.370883 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pmrzj/must-gather-jfxtf" event={"ID":"345c331b-742a-4722-9868-d2cc6cf5e7cb","Type":"ContainerDied","Data":"595c2359c028040ae502e802137f1eed665d80bb2cb0b7d64cbdeb6d524c0860"} Oct 08 21:47:12 crc kubenswrapper[4669]: I1008 21:47:12.371802 4669 scope.go:117] "RemoveContainer" containerID="595c2359c028040ae502e802137f1eed665d80bb2cb0b7d64cbdeb6d524c0860" Oct 08 21:47:12 crc kubenswrapper[4669]: I1008 21:47:12.725668 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-pmrzj_must-gather-jfxtf_345c331b-742a-4722-9868-d2cc6cf5e7cb/gather/0.log" Oct 08 21:47:19 crc kubenswrapper[4669]: I1008 21:47:19.331674 4669 scope.go:117] "RemoveContainer" containerID="d99f2404f55700ee94ff2a27e207059d96fd41084a3b30b6aa74eb587d837286" Oct 08 21:47:19 crc kubenswrapper[4669]: E1008 21:47:19.332501 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:47:20 crc kubenswrapper[4669]: I1008 21:47:20.280988 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-pmrzj/must-gather-jfxtf"] Oct 08 21:47:20 crc kubenswrapper[4669]: I1008 21:47:20.281229 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-pmrzj/must-gather-jfxtf" podUID="345c331b-742a-4722-9868-d2cc6cf5e7cb" containerName="copy" containerID="cri-o://2211d8810c8b6fe7d02036d6ee73e0f9792ee4caa0dd63d183dc3de302d28f25" gracePeriod=2 Oct 08 21:47:20 crc kubenswrapper[4669]: I1008 21:47:20.294025 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-pmrzj/must-gather-jfxtf"] Oct 08 21:47:20 crc kubenswrapper[4669]: I1008 21:47:20.456886 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-pmrzj_must-gather-jfxtf_345c331b-742a-4722-9868-d2cc6cf5e7cb/copy/0.log" Oct 08 21:47:20 crc kubenswrapper[4669]: I1008 21:47:20.457967 4669 generic.go:334] "Generic (PLEG): container finished" podID="345c331b-742a-4722-9868-d2cc6cf5e7cb" containerID="2211d8810c8b6fe7d02036d6ee73e0f9792ee4caa0dd63d183dc3de302d28f25" exitCode=143 Oct 08 21:47:20 crc kubenswrapper[4669]: I1008 21:47:20.761080 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-pmrzj_must-gather-jfxtf_345c331b-742a-4722-9868-d2cc6cf5e7cb/copy/0.log" Oct 08 21:47:20 crc kubenswrapper[4669]: I1008 21:47:20.761643 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pmrzj/must-gather-jfxtf" Oct 08 21:47:20 crc kubenswrapper[4669]: I1008 21:47:20.860600 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ghdd\" (UniqueName: \"kubernetes.io/projected/345c331b-742a-4722-9868-d2cc6cf5e7cb-kube-api-access-2ghdd\") pod \"345c331b-742a-4722-9868-d2cc6cf5e7cb\" (UID: \"345c331b-742a-4722-9868-d2cc6cf5e7cb\") " Oct 08 21:47:20 crc kubenswrapper[4669]: I1008 21:47:20.860791 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/345c331b-742a-4722-9868-d2cc6cf5e7cb-must-gather-output\") pod \"345c331b-742a-4722-9868-d2cc6cf5e7cb\" (UID: \"345c331b-742a-4722-9868-d2cc6cf5e7cb\") " Oct 08 21:47:20 crc kubenswrapper[4669]: I1008 21:47:20.867608 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/345c331b-742a-4722-9868-d2cc6cf5e7cb-kube-api-access-2ghdd" (OuterVolumeSpecName: "kube-api-access-2ghdd") pod "345c331b-742a-4722-9868-d2cc6cf5e7cb" (UID: "345c331b-742a-4722-9868-d2cc6cf5e7cb"). InnerVolumeSpecName "kube-api-access-2ghdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:47:20 crc kubenswrapper[4669]: I1008 21:47:20.962997 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ghdd\" (UniqueName: \"kubernetes.io/projected/345c331b-742a-4722-9868-d2cc6cf5e7cb-kube-api-access-2ghdd\") on node \"crc\" DevicePath \"\"" Oct 08 21:47:20 crc kubenswrapper[4669]: I1008 21:47:20.991499 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/345c331b-742a-4722-9868-d2cc6cf5e7cb-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "345c331b-742a-4722-9868-d2cc6cf5e7cb" (UID: "345c331b-742a-4722-9868-d2cc6cf5e7cb"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:47:21 crc kubenswrapper[4669]: I1008 21:47:21.064487 4669 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/345c331b-742a-4722-9868-d2cc6cf5e7cb-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 08 21:47:21 crc kubenswrapper[4669]: I1008 21:47:21.341485 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="345c331b-742a-4722-9868-d2cc6cf5e7cb" path="/var/lib/kubelet/pods/345c331b-742a-4722-9868-d2cc6cf5e7cb/volumes" Oct 08 21:47:21 crc kubenswrapper[4669]: I1008 21:47:21.468962 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-pmrzj_must-gather-jfxtf_345c331b-742a-4722-9868-d2cc6cf5e7cb/copy/0.log" Oct 08 21:47:21 crc kubenswrapper[4669]: I1008 21:47:21.469326 4669 scope.go:117] "RemoveContainer" containerID="2211d8810c8b6fe7d02036d6ee73e0f9792ee4caa0dd63d183dc3de302d28f25" Oct 08 21:47:21 crc kubenswrapper[4669]: I1008 21:47:21.469440 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pmrzj/must-gather-jfxtf" Oct 08 21:47:21 crc kubenswrapper[4669]: I1008 21:47:21.491075 4669 scope.go:117] "RemoveContainer" containerID="595c2359c028040ae502e802137f1eed665d80bb2cb0b7d64cbdeb6d524c0860" Oct 08 21:47:31 crc kubenswrapper[4669]: I1008 21:47:31.338871 4669 scope.go:117] "RemoveContainer" containerID="d99f2404f55700ee94ff2a27e207059d96fd41084a3b30b6aa74eb587d837286" Oct 08 21:47:31 crc kubenswrapper[4669]: E1008 21:47:31.339761 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:47:46 crc kubenswrapper[4669]: I1008 21:47:46.330843 4669 scope.go:117] "RemoveContainer" containerID="d99f2404f55700ee94ff2a27e207059d96fd41084a3b30b6aa74eb587d837286" Oct 08 21:47:46 crc kubenswrapper[4669]: E1008 21:47:46.331962 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:47:56 crc kubenswrapper[4669]: I1008 21:47:56.675631 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8t5fb/must-gather-z8mk6"] Oct 08 21:47:56 crc kubenswrapper[4669]: E1008 21:47:56.676617 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac555100-e1f1-435a-964b-7b072aa32045" containerName="collect-profiles" Oct 08 21:47:56 crc 
kubenswrapper[4669]: I1008 21:47:56.676638 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac555100-e1f1-435a-964b-7b072aa32045" containerName="collect-profiles" Oct 08 21:47:56 crc kubenswrapper[4669]: E1008 21:47:56.676726 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="345c331b-742a-4722-9868-d2cc6cf5e7cb" containerName="copy" Oct 08 21:47:56 crc kubenswrapper[4669]: I1008 21:47:56.676739 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="345c331b-742a-4722-9868-d2cc6cf5e7cb" containerName="copy" Oct 08 21:47:56 crc kubenswrapper[4669]: E1008 21:47:56.676768 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="345c331b-742a-4722-9868-d2cc6cf5e7cb" containerName="gather" Oct 08 21:47:56 crc kubenswrapper[4669]: I1008 21:47:56.676779 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="345c331b-742a-4722-9868-d2cc6cf5e7cb" containerName="gather" Oct 08 21:47:56 crc kubenswrapper[4669]: I1008 21:47:56.677084 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac555100-e1f1-435a-964b-7b072aa32045" containerName="collect-profiles" Oct 08 21:47:56 crc kubenswrapper[4669]: I1008 21:47:56.677112 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="345c331b-742a-4722-9868-d2cc6cf5e7cb" containerName="copy" Oct 08 21:47:56 crc kubenswrapper[4669]: I1008 21:47:56.677132 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="345c331b-742a-4722-9868-d2cc6cf5e7cb" containerName="gather" Oct 08 21:47:56 crc kubenswrapper[4669]: I1008 21:47:56.684283 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8t5fb/must-gather-z8mk6" Oct 08 21:47:56 crc kubenswrapper[4669]: I1008 21:47:56.686271 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-8t5fb"/"kube-root-ca.crt" Oct 08 21:47:56 crc kubenswrapper[4669]: I1008 21:47:56.687238 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-8t5fb"/"openshift-service-ca.crt" Oct 08 21:47:56 crc kubenswrapper[4669]: I1008 21:47:56.745797 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8t5fb/must-gather-z8mk6"] Oct 08 21:47:56 crc kubenswrapper[4669]: I1008 21:47:56.812457 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbwvd\" (UniqueName: \"kubernetes.io/projected/ebdc8e7e-dc3c-4c69-8e1c-bd2fc8e287e9-kube-api-access-pbwvd\") pod \"must-gather-z8mk6\" (UID: \"ebdc8e7e-dc3c-4c69-8e1c-bd2fc8e287e9\") " pod="openshift-must-gather-8t5fb/must-gather-z8mk6" Oct 08 21:47:56 crc kubenswrapper[4669]: I1008 21:47:56.812806 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ebdc8e7e-dc3c-4c69-8e1c-bd2fc8e287e9-must-gather-output\") pod \"must-gather-z8mk6\" (UID: \"ebdc8e7e-dc3c-4c69-8e1c-bd2fc8e287e9\") " pod="openshift-must-gather-8t5fb/must-gather-z8mk6" Oct 08 21:47:56 crc kubenswrapper[4669]: I1008 21:47:56.914234 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbwvd\" (UniqueName: \"kubernetes.io/projected/ebdc8e7e-dc3c-4c69-8e1c-bd2fc8e287e9-kube-api-access-pbwvd\") pod \"must-gather-z8mk6\" (UID: \"ebdc8e7e-dc3c-4c69-8e1c-bd2fc8e287e9\") " pod="openshift-must-gather-8t5fb/must-gather-z8mk6" Oct 08 21:47:56 crc kubenswrapper[4669]: I1008 21:47:56.914306 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ebdc8e7e-dc3c-4c69-8e1c-bd2fc8e287e9-must-gather-output\") pod \"must-gather-z8mk6\" (UID: \"ebdc8e7e-dc3c-4c69-8e1c-bd2fc8e287e9\") " pod="openshift-must-gather-8t5fb/must-gather-z8mk6" Oct 08 21:47:56 crc kubenswrapper[4669]: I1008 21:47:56.914775 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ebdc8e7e-dc3c-4c69-8e1c-bd2fc8e287e9-must-gather-output\") pod \"must-gather-z8mk6\" (UID: \"ebdc8e7e-dc3c-4c69-8e1c-bd2fc8e287e9\") " pod="openshift-must-gather-8t5fb/must-gather-z8mk6" Oct 08 21:47:56 crc kubenswrapper[4669]: I1008 21:47:56.940686 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbwvd\" (UniqueName: \"kubernetes.io/projected/ebdc8e7e-dc3c-4c69-8e1c-bd2fc8e287e9-kube-api-access-pbwvd\") pod \"must-gather-z8mk6\" (UID: \"ebdc8e7e-dc3c-4c69-8e1c-bd2fc8e287e9\") " pod="openshift-must-gather-8t5fb/must-gather-z8mk6" Oct 08 21:47:57 crc kubenswrapper[4669]: I1008 21:47:57.004423 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8t5fb/must-gather-z8mk6" Oct 08 21:47:57 crc kubenswrapper[4669]: I1008 21:47:57.535038 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8t5fb/must-gather-z8mk6"] Oct 08 21:47:57 crc kubenswrapper[4669]: I1008 21:47:57.859197 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8t5fb/must-gather-z8mk6" event={"ID":"ebdc8e7e-dc3c-4c69-8e1c-bd2fc8e287e9","Type":"ContainerStarted","Data":"8642cf7fbabbbea29d36ff683aa0d809d2795dab562c213e4e4c3ea8eac788a0"} Oct 08 21:47:57 crc kubenswrapper[4669]: I1008 21:47:57.859247 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8t5fb/must-gather-z8mk6" event={"ID":"ebdc8e7e-dc3c-4c69-8e1c-bd2fc8e287e9","Type":"ContainerStarted","Data":"f961a374b61ee92fd36c2404cf7f83a19b0989685fc49f8da727edb0624d1102"} Oct 08 21:47:58 crc kubenswrapper[4669]: I1008 21:47:58.867910 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8t5fb/must-gather-z8mk6" event={"ID":"ebdc8e7e-dc3c-4c69-8e1c-bd2fc8e287e9","Type":"ContainerStarted","Data":"697b1f9a0d37f0a8869cac32774d206cbe221fa08c2c9483b44988fd9e4b5154"} Oct 08 21:47:58 crc kubenswrapper[4669]: I1008 21:47:58.885384 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8t5fb/must-gather-z8mk6" podStartSLOduration=2.885367983 podStartE2EDuration="2.885367983s" podCreationTimestamp="2025-10-08 21:47:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:47:58.880193088 +0000 UTC m=+3798.573003771" watchObservedRunningTime="2025-10-08 21:47:58.885367983 +0000 UTC m=+3798.578178656" Oct 08 21:48:00 crc kubenswrapper[4669]: I1008 21:48:00.995407 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8t5fb/crc-debug-47rck"] Oct 08 21:48:00 crc kubenswrapper[4669]: 
I1008 21:48:00.997206 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8t5fb/crc-debug-47rck" Oct 08 21:48:00 crc kubenswrapper[4669]: I1008 21:48:00.999651 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-8t5fb"/"default-dockercfg-qhdhb" Oct 08 21:48:01 crc kubenswrapper[4669]: I1008 21:48:01.104392 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r584d\" (UniqueName: \"kubernetes.io/projected/cf5ed82e-7eae-4891-821c-619d6bca2d22-kube-api-access-r584d\") pod \"crc-debug-47rck\" (UID: \"cf5ed82e-7eae-4891-821c-619d6bca2d22\") " pod="openshift-must-gather-8t5fb/crc-debug-47rck" Oct 08 21:48:01 crc kubenswrapper[4669]: I1008 21:48:01.104654 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cf5ed82e-7eae-4891-821c-619d6bca2d22-host\") pod \"crc-debug-47rck\" (UID: \"cf5ed82e-7eae-4891-821c-619d6bca2d22\") " pod="openshift-must-gather-8t5fb/crc-debug-47rck" Oct 08 21:48:01 crc kubenswrapper[4669]: I1008 21:48:01.206723 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cf5ed82e-7eae-4891-821c-619d6bca2d22-host\") pod \"crc-debug-47rck\" (UID: \"cf5ed82e-7eae-4891-821c-619d6bca2d22\") " pod="openshift-must-gather-8t5fb/crc-debug-47rck" Oct 08 21:48:01 crc kubenswrapper[4669]: I1008 21:48:01.206782 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r584d\" (UniqueName: \"kubernetes.io/projected/cf5ed82e-7eae-4891-821c-619d6bca2d22-kube-api-access-r584d\") pod \"crc-debug-47rck\" (UID: \"cf5ed82e-7eae-4891-821c-619d6bca2d22\") " pod="openshift-must-gather-8t5fb/crc-debug-47rck" Oct 08 21:48:01 crc kubenswrapper[4669]: I1008 21:48:01.206889 4669 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cf5ed82e-7eae-4891-821c-619d6bca2d22-host\") pod \"crc-debug-47rck\" (UID: \"cf5ed82e-7eae-4891-821c-619d6bca2d22\") " pod="openshift-must-gather-8t5fb/crc-debug-47rck" Oct 08 21:48:01 crc kubenswrapper[4669]: I1008 21:48:01.229316 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r584d\" (UniqueName: \"kubernetes.io/projected/cf5ed82e-7eae-4891-821c-619d6bca2d22-kube-api-access-r584d\") pod \"crc-debug-47rck\" (UID: \"cf5ed82e-7eae-4891-821c-619d6bca2d22\") " pod="openshift-must-gather-8t5fb/crc-debug-47rck" Oct 08 21:48:01 crc kubenswrapper[4669]: I1008 21:48:01.326715 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8t5fb/crc-debug-47rck" Oct 08 21:48:01 crc kubenswrapper[4669]: I1008 21:48:01.337055 4669 scope.go:117] "RemoveContainer" containerID="d99f2404f55700ee94ff2a27e207059d96fd41084a3b30b6aa74eb587d837286" Oct 08 21:48:01 crc kubenswrapper[4669]: E1008 21:48:01.337306 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:48:01 crc kubenswrapper[4669]: W1008 21:48:01.377918 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf5ed82e_7eae_4891_821c_619d6bca2d22.slice/crio-fb5f6ab07b353a9fb586ea12097058c13482052edc75ad6c36900dbf37b76a9a WatchSource:0}: Error finding container fb5f6ab07b353a9fb586ea12097058c13482052edc75ad6c36900dbf37b76a9a: Status 404 returned error can't find the container with id 
fb5f6ab07b353a9fb586ea12097058c13482052edc75ad6c36900dbf37b76a9a Oct 08 21:48:01 crc kubenswrapper[4669]: I1008 21:48:01.894339 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8t5fb/crc-debug-47rck" event={"ID":"cf5ed82e-7eae-4891-821c-619d6bca2d22","Type":"ContainerStarted","Data":"8e987046b8afde49c6f65d45a0dc1bb49f6af4e9049fd78171edb59083165568"} Oct 08 21:48:01 crc kubenswrapper[4669]: I1008 21:48:01.894836 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8t5fb/crc-debug-47rck" event={"ID":"cf5ed82e-7eae-4891-821c-619d6bca2d22","Type":"ContainerStarted","Data":"fb5f6ab07b353a9fb586ea12097058c13482052edc75ad6c36900dbf37b76a9a"} Oct 08 21:48:01 crc kubenswrapper[4669]: I1008 21:48:01.916292 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8t5fb/crc-debug-47rck" podStartSLOduration=1.916272084 podStartE2EDuration="1.916272084s" podCreationTimestamp="2025-10-08 21:48:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:48:01.907096517 +0000 UTC m=+3801.599907200" watchObservedRunningTime="2025-10-08 21:48:01.916272084 +0000 UTC m=+3801.609082767" Oct 08 21:48:15 crc kubenswrapper[4669]: I1008 21:48:15.331475 4669 scope.go:117] "RemoveContainer" containerID="d99f2404f55700ee94ff2a27e207059d96fd41084a3b30b6aa74eb587d837286" Oct 08 21:48:15 crc kubenswrapper[4669]: E1008 21:48:15.332386 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:48:19 crc kubenswrapper[4669]: I1008 
21:48:19.804070 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tsxs4"] Oct 08 21:48:19 crc kubenswrapper[4669]: I1008 21:48:19.806271 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tsxs4" Oct 08 21:48:19 crc kubenswrapper[4669]: I1008 21:48:19.860012 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtpl8\" (UniqueName: \"kubernetes.io/projected/8e6ad776-df59-4a8c-9ee7-54120fe14a82-kube-api-access-dtpl8\") pod \"redhat-operators-tsxs4\" (UID: \"8e6ad776-df59-4a8c-9ee7-54120fe14a82\") " pod="openshift-marketplace/redhat-operators-tsxs4" Oct 08 21:48:19 crc kubenswrapper[4669]: I1008 21:48:19.860129 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e6ad776-df59-4a8c-9ee7-54120fe14a82-utilities\") pod \"redhat-operators-tsxs4\" (UID: \"8e6ad776-df59-4a8c-9ee7-54120fe14a82\") " pod="openshift-marketplace/redhat-operators-tsxs4" Oct 08 21:48:19 crc kubenswrapper[4669]: I1008 21:48:19.860204 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e6ad776-df59-4a8c-9ee7-54120fe14a82-catalog-content\") pod \"redhat-operators-tsxs4\" (UID: \"8e6ad776-df59-4a8c-9ee7-54120fe14a82\") " pod="openshift-marketplace/redhat-operators-tsxs4" Oct 08 21:48:19 crc kubenswrapper[4669]: I1008 21:48:19.885006 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tsxs4"] Oct 08 21:48:19 crc kubenswrapper[4669]: I1008 21:48:19.961999 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtpl8\" (UniqueName: \"kubernetes.io/projected/8e6ad776-df59-4a8c-9ee7-54120fe14a82-kube-api-access-dtpl8\") pod 
\"redhat-operators-tsxs4\" (UID: \"8e6ad776-df59-4a8c-9ee7-54120fe14a82\") " pod="openshift-marketplace/redhat-operators-tsxs4" Oct 08 21:48:19 crc kubenswrapper[4669]: I1008 21:48:19.962105 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e6ad776-df59-4a8c-9ee7-54120fe14a82-utilities\") pod \"redhat-operators-tsxs4\" (UID: \"8e6ad776-df59-4a8c-9ee7-54120fe14a82\") " pod="openshift-marketplace/redhat-operators-tsxs4" Oct 08 21:48:19 crc kubenswrapper[4669]: I1008 21:48:19.962171 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e6ad776-df59-4a8c-9ee7-54120fe14a82-catalog-content\") pod \"redhat-operators-tsxs4\" (UID: \"8e6ad776-df59-4a8c-9ee7-54120fe14a82\") " pod="openshift-marketplace/redhat-operators-tsxs4" Oct 08 21:48:19 crc kubenswrapper[4669]: I1008 21:48:19.963029 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e6ad776-df59-4a8c-9ee7-54120fe14a82-utilities\") pod \"redhat-operators-tsxs4\" (UID: \"8e6ad776-df59-4a8c-9ee7-54120fe14a82\") " pod="openshift-marketplace/redhat-operators-tsxs4" Oct 08 21:48:19 crc kubenswrapper[4669]: I1008 21:48:19.963079 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e6ad776-df59-4a8c-9ee7-54120fe14a82-catalog-content\") pod \"redhat-operators-tsxs4\" (UID: \"8e6ad776-df59-4a8c-9ee7-54120fe14a82\") " pod="openshift-marketplace/redhat-operators-tsxs4" Oct 08 21:48:19 crc kubenswrapper[4669]: I1008 21:48:19.987374 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtpl8\" (UniqueName: \"kubernetes.io/projected/8e6ad776-df59-4a8c-9ee7-54120fe14a82-kube-api-access-dtpl8\") pod \"redhat-operators-tsxs4\" (UID: \"8e6ad776-df59-4a8c-9ee7-54120fe14a82\") " 
pod="openshift-marketplace/redhat-operators-tsxs4" Oct 08 21:48:20 crc kubenswrapper[4669]: I1008 21:48:20.191482 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tsxs4" Oct 08 21:48:20 crc kubenswrapper[4669]: I1008 21:48:20.717872 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tsxs4"] Oct 08 21:48:21 crc kubenswrapper[4669]: I1008 21:48:21.051700 4669 generic.go:334] "Generic (PLEG): container finished" podID="8e6ad776-df59-4a8c-9ee7-54120fe14a82" containerID="84b907b8ff2ca5854818a80a75b22765bd5849e51f3505643e62ea34a9b167d6" exitCode=0 Oct 08 21:48:21 crc kubenswrapper[4669]: I1008 21:48:21.051824 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tsxs4" event={"ID":"8e6ad776-df59-4a8c-9ee7-54120fe14a82","Type":"ContainerDied","Data":"84b907b8ff2ca5854818a80a75b22765bd5849e51f3505643e62ea34a9b167d6"} Oct 08 21:48:21 crc kubenswrapper[4669]: I1008 21:48:21.052085 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tsxs4" event={"ID":"8e6ad776-df59-4a8c-9ee7-54120fe14a82","Type":"ContainerStarted","Data":"21597b9563324dfdccede2f6988495acc6c4cfb0e0997ef50eafecf5e6bfe09f"} Oct 08 21:48:21 crc kubenswrapper[4669]: I1008 21:48:21.053393 4669 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 21:48:22 crc kubenswrapper[4669]: I1008 21:48:22.380337 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rbw4r"] Oct 08 21:48:22 crc kubenswrapper[4669]: I1008 21:48:22.384401 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rbw4r" Oct 08 21:48:22 crc kubenswrapper[4669]: I1008 21:48:22.405271 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rbw4r"] Oct 08 21:48:22 crc kubenswrapper[4669]: I1008 21:48:22.510024 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/542ef893-30c0-42f1-b78e-3dfb07b93eec-catalog-content\") pod \"certified-operators-rbw4r\" (UID: \"542ef893-30c0-42f1-b78e-3dfb07b93eec\") " pod="openshift-marketplace/certified-operators-rbw4r" Oct 08 21:48:22 crc kubenswrapper[4669]: I1008 21:48:22.510339 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/542ef893-30c0-42f1-b78e-3dfb07b93eec-utilities\") pod \"certified-operators-rbw4r\" (UID: \"542ef893-30c0-42f1-b78e-3dfb07b93eec\") " pod="openshift-marketplace/certified-operators-rbw4r" Oct 08 21:48:22 crc kubenswrapper[4669]: I1008 21:48:22.510450 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sf86w\" (UniqueName: \"kubernetes.io/projected/542ef893-30c0-42f1-b78e-3dfb07b93eec-kube-api-access-sf86w\") pod \"certified-operators-rbw4r\" (UID: \"542ef893-30c0-42f1-b78e-3dfb07b93eec\") " pod="openshift-marketplace/certified-operators-rbw4r" Oct 08 21:48:22 crc kubenswrapper[4669]: I1008 21:48:22.612661 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/542ef893-30c0-42f1-b78e-3dfb07b93eec-catalog-content\") pod \"certified-operators-rbw4r\" (UID: \"542ef893-30c0-42f1-b78e-3dfb07b93eec\") " pod="openshift-marketplace/certified-operators-rbw4r" Oct 08 21:48:22 crc kubenswrapper[4669]: I1008 21:48:22.612997 4669 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/542ef893-30c0-42f1-b78e-3dfb07b93eec-utilities\") pod \"certified-operators-rbw4r\" (UID: \"542ef893-30c0-42f1-b78e-3dfb07b93eec\") " pod="openshift-marketplace/certified-operators-rbw4r" Oct 08 21:48:22 crc kubenswrapper[4669]: I1008 21:48:22.613105 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sf86w\" (UniqueName: \"kubernetes.io/projected/542ef893-30c0-42f1-b78e-3dfb07b93eec-kube-api-access-sf86w\") pod \"certified-operators-rbw4r\" (UID: \"542ef893-30c0-42f1-b78e-3dfb07b93eec\") " pod="openshift-marketplace/certified-operators-rbw4r" Oct 08 21:48:22 crc kubenswrapper[4669]: I1008 21:48:22.613251 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/542ef893-30c0-42f1-b78e-3dfb07b93eec-catalog-content\") pod \"certified-operators-rbw4r\" (UID: \"542ef893-30c0-42f1-b78e-3dfb07b93eec\") " pod="openshift-marketplace/certified-operators-rbw4r" Oct 08 21:48:22 crc kubenswrapper[4669]: I1008 21:48:22.613450 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/542ef893-30c0-42f1-b78e-3dfb07b93eec-utilities\") pod \"certified-operators-rbw4r\" (UID: \"542ef893-30c0-42f1-b78e-3dfb07b93eec\") " pod="openshift-marketplace/certified-operators-rbw4r" Oct 08 21:48:22 crc kubenswrapper[4669]: I1008 21:48:22.632882 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sf86w\" (UniqueName: \"kubernetes.io/projected/542ef893-30c0-42f1-b78e-3dfb07b93eec-kube-api-access-sf86w\") pod \"certified-operators-rbw4r\" (UID: \"542ef893-30c0-42f1-b78e-3dfb07b93eec\") " pod="openshift-marketplace/certified-operators-rbw4r" Oct 08 21:48:22 crc kubenswrapper[4669]: I1008 21:48:22.708859 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rbw4r" Oct 08 21:48:23 crc kubenswrapper[4669]: I1008 21:48:23.071096 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tsxs4" event={"ID":"8e6ad776-df59-4a8c-9ee7-54120fe14a82","Type":"ContainerStarted","Data":"9e7736e3b113a9cd3a838a36c20c5fe4a936cc236c0bd6f88f190b816ab65721"} Oct 08 21:48:23 crc kubenswrapper[4669]: I1008 21:48:23.236744 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rbw4r"] Oct 08 21:48:23 crc kubenswrapper[4669]: W1008 21:48:23.238907 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod542ef893_30c0_42f1_b78e_3dfb07b93eec.slice/crio-cdbf48a52afdc30984fbd7631566c38f83d9e956d9ee59f66f4bff11d56079cf WatchSource:0}: Error finding container cdbf48a52afdc30984fbd7631566c38f83d9e956d9ee59f66f4bff11d56079cf: Status 404 returned error can't find the container with id cdbf48a52afdc30984fbd7631566c38f83d9e956d9ee59f66f4bff11d56079cf Oct 08 21:48:24 crc kubenswrapper[4669]: I1008 21:48:24.081233 4669 generic.go:334] "Generic (PLEG): container finished" podID="542ef893-30c0-42f1-b78e-3dfb07b93eec" containerID="ad08f8e19cc78e60d17f2cd52e5acd1b3aec82e58627904319ba4a32e2abb87c" exitCode=0 Oct 08 21:48:24 crc kubenswrapper[4669]: I1008 21:48:24.081330 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rbw4r" event={"ID":"542ef893-30c0-42f1-b78e-3dfb07b93eec","Type":"ContainerDied","Data":"ad08f8e19cc78e60d17f2cd52e5acd1b3aec82e58627904319ba4a32e2abb87c"} Oct 08 21:48:24 crc kubenswrapper[4669]: I1008 21:48:24.081585 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rbw4r" 
event={"ID":"542ef893-30c0-42f1-b78e-3dfb07b93eec","Type":"ContainerStarted","Data":"cdbf48a52afdc30984fbd7631566c38f83d9e956d9ee59f66f4bff11d56079cf"} Oct 08 21:48:25 crc kubenswrapper[4669]: I1008 21:48:25.097328 4669 generic.go:334] "Generic (PLEG): container finished" podID="8e6ad776-df59-4a8c-9ee7-54120fe14a82" containerID="9e7736e3b113a9cd3a838a36c20c5fe4a936cc236c0bd6f88f190b816ab65721" exitCode=0 Oct 08 21:48:25 crc kubenswrapper[4669]: I1008 21:48:25.097400 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tsxs4" event={"ID":"8e6ad776-df59-4a8c-9ee7-54120fe14a82","Type":"ContainerDied","Data":"9e7736e3b113a9cd3a838a36c20c5fe4a936cc236c0bd6f88f190b816ab65721"} Oct 08 21:48:26 crc kubenswrapper[4669]: I1008 21:48:26.107194 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rbw4r" event={"ID":"542ef893-30c0-42f1-b78e-3dfb07b93eec","Type":"ContainerStarted","Data":"d4bfdb1d024dfcba24a915cc8f70d3d91b9ac70f2b529da9d587bcda606b47be"} Oct 08 21:48:26 crc kubenswrapper[4669]: I1008 21:48:26.110123 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tsxs4" event={"ID":"8e6ad776-df59-4a8c-9ee7-54120fe14a82","Type":"ContainerStarted","Data":"e0b2caf0d2b7c35c7d0c16f7ce57c8b94184bb4b225bcae8bd296c72b6fcd3d2"} Oct 08 21:48:26 crc kubenswrapper[4669]: I1008 21:48:26.154890 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tsxs4" podStartSLOduration=2.67782143 podStartE2EDuration="7.154873049s" podCreationTimestamp="2025-10-08 21:48:19 +0000 UTC" firstStartedPulling="2025-10-08 21:48:21.05318302 +0000 UTC m=+3820.745993693" lastFinishedPulling="2025-10-08 21:48:25.530234639 +0000 UTC m=+3825.223045312" observedRunningTime="2025-10-08 21:48:26.150744593 +0000 UTC m=+3825.843555266" watchObservedRunningTime="2025-10-08 21:48:26.154873049 +0000 UTC 
m=+3825.847683722" Oct 08 21:48:28 crc kubenswrapper[4669]: I1008 21:48:28.133671 4669 generic.go:334] "Generic (PLEG): container finished" podID="542ef893-30c0-42f1-b78e-3dfb07b93eec" containerID="d4bfdb1d024dfcba24a915cc8f70d3d91b9ac70f2b529da9d587bcda606b47be" exitCode=0 Oct 08 21:48:28 crc kubenswrapper[4669]: I1008 21:48:28.133823 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rbw4r" event={"ID":"542ef893-30c0-42f1-b78e-3dfb07b93eec","Type":"ContainerDied","Data":"d4bfdb1d024dfcba24a915cc8f70d3d91b9ac70f2b529da9d587bcda606b47be"} Oct 08 21:48:29 crc kubenswrapper[4669]: I1008 21:48:29.144193 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rbw4r" event={"ID":"542ef893-30c0-42f1-b78e-3dfb07b93eec","Type":"ContainerStarted","Data":"ab728217727e43d4ebc015de1b666b41d757fb69abe818f87a71506ad1c7d386"} Oct 08 21:48:29 crc kubenswrapper[4669]: I1008 21:48:29.163915 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rbw4r" podStartSLOduration=2.62173983 podStartE2EDuration="7.163900128s" podCreationTimestamp="2025-10-08 21:48:22 +0000 UTC" firstStartedPulling="2025-10-08 21:48:24.083245938 +0000 UTC m=+3823.776056611" lastFinishedPulling="2025-10-08 21:48:28.625406226 +0000 UTC m=+3828.318216909" observedRunningTime="2025-10-08 21:48:29.163046883 +0000 UTC m=+3828.855857556" watchObservedRunningTime="2025-10-08 21:48:29.163900128 +0000 UTC m=+3828.856710801" Oct 08 21:48:29 crc kubenswrapper[4669]: I1008 21:48:29.330654 4669 scope.go:117] "RemoveContainer" containerID="d99f2404f55700ee94ff2a27e207059d96fd41084a3b30b6aa74eb587d837286" Oct 08 21:48:29 crc kubenswrapper[4669]: E1008 21:48:29.330890 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:48:30 crc kubenswrapper[4669]: I1008 21:48:30.192736 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tsxs4" Oct 08 21:48:30 crc kubenswrapper[4669]: I1008 21:48:30.193053 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tsxs4" Oct 08 21:48:31 crc kubenswrapper[4669]: I1008 21:48:31.245996 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tsxs4" podUID="8e6ad776-df59-4a8c-9ee7-54120fe14a82" containerName="registry-server" probeResult="failure" output=< Oct 08 21:48:31 crc kubenswrapper[4669]: timeout: failed to connect service ":50051" within 1s Oct 08 21:48:31 crc kubenswrapper[4669]: > Oct 08 21:48:32 crc kubenswrapper[4669]: I1008 21:48:32.709443 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rbw4r" Oct 08 21:48:32 crc kubenswrapper[4669]: I1008 21:48:32.709804 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rbw4r" Oct 08 21:48:32 crc kubenswrapper[4669]: I1008 21:48:32.765706 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rbw4r" Oct 08 21:48:33 crc kubenswrapper[4669]: I1008 21:48:33.231370 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rbw4r" Oct 08 21:48:33 crc kubenswrapper[4669]: I1008 21:48:33.371719 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rbw4r"] Oct 08 21:48:35 crc 
kubenswrapper[4669]: I1008 21:48:35.198598 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rbw4r" podUID="542ef893-30c0-42f1-b78e-3dfb07b93eec" containerName="registry-server" containerID="cri-o://ab728217727e43d4ebc015de1b666b41d757fb69abe818f87a71506ad1c7d386" gracePeriod=2 Oct 08 21:48:35 crc kubenswrapper[4669]: I1008 21:48:35.633441 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rbw4r" Oct 08 21:48:35 crc kubenswrapper[4669]: I1008 21:48:35.812845 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sf86w\" (UniqueName: \"kubernetes.io/projected/542ef893-30c0-42f1-b78e-3dfb07b93eec-kube-api-access-sf86w\") pod \"542ef893-30c0-42f1-b78e-3dfb07b93eec\" (UID: \"542ef893-30c0-42f1-b78e-3dfb07b93eec\") " Oct 08 21:48:35 crc kubenswrapper[4669]: I1008 21:48:35.813106 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/542ef893-30c0-42f1-b78e-3dfb07b93eec-catalog-content\") pod \"542ef893-30c0-42f1-b78e-3dfb07b93eec\" (UID: \"542ef893-30c0-42f1-b78e-3dfb07b93eec\") " Oct 08 21:48:35 crc kubenswrapper[4669]: I1008 21:48:35.813228 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/542ef893-30c0-42f1-b78e-3dfb07b93eec-utilities\") pod \"542ef893-30c0-42f1-b78e-3dfb07b93eec\" (UID: \"542ef893-30c0-42f1-b78e-3dfb07b93eec\") " Oct 08 21:48:35 crc kubenswrapper[4669]: I1008 21:48:35.813716 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/542ef893-30c0-42f1-b78e-3dfb07b93eec-utilities" (OuterVolumeSpecName: "utilities") pod "542ef893-30c0-42f1-b78e-3dfb07b93eec" (UID: "542ef893-30c0-42f1-b78e-3dfb07b93eec"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:48:35 crc kubenswrapper[4669]: I1008 21:48:35.842809 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/542ef893-30c0-42f1-b78e-3dfb07b93eec-kube-api-access-sf86w" (OuterVolumeSpecName: "kube-api-access-sf86w") pod "542ef893-30c0-42f1-b78e-3dfb07b93eec" (UID: "542ef893-30c0-42f1-b78e-3dfb07b93eec"). InnerVolumeSpecName "kube-api-access-sf86w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:35 crc kubenswrapper[4669]: I1008 21:48:35.868562 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/542ef893-30c0-42f1-b78e-3dfb07b93eec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "542ef893-30c0-42f1-b78e-3dfb07b93eec" (UID: "542ef893-30c0-42f1-b78e-3dfb07b93eec"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:48:35 crc kubenswrapper[4669]: I1008 21:48:35.915873 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/542ef893-30c0-42f1-b78e-3dfb07b93eec-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:35 crc kubenswrapper[4669]: I1008 21:48:35.915908 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/542ef893-30c0-42f1-b78e-3dfb07b93eec-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:35 crc kubenswrapper[4669]: I1008 21:48:35.915919 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sf86w\" (UniqueName: \"kubernetes.io/projected/542ef893-30c0-42f1-b78e-3dfb07b93eec-kube-api-access-sf86w\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:36 crc kubenswrapper[4669]: I1008 21:48:36.208663 4669 generic.go:334] "Generic (PLEG): container finished" podID="542ef893-30c0-42f1-b78e-3dfb07b93eec" 
containerID="ab728217727e43d4ebc015de1b666b41d757fb69abe818f87a71506ad1c7d386" exitCode=0 Oct 08 21:48:36 crc kubenswrapper[4669]: I1008 21:48:36.208703 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rbw4r" event={"ID":"542ef893-30c0-42f1-b78e-3dfb07b93eec","Type":"ContainerDied","Data":"ab728217727e43d4ebc015de1b666b41d757fb69abe818f87a71506ad1c7d386"} Oct 08 21:48:36 crc kubenswrapper[4669]: I1008 21:48:36.208728 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rbw4r" event={"ID":"542ef893-30c0-42f1-b78e-3dfb07b93eec","Type":"ContainerDied","Data":"cdbf48a52afdc30984fbd7631566c38f83d9e956d9ee59f66f4bff11d56079cf"} Oct 08 21:48:36 crc kubenswrapper[4669]: I1008 21:48:36.208745 4669 scope.go:117] "RemoveContainer" containerID="ab728217727e43d4ebc015de1b666b41d757fb69abe818f87a71506ad1c7d386" Oct 08 21:48:36 crc kubenswrapper[4669]: I1008 21:48:36.208850 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rbw4r" Oct 08 21:48:36 crc kubenswrapper[4669]: I1008 21:48:36.250581 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rbw4r"] Oct 08 21:48:36 crc kubenswrapper[4669]: I1008 21:48:36.258950 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rbw4r"] Oct 08 21:48:36 crc kubenswrapper[4669]: I1008 21:48:36.261683 4669 scope.go:117] "RemoveContainer" containerID="d4bfdb1d024dfcba24a915cc8f70d3d91b9ac70f2b529da9d587bcda606b47be" Oct 08 21:48:36 crc kubenswrapper[4669]: I1008 21:48:36.306983 4669 scope.go:117] "RemoveContainer" containerID="ad08f8e19cc78e60d17f2cd52e5acd1b3aec82e58627904319ba4a32e2abb87c" Oct 08 21:48:36 crc kubenswrapper[4669]: I1008 21:48:36.349865 4669 scope.go:117] "RemoveContainer" containerID="ab728217727e43d4ebc015de1b666b41d757fb69abe818f87a71506ad1c7d386" Oct 08 21:48:36 crc kubenswrapper[4669]: E1008 21:48:36.350366 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab728217727e43d4ebc015de1b666b41d757fb69abe818f87a71506ad1c7d386\": container with ID starting with ab728217727e43d4ebc015de1b666b41d757fb69abe818f87a71506ad1c7d386 not found: ID does not exist" containerID="ab728217727e43d4ebc015de1b666b41d757fb69abe818f87a71506ad1c7d386" Oct 08 21:48:36 crc kubenswrapper[4669]: I1008 21:48:36.350394 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab728217727e43d4ebc015de1b666b41d757fb69abe818f87a71506ad1c7d386"} err="failed to get container status \"ab728217727e43d4ebc015de1b666b41d757fb69abe818f87a71506ad1c7d386\": rpc error: code = NotFound desc = could not find container \"ab728217727e43d4ebc015de1b666b41d757fb69abe818f87a71506ad1c7d386\": container with ID starting with ab728217727e43d4ebc015de1b666b41d757fb69abe818f87a71506ad1c7d386 not 
found: ID does not exist" Oct 08 21:48:36 crc kubenswrapper[4669]: I1008 21:48:36.350415 4669 scope.go:117] "RemoveContainer" containerID="d4bfdb1d024dfcba24a915cc8f70d3d91b9ac70f2b529da9d587bcda606b47be" Oct 08 21:48:36 crc kubenswrapper[4669]: E1008 21:48:36.350671 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4bfdb1d024dfcba24a915cc8f70d3d91b9ac70f2b529da9d587bcda606b47be\": container with ID starting with d4bfdb1d024dfcba24a915cc8f70d3d91b9ac70f2b529da9d587bcda606b47be not found: ID does not exist" containerID="d4bfdb1d024dfcba24a915cc8f70d3d91b9ac70f2b529da9d587bcda606b47be" Oct 08 21:48:36 crc kubenswrapper[4669]: I1008 21:48:36.350710 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4bfdb1d024dfcba24a915cc8f70d3d91b9ac70f2b529da9d587bcda606b47be"} err="failed to get container status \"d4bfdb1d024dfcba24a915cc8f70d3d91b9ac70f2b529da9d587bcda606b47be\": rpc error: code = NotFound desc = could not find container \"d4bfdb1d024dfcba24a915cc8f70d3d91b9ac70f2b529da9d587bcda606b47be\": container with ID starting with d4bfdb1d024dfcba24a915cc8f70d3d91b9ac70f2b529da9d587bcda606b47be not found: ID does not exist" Oct 08 21:48:36 crc kubenswrapper[4669]: I1008 21:48:36.350723 4669 scope.go:117] "RemoveContainer" containerID="ad08f8e19cc78e60d17f2cd52e5acd1b3aec82e58627904319ba4a32e2abb87c" Oct 08 21:48:36 crc kubenswrapper[4669]: E1008 21:48:36.350995 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad08f8e19cc78e60d17f2cd52e5acd1b3aec82e58627904319ba4a32e2abb87c\": container with ID starting with ad08f8e19cc78e60d17f2cd52e5acd1b3aec82e58627904319ba4a32e2abb87c not found: ID does not exist" containerID="ad08f8e19cc78e60d17f2cd52e5acd1b3aec82e58627904319ba4a32e2abb87c" Oct 08 21:48:36 crc kubenswrapper[4669]: I1008 21:48:36.351015 4669 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad08f8e19cc78e60d17f2cd52e5acd1b3aec82e58627904319ba4a32e2abb87c"} err="failed to get container status \"ad08f8e19cc78e60d17f2cd52e5acd1b3aec82e58627904319ba4a32e2abb87c\": rpc error: code = NotFound desc = could not find container \"ad08f8e19cc78e60d17f2cd52e5acd1b3aec82e58627904319ba4a32e2abb87c\": container with ID starting with ad08f8e19cc78e60d17f2cd52e5acd1b3aec82e58627904319ba4a32e2abb87c not found: ID does not exist" Oct 08 21:48:37 crc kubenswrapper[4669]: I1008 21:48:37.218792 4669 generic.go:334] "Generic (PLEG): container finished" podID="cf5ed82e-7eae-4891-821c-619d6bca2d22" containerID="8e987046b8afde49c6f65d45a0dc1bb49f6af4e9049fd78171edb59083165568" exitCode=0 Oct 08 21:48:37 crc kubenswrapper[4669]: I1008 21:48:37.218867 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8t5fb/crc-debug-47rck" event={"ID":"cf5ed82e-7eae-4891-821c-619d6bca2d22","Type":"ContainerDied","Data":"8e987046b8afde49c6f65d45a0dc1bb49f6af4e9049fd78171edb59083165568"} Oct 08 21:48:37 crc kubenswrapper[4669]: I1008 21:48:37.345283 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="542ef893-30c0-42f1-b78e-3dfb07b93eec" path="/var/lib/kubelet/pods/542ef893-30c0-42f1-b78e-3dfb07b93eec/volumes" Oct 08 21:48:38 crc kubenswrapper[4669]: I1008 21:48:38.328243 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8t5fb/crc-debug-47rck" Oct 08 21:48:38 crc kubenswrapper[4669]: I1008 21:48:38.358429 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8t5fb/crc-debug-47rck"] Oct 08 21:48:38 crc kubenswrapper[4669]: I1008 21:48:38.367465 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8t5fb/crc-debug-47rck"] Oct 08 21:48:38 crc kubenswrapper[4669]: I1008 21:48:38.459760 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r584d\" (UniqueName: \"kubernetes.io/projected/cf5ed82e-7eae-4891-821c-619d6bca2d22-kube-api-access-r584d\") pod \"cf5ed82e-7eae-4891-821c-619d6bca2d22\" (UID: \"cf5ed82e-7eae-4891-821c-619d6bca2d22\") " Oct 08 21:48:38 crc kubenswrapper[4669]: I1008 21:48:38.459953 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cf5ed82e-7eae-4891-821c-619d6bca2d22-host\") pod \"cf5ed82e-7eae-4891-821c-619d6bca2d22\" (UID: \"cf5ed82e-7eae-4891-821c-619d6bca2d22\") " Oct 08 21:48:38 crc kubenswrapper[4669]: I1008 21:48:38.460082 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cf5ed82e-7eae-4891-821c-619d6bca2d22-host" (OuterVolumeSpecName: "host") pod "cf5ed82e-7eae-4891-821c-619d6bca2d22" (UID: "cf5ed82e-7eae-4891-821c-619d6bca2d22"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 21:48:38 crc kubenswrapper[4669]: I1008 21:48:38.460755 4669 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cf5ed82e-7eae-4891-821c-619d6bca2d22-host\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:38 crc kubenswrapper[4669]: I1008 21:48:38.464536 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf5ed82e-7eae-4891-821c-619d6bca2d22-kube-api-access-r584d" (OuterVolumeSpecName: "kube-api-access-r584d") pod "cf5ed82e-7eae-4891-821c-619d6bca2d22" (UID: "cf5ed82e-7eae-4891-821c-619d6bca2d22"). InnerVolumeSpecName "kube-api-access-r584d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:38 crc kubenswrapper[4669]: I1008 21:48:38.563191 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r584d\" (UniqueName: \"kubernetes.io/projected/cf5ed82e-7eae-4891-821c-619d6bca2d22-kube-api-access-r584d\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:39 crc kubenswrapper[4669]: I1008 21:48:39.238382 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb5f6ab07b353a9fb586ea12097058c13482052edc75ad6c36900dbf37b76a9a" Oct 08 21:48:39 crc kubenswrapper[4669]: I1008 21:48:39.238473 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8t5fb/crc-debug-47rck" Oct 08 21:48:39 crc kubenswrapper[4669]: I1008 21:48:39.342828 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf5ed82e-7eae-4891-821c-619d6bca2d22" path="/var/lib/kubelet/pods/cf5ed82e-7eae-4891-821c-619d6bca2d22/volumes" Oct 08 21:48:39 crc kubenswrapper[4669]: I1008 21:48:39.560680 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8t5fb/crc-debug-gpp8w"] Oct 08 21:48:39 crc kubenswrapper[4669]: E1008 21:48:39.561038 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf5ed82e-7eae-4891-821c-619d6bca2d22" containerName="container-00" Oct 08 21:48:39 crc kubenswrapper[4669]: I1008 21:48:39.561057 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf5ed82e-7eae-4891-821c-619d6bca2d22" containerName="container-00" Oct 08 21:48:39 crc kubenswrapper[4669]: E1008 21:48:39.561079 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="542ef893-30c0-42f1-b78e-3dfb07b93eec" containerName="extract-utilities" Oct 08 21:48:39 crc kubenswrapper[4669]: I1008 21:48:39.561087 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="542ef893-30c0-42f1-b78e-3dfb07b93eec" containerName="extract-utilities" Oct 08 21:48:39 crc kubenswrapper[4669]: E1008 21:48:39.561102 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="542ef893-30c0-42f1-b78e-3dfb07b93eec" containerName="extract-content" Oct 08 21:48:39 crc kubenswrapper[4669]: I1008 21:48:39.561123 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="542ef893-30c0-42f1-b78e-3dfb07b93eec" containerName="extract-content" Oct 08 21:48:39 crc kubenswrapper[4669]: E1008 21:48:39.561137 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="542ef893-30c0-42f1-b78e-3dfb07b93eec" containerName="registry-server" Oct 08 21:48:39 crc kubenswrapper[4669]: I1008 21:48:39.561143 4669 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="542ef893-30c0-42f1-b78e-3dfb07b93eec" containerName="registry-server" Oct 08 21:48:39 crc kubenswrapper[4669]: I1008 21:48:39.561310 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="542ef893-30c0-42f1-b78e-3dfb07b93eec" containerName="registry-server" Oct 08 21:48:39 crc kubenswrapper[4669]: I1008 21:48:39.561325 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf5ed82e-7eae-4891-821c-619d6bca2d22" containerName="container-00" Oct 08 21:48:39 crc kubenswrapper[4669]: I1008 21:48:39.561942 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8t5fb/crc-debug-gpp8w" Oct 08 21:48:39 crc kubenswrapper[4669]: I1008 21:48:39.563796 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-8t5fb"/"default-dockercfg-qhdhb" Oct 08 21:48:39 crc kubenswrapper[4669]: I1008 21:48:39.681477 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cbzh\" (UniqueName: \"kubernetes.io/projected/2389fad9-b9d8-4045-a61d-46cce29c5161-kube-api-access-5cbzh\") pod \"crc-debug-gpp8w\" (UID: \"2389fad9-b9d8-4045-a61d-46cce29c5161\") " pod="openshift-must-gather-8t5fb/crc-debug-gpp8w" Oct 08 21:48:39 crc kubenswrapper[4669]: I1008 21:48:39.681574 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2389fad9-b9d8-4045-a61d-46cce29c5161-host\") pod \"crc-debug-gpp8w\" (UID: \"2389fad9-b9d8-4045-a61d-46cce29c5161\") " pod="openshift-must-gather-8t5fb/crc-debug-gpp8w" Oct 08 21:48:39 crc kubenswrapper[4669]: I1008 21:48:39.782949 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2389fad9-b9d8-4045-a61d-46cce29c5161-host\") pod \"crc-debug-gpp8w\" (UID: \"2389fad9-b9d8-4045-a61d-46cce29c5161\") " 
pod="openshift-must-gather-8t5fb/crc-debug-gpp8w" Oct 08 21:48:39 crc kubenswrapper[4669]: I1008 21:48:39.783178 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cbzh\" (UniqueName: \"kubernetes.io/projected/2389fad9-b9d8-4045-a61d-46cce29c5161-kube-api-access-5cbzh\") pod \"crc-debug-gpp8w\" (UID: \"2389fad9-b9d8-4045-a61d-46cce29c5161\") " pod="openshift-must-gather-8t5fb/crc-debug-gpp8w" Oct 08 21:48:39 crc kubenswrapper[4669]: I1008 21:48:39.783490 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2389fad9-b9d8-4045-a61d-46cce29c5161-host\") pod \"crc-debug-gpp8w\" (UID: \"2389fad9-b9d8-4045-a61d-46cce29c5161\") " pod="openshift-must-gather-8t5fb/crc-debug-gpp8w" Oct 08 21:48:39 crc kubenswrapper[4669]: I1008 21:48:39.800275 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cbzh\" (UniqueName: \"kubernetes.io/projected/2389fad9-b9d8-4045-a61d-46cce29c5161-kube-api-access-5cbzh\") pod \"crc-debug-gpp8w\" (UID: \"2389fad9-b9d8-4045-a61d-46cce29c5161\") " pod="openshift-must-gather-8t5fb/crc-debug-gpp8w" Oct 08 21:48:39 crc kubenswrapper[4669]: I1008 21:48:39.876556 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8t5fb/crc-debug-gpp8w" Oct 08 21:48:39 crc kubenswrapper[4669]: W1008 21:48:39.909911 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2389fad9_b9d8_4045_a61d_46cce29c5161.slice/crio-8eb803608358d9c0cf62d3ca0e50da4abf51d176c7cb5f20cc2ebdda44b28a16 WatchSource:0}: Error finding container 8eb803608358d9c0cf62d3ca0e50da4abf51d176c7cb5f20cc2ebdda44b28a16: Status 404 returned error can't find the container with id 8eb803608358d9c0cf62d3ca0e50da4abf51d176c7cb5f20cc2ebdda44b28a16 Oct 08 21:48:40 crc kubenswrapper[4669]: I1008 21:48:40.248913 4669 generic.go:334] "Generic (PLEG): container finished" podID="2389fad9-b9d8-4045-a61d-46cce29c5161" containerID="605d3b2ccd2ba7404a32eede44c75f66a8a25edaab4305960d83fc5eb0ea8981" exitCode=0 Oct 08 21:48:40 crc kubenswrapper[4669]: I1008 21:48:40.248958 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8t5fb/crc-debug-gpp8w" event={"ID":"2389fad9-b9d8-4045-a61d-46cce29c5161","Type":"ContainerDied","Data":"605d3b2ccd2ba7404a32eede44c75f66a8a25edaab4305960d83fc5eb0ea8981"} Oct 08 21:48:40 crc kubenswrapper[4669]: I1008 21:48:40.248988 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8t5fb/crc-debug-gpp8w" event={"ID":"2389fad9-b9d8-4045-a61d-46cce29c5161","Type":"ContainerStarted","Data":"8eb803608358d9c0cf62d3ca0e50da4abf51d176c7cb5f20cc2ebdda44b28a16"} Oct 08 21:48:40 crc kubenswrapper[4669]: I1008 21:48:40.249646 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tsxs4" Oct 08 21:48:40 crc kubenswrapper[4669]: I1008 21:48:40.320732 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tsxs4" Oct 08 21:48:40 crc kubenswrapper[4669]: I1008 21:48:40.498697 4669 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/redhat-operators-tsxs4"] Oct 08 21:48:40 crc kubenswrapper[4669]: I1008 21:48:40.691278 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8t5fb/crc-debug-gpp8w"] Oct 08 21:48:40 crc kubenswrapper[4669]: I1008 21:48:40.699176 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8t5fb/crc-debug-gpp8w"] Oct 08 21:48:41 crc kubenswrapper[4669]: I1008 21:48:41.338149 4669 scope.go:117] "RemoveContainer" containerID="d99f2404f55700ee94ff2a27e207059d96fd41084a3b30b6aa74eb587d837286" Oct 08 21:48:41 crc kubenswrapper[4669]: E1008 21:48:41.338615 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw2kf_openshift-machine-config-operator(39c9bcf2-9580-4534-8c7e-886bd4aff469)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" Oct 08 21:48:41 crc kubenswrapper[4669]: I1008 21:48:41.358163 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8t5fb/crc-debug-gpp8w" Oct 08 21:48:41 crc kubenswrapper[4669]: I1008 21:48:41.514855 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cbzh\" (UniqueName: \"kubernetes.io/projected/2389fad9-b9d8-4045-a61d-46cce29c5161-kube-api-access-5cbzh\") pod \"2389fad9-b9d8-4045-a61d-46cce29c5161\" (UID: \"2389fad9-b9d8-4045-a61d-46cce29c5161\") " Oct 08 21:48:41 crc kubenswrapper[4669]: I1008 21:48:41.515358 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2389fad9-b9d8-4045-a61d-46cce29c5161-host\") pod \"2389fad9-b9d8-4045-a61d-46cce29c5161\" (UID: \"2389fad9-b9d8-4045-a61d-46cce29c5161\") " Oct 08 21:48:41 crc kubenswrapper[4669]: I1008 21:48:41.515415 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2389fad9-b9d8-4045-a61d-46cce29c5161-host" (OuterVolumeSpecName: "host") pod "2389fad9-b9d8-4045-a61d-46cce29c5161" (UID: "2389fad9-b9d8-4045-a61d-46cce29c5161"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 21:48:41 crc kubenswrapper[4669]: I1008 21:48:41.515970 4669 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2389fad9-b9d8-4045-a61d-46cce29c5161-host\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:41 crc kubenswrapper[4669]: I1008 21:48:41.521514 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2389fad9-b9d8-4045-a61d-46cce29c5161-kube-api-access-5cbzh" (OuterVolumeSpecName: "kube-api-access-5cbzh") pod "2389fad9-b9d8-4045-a61d-46cce29c5161" (UID: "2389fad9-b9d8-4045-a61d-46cce29c5161"). InnerVolumeSpecName "kube-api-access-5cbzh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:41 crc kubenswrapper[4669]: I1008 21:48:41.618288 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cbzh\" (UniqueName: \"kubernetes.io/projected/2389fad9-b9d8-4045-a61d-46cce29c5161-kube-api-access-5cbzh\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:41 crc kubenswrapper[4669]: I1008 21:48:41.849257 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8t5fb/crc-debug-k9t5v"] Oct 08 21:48:41 crc kubenswrapper[4669]: E1008 21:48:41.849601 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2389fad9-b9d8-4045-a61d-46cce29c5161" containerName="container-00" Oct 08 21:48:41 crc kubenswrapper[4669]: I1008 21:48:41.849618 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="2389fad9-b9d8-4045-a61d-46cce29c5161" containerName="container-00" Oct 08 21:48:41 crc kubenswrapper[4669]: I1008 21:48:41.849826 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="2389fad9-b9d8-4045-a61d-46cce29c5161" containerName="container-00" Oct 08 21:48:41 crc kubenswrapper[4669]: I1008 21:48:41.850439 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8t5fb/crc-debug-k9t5v" Oct 08 21:48:42 crc kubenswrapper[4669]: I1008 21:48:42.025325 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e25e76e6-9ed5-4324-811f-11d69c9a8c0e-host\") pod \"crc-debug-k9t5v\" (UID: \"e25e76e6-9ed5-4324-811f-11d69c9a8c0e\") " pod="openshift-must-gather-8t5fb/crc-debug-k9t5v" Oct 08 21:48:42 crc kubenswrapper[4669]: I1008 21:48:42.025562 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rz7m\" (UniqueName: \"kubernetes.io/projected/e25e76e6-9ed5-4324-811f-11d69c9a8c0e-kube-api-access-6rz7m\") pod \"crc-debug-k9t5v\" (UID: \"e25e76e6-9ed5-4324-811f-11d69c9a8c0e\") " pod="openshift-must-gather-8t5fb/crc-debug-k9t5v" Oct 08 21:48:42 crc kubenswrapper[4669]: I1008 21:48:42.127902 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rz7m\" (UniqueName: \"kubernetes.io/projected/e25e76e6-9ed5-4324-811f-11d69c9a8c0e-kube-api-access-6rz7m\") pod \"crc-debug-k9t5v\" (UID: \"e25e76e6-9ed5-4324-811f-11d69c9a8c0e\") " pod="openshift-must-gather-8t5fb/crc-debug-k9t5v" Oct 08 21:48:42 crc kubenswrapper[4669]: I1008 21:48:42.128127 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e25e76e6-9ed5-4324-811f-11d69c9a8c0e-host\") pod \"crc-debug-k9t5v\" (UID: \"e25e76e6-9ed5-4324-811f-11d69c9a8c0e\") " pod="openshift-must-gather-8t5fb/crc-debug-k9t5v" Oct 08 21:48:42 crc kubenswrapper[4669]: I1008 21:48:42.128250 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e25e76e6-9ed5-4324-811f-11d69c9a8c0e-host\") pod \"crc-debug-k9t5v\" (UID: \"e25e76e6-9ed5-4324-811f-11d69c9a8c0e\") " pod="openshift-must-gather-8t5fb/crc-debug-k9t5v" Oct 08 21:48:42 crc 
kubenswrapper[4669]: I1008 21:48:42.161995 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rz7m\" (UniqueName: \"kubernetes.io/projected/e25e76e6-9ed5-4324-811f-11d69c9a8c0e-kube-api-access-6rz7m\") pod \"crc-debug-k9t5v\" (UID: \"e25e76e6-9ed5-4324-811f-11d69c9a8c0e\") " pod="openshift-must-gather-8t5fb/crc-debug-k9t5v" Oct 08 21:48:42 crc kubenswrapper[4669]: I1008 21:48:42.168877 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8t5fb/crc-debug-k9t5v" Oct 08 21:48:42 crc kubenswrapper[4669]: W1008 21:48:42.197115 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode25e76e6_9ed5_4324_811f_11d69c9a8c0e.slice/crio-06bc2003d936f29ee8ff20bdc25a38463122ec376037ba112838e12844847a68 WatchSource:0}: Error finding container 06bc2003d936f29ee8ff20bdc25a38463122ec376037ba112838e12844847a68: Status 404 returned error can't find the container with id 06bc2003d936f29ee8ff20bdc25a38463122ec376037ba112838e12844847a68 Oct 08 21:48:42 crc kubenswrapper[4669]: I1008 21:48:42.266078 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8t5fb/crc-debug-k9t5v" event={"ID":"e25e76e6-9ed5-4324-811f-11d69c9a8c0e","Type":"ContainerStarted","Data":"06bc2003d936f29ee8ff20bdc25a38463122ec376037ba112838e12844847a68"} Oct 08 21:48:42 crc kubenswrapper[4669]: I1008 21:48:42.268126 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8t5fb/crc-debug-gpp8w" Oct 08 21:48:42 crc kubenswrapper[4669]: I1008 21:48:42.268190 4669 scope.go:117] "RemoveContainer" containerID="605d3b2ccd2ba7404a32eede44c75f66a8a25edaab4305960d83fc5eb0ea8981" Oct 08 21:48:42 crc kubenswrapper[4669]: I1008 21:48:42.268496 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tsxs4" podUID="8e6ad776-df59-4a8c-9ee7-54120fe14a82" containerName="registry-server" containerID="cri-o://e0b2caf0d2b7c35c7d0c16f7ce57c8b94184bb4b225bcae8bd296c72b6fcd3d2" gracePeriod=2 Oct 08 21:48:42 crc kubenswrapper[4669]: I1008 21:48:42.842265 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tsxs4" Oct 08 21:48:42 crc kubenswrapper[4669]: I1008 21:48:42.941515 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtpl8\" (UniqueName: \"kubernetes.io/projected/8e6ad776-df59-4a8c-9ee7-54120fe14a82-kube-api-access-dtpl8\") pod \"8e6ad776-df59-4a8c-9ee7-54120fe14a82\" (UID: \"8e6ad776-df59-4a8c-9ee7-54120fe14a82\") " Oct 08 21:48:42 crc kubenswrapper[4669]: I1008 21:48:42.941975 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e6ad776-df59-4a8c-9ee7-54120fe14a82-catalog-content\") pod \"8e6ad776-df59-4a8c-9ee7-54120fe14a82\" (UID: \"8e6ad776-df59-4a8c-9ee7-54120fe14a82\") " Oct 08 21:48:42 crc kubenswrapper[4669]: I1008 21:48:42.942036 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e6ad776-df59-4a8c-9ee7-54120fe14a82-utilities\") pod \"8e6ad776-df59-4a8c-9ee7-54120fe14a82\" (UID: \"8e6ad776-df59-4a8c-9ee7-54120fe14a82\") " Oct 08 21:48:42 crc kubenswrapper[4669]: I1008 21:48:42.943673 4669 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e6ad776-df59-4a8c-9ee7-54120fe14a82-utilities" (OuterVolumeSpecName: "utilities") pod "8e6ad776-df59-4a8c-9ee7-54120fe14a82" (UID: "8e6ad776-df59-4a8c-9ee7-54120fe14a82"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:48:42 crc kubenswrapper[4669]: I1008 21:48:42.949792 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e6ad776-df59-4a8c-9ee7-54120fe14a82-kube-api-access-dtpl8" (OuterVolumeSpecName: "kube-api-access-dtpl8") pod "8e6ad776-df59-4a8c-9ee7-54120fe14a82" (UID: "8e6ad776-df59-4a8c-9ee7-54120fe14a82"). InnerVolumeSpecName "kube-api-access-dtpl8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:43 crc kubenswrapper[4669]: I1008 21:48:43.038425 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e6ad776-df59-4a8c-9ee7-54120fe14a82-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8e6ad776-df59-4a8c-9ee7-54120fe14a82" (UID: "8e6ad776-df59-4a8c-9ee7-54120fe14a82"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:48:43 crc kubenswrapper[4669]: I1008 21:48:43.043767 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtpl8\" (UniqueName: \"kubernetes.io/projected/8e6ad776-df59-4a8c-9ee7-54120fe14a82-kube-api-access-dtpl8\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:43 crc kubenswrapper[4669]: I1008 21:48:43.043810 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e6ad776-df59-4a8c-9ee7-54120fe14a82-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:43 crc kubenswrapper[4669]: I1008 21:48:43.043827 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e6ad776-df59-4a8c-9ee7-54120fe14a82-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:43 crc kubenswrapper[4669]: I1008 21:48:43.283920 4669 generic.go:334] "Generic (PLEG): container finished" podID="8e6ad776-df59-4a8c-9ee7-54120fe14a82" containerID="e0b2caf0d2b7c35c7d0c16f7ce57c8b94184bb4b225bcae8bd296c72b6fcd3d2" exitCode=0 Oct 08 21:48:43 crc kubenswrapper[4669]: I1008 21:48:43.284017 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tsxs4" event={"ID":"8e6ad776-df59-4a8c-9ee7-54120fe14a82","Type":"ContainerDied","Data":"e0b2caf0d2b7c35c7d0c16f7ce57c8b94184bb4b225bcae8bd296c72b6fcd3d2"} Oct 08 21:48:43 crc kubenswrapper[4669]: I1008 21:48:43.284052 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tsxs4" event={"ID":"8e6ad776-df59-4a8c-9ee7-54120fe14a82","Type":"ContainerDied","Data":"21597b9563324dfdccede2f6988495acc6c4cfb0e0997ef50eafecf5e6bfe09f"} Oct 08 21:48:43 crc kubenswrapper[4669]: I1008 21:48:43.284033 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tsxs4" Oct 08 21:48:43 crc kubenswrapper[4669]: I1008 21:48:43.284078 4669 scope.go:117] "RemoveContainer" containerID="e0b2caf0d2b7c35c7d0c16f7ce57c8b94184bb4b225bcae8bd296c72b6fcd3d2" Oct 08 21:48:43 crc kubenswrapper[4669]: I1008 21:48:43.285955 4669 generic.go:334] "Generic (PLEG): container finished" podID="e25e76e6-9ed5-4324-811f-11d69c9a8c0e" containerID="95a2f02fa422ee0a00cfd71755fa9295b0ef808f8a8ed824b0a8d48cd3ae5841" exitCode=0 Oct 08 21:48:43 crc kubenswrapper[4669]: I1008 21:48:43.286000 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8t5fb/crc-debug-k9t5v" event={"ID":"e25e76e6-9ed5-4324-811f-11d69c9a8c0e","Type":"ContainerDied","Data":"95a2f02fa422ee0a00cfd71755fa9295b0ef808f8a8ed824b0a8d48cd3ae5841"} Oct 08 21:48:43 crc kubenswrapper[4669]: I1008 21:48:43.329493 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8t5fb/crc-debug-k9t5v"] Oct 08 21:48:43 crc kubenswrapper[4669]: I1008 21:48:43.344049 4669 scope.go:117] "RemoveContainer" containerID="9e7736e3b113a9cd3a838a36c20c5fe4a936cc236c0bd6f88f190b816ab65721" Oct 08 21:48:43 crc kubenswrapper[4669]: I1008 21:48:43.357012 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2389fad9-b9d8-4045-a61d-46cce29c5161" path="/var/lib/kubelet/pods/2389fad9-b9d8-4045-a61d-46cce29c5161/volumes" Oct 08 21:48:43 crc kubenswrapper[4669]: I1008 21:48:43.358247 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8t5fb/crc-debug-k9t5v"] Oct 08 21:48:43 crc kubenswrapper[4669]: I1008 21:48:43.358307 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tsxs4"] Oct 08 21:48:43 crc kubenswrapper[4669]: I1008 21:48:43.358326 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tsxs4"] Oct 08 21:48:43 crc kubenswrapper[4669]: I1008 21:48:43.369154 
4669 scope.go:117] "RemoveContainer" containerID="84b907b8ff2ca5854818a80a75b22765bd5849e51f3505643e62ea34a9b167d6" Oct 08 21:48:43 crc kubenswrapper[4669]: I1008 21:48:43.417276 4669 scope.go:117] "RemoveContainer" containerID="e0b2caf0d2b7c35c7d0c16f7ce57c8b94184bb4b225bcae8bd296c72b6fcd3d2" Oct 08 21:48:43 crc kubenswrapper[4669]: E1008 21:48:43.418977 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0b2caf0d2b7c35c7d0c16f7ce57c8b94184bb4b225bcae8bd296c72b6fcd3d2\": container with ID starting with e0b2caf0d2b7c35c7d0c16f7ce57c8b94184bb4b225bcae8bd296c72b6fcd3d2 not found: ID does not exist" containerID="e0b2caf0d2b7c35c7d0c16f7ce57c8b94184bb4b225bcae8bd296c72b6fcd3d2" Oct 08 21:48:43 crc kubenswrapper[4669]: I1008 21:48:43.419043 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0b2caf0d2b7c35c7d0c16f7ce57c8b94184bb4b225bcae8bd296c72b6fcd3d2"} err="failed to get container status \"e0b2caf0d2b7c35c7d0c16f7ce57c8b94184bb4b225bcae8bd296c72b6fcd3d2\": rpc error: code = NotFound desc = could not find container \"e0b2caf0d2b7c35c7d0c16f7ce57c8b94184bb4b225bcae8bd296c72b6fcd3d2\": container with ID starting with e0b2caf0d2b7c35c7d0c16f7ce57c8b94184bb4b225bcae8bd296c72b6fcd3d2 not found: ID does not exist" Oct 08 21:48:43 crc kubenswrapper[4669]: I1008 21:48:43.419071 4669 scope.go:117] "RemoveContainer" containerID="9e7736e3b113a9cd3a838a36c20c5fe4a936cc236c0bd6f88f190b816ab65721" Oct 08 21:48:43 crc kubenswrapper[4669]: E1008 21:48:43.419738 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e7736e3b113a9cd3a838a36c20c5fe4a936cc236c0bd6f88f190b816ab65721\": container with ID starting with 9e7736e3b113a9cd3a838a36c20c5fe4a936cc236c0bd6f88f190b816ab65721 not found: ID does not exist" containerID="9e7736e3b113a9cd3a838a36c20c5fe4a936cc236c0bd6f88f190b816ab65721" 
Oct 08 21:48:43 crc kubenswrapper[4669]: I1008 21:48:43.419769 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e7736e3b113a9cd3a838a36c20c5fe4a936cc236c0bd6f88f190b816ab65721"} err="failed to get container status \"9e7736e3b113a9cd3a838a36c20c5fe4a936cc236c0bd6f88f190b816ab65721\": rpc error: code = NotFound desc = could not find container \"9e7736e3b113a9cd3a838a36c20c5fe4a936cc236c0bd6f88f190b816ab65721\": container with ID starting with 9e7736e3b113a9cd3a838a36c20c5fe4a936cc236c0bd6f88f190b816ab65721 not found: ID does not exist" Oct 08 21:48:43 crc kubenswrapper[4669]: I1008 21:48:43.419784 4669 scope.go:117] "RemoveContainer" containerID="84b907b8ff2ca5854818a80a75b22765bd5849e51f3505643e62ea34a9b167d6" Oct 08 21:48:43 crc kubenswrapper[4669]: E1008 21:48:43.424293 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84b907b8ff2ca5854818a80a75b22765bd5849e51f3505643e62ea34a9b167d6\": container with ID starting with 84b907b8ff2ca5854818a80a75b22765bd5849e51f3505643e62ea34a9b167d6 not found: ID does not exist" containerID="84b907b8ff2ca5854818a80a75b22765bd5849e51f3505643e62ea34a9b167d6" Oct 08 21:48:43 crc kubenswrapper[4669]: I1008 21:48:43.424338 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84b907b8ff2ca5854818a80a75b22765bd5849e51f3505643e62ea34a9b167d6"} err="failed to get container status \"84b907b8ff2ca5854818a80a75b22765bd5849e51f3505643e62ea34a9b167d6\": rpc error: code = NotFound desc = could not find container \"84b907b8ff2ca5854818a80a75b22765bd5849e51f3505643e62ea34a9b167d6\": container with ID starting with 84b907b8ff2ca5854818a80a75b22765bd5849e51f3505643e62ea34a9b167d6 not found: ID does not exist" Oct 08 21:48:44 crc kubenswrapper[4669]: I1008 21:48:44.387354 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8t5fb/crc-debug-k9t5v" Oct 08 21:48:44 crc kubenswrapper[4669]: I1008 21:48:44.472213 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rz7m\" (UniqueName: \"kubernetes.io/projected/e25e76e6-9ed5-4324-811f-11d69c9a8c0e-kube-api-access-6rz7m\") pod \"e25e76e6-9ed5-4324-811f-11d69c9a8c0e\" (UID: \"e25e76e6-9ed5-4324-811f-11d69c9a8c0e\") " Oct 08 21:48:44 crc kubenswrapper[4669]: I1008 21:48:44.472487 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e25e76e6-9ed5-4324-811f-11d69c9a8c0e-host\") pod \"e25e76e6-9ed5-4324-811f-11d69c9a8c0e\" (UID: \"e25e76e6-9ed5-4324-811f-11d69c9a8c0e\") " Oct 08 21:48:44 crc kubenswrapper[4669]: I1008 21:48:44.472643 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e25e76e6-9ed5-4324-811f-11d69c9a8c0e-host" (OuterVolumeSpecName: "host") pod "e25e76e6-9ed5-4324-811f-11d69c9a8c0e" (UID: "e25e76e6-9ed5-4324-811f-11d69c9a8c0e"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 21:48:44 crc kubenswrapper[4669]: I1008 21:48:44.473169 4669 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e25e76e6-9ed5-4324-811f-11d69c9a8c0e-host\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:44 crc kubenswrapper[4669]: I1008 21:48:44.476988 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e25e76e6-9ed5-4324-811f-11d69c9a8c0e-kube-api-access-6rz7m" (OuterVolumeSpecName: "kube-api-access-6rz7m") pod "e25e76e6-9ed5-4324-811f-11d69c9a8c0e" (UID: "e25e76e6-9ed5-4324-811f-11d69c9a8c0e"). InnerVolumeSpecName "kube-api-access-6rz7m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:44 crc kubenswrapper[4669]: I1008 21:48:44.574573 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rz7m\" (UniqueName: \"kubernetes.io/projected/e25e76e6-9ed5-4324-811f-11d69c9a8c0e-kube-api-access-6rz7m\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:45 crc kubenswrapper[4669]: I1008 21:48:45.307122 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06bc2003d936f29ee8ff20bdc25a38463122ec376037ba112838e12844847a68" Oct 08 21:48:45 crc kubenswrapper[4669]: I1008 21:48:45.307192 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8t5fb/crc-debug-k9t5v" Oct 08 21:48:45 crc kubenswrapper[4669]: I1008 21:48:45.356349 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e6ad776-df59-4a8c-9ee7-54120fe14a82" path="/var/lib/kubelet/pods/8e6ad776-df59-4a8c-9ee7-54120fe14a82/volumes" Oct 08 21:48:45 crc kubenswrapper[4669]: I1008 21:48:45.358435 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e25e76e6-9ed5-4324-811f-11d69c9a8c0e" path="/var/lib/kubelet/pods/e25e76e6-9ed5-4324-811f-11d69c9a8c0e/volumes" Oct 08 21:48:54 crc kubenswrapper[4669]: I1008 21:48:54.330421 4669 scope.go:117] "RemoveContainer" containerID="d99f2404f55700ee94ff2a27e207059d96fd41084a3b30b6aa74eb587d837286" Oct 08 21:48:55 crc kubenswrapper[4669]: I1008 21:48:55.421907 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" event={"ID":"39c9bcf2-9580-4534-8c7e-886bd4aff469","Type":"ContainerStarted","Data":"fd62feb27381ee2613e0b172d6c6d469c4cfe6dca045c2e24055cdd635324b18"} Oct 08 21:48:59 crc kubenswrapper[4669]: I1008 21:48:59.315090 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5fbf49878d-2r5dt_6311b31e-a85f-4bc0-9c1a-254c1650ef17/barbican-api/0.log" Oct 08 
21:48:59 crc kubenswrapper[4669]: I1008 21:48:59.419926 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5fbf49878d-2r5dt_6311b31e-a85f-4bc0-9c1a-254c1650ef17/barbican-api-log/0.log" Oct 08 21:48:59 crc kubenswrapper[4669]: I1008 21:48:59.540173 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-69d49dbff8-2tq27_b1994a50-3452-488f-b364-1b1377cfd62d/barbican-keystone-listener/0.log" Oct 08 21:48:59 crc kubenswrapper[4669]: I1008 21:48:59.564250 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-69d49dbff8-2tq27_b1994a50-3452-488f-b364-1b1377cfd62d/barbican-keystone-listener-log/0.log" Oct 08 21:48:59 crc kubenswrapper[4669]: I1008 21:48:59.692032 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-8596cf69cc-5lmdk_ae3a1ea5-4d35-4e2a-b87b-e393bf16b90c/barbican-worker/0.log" Oct 08 21:48:59 crc kubenswrapper[4669]: I1008 21:48:59.736292 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-8596cf69cc-5lmdk_ae3a1ea5-4d35-4e2a-b87b-e393bf16b90c/barbican-worker-log/0.log" Oct 08 21:48:59 crc kubenswrapper[4669]: I1008 21:48:59.827027 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-s69cr_d54b9af7-032e-4b63-ada5-0cebab9e052d/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 21:49:00 crc kubenswrapper[4669]: I1008 21:49:00.029925 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5e6cd0ba-8231-4bc6-bab5-83f4b8740c01/ceilometer-notification-agent/0.log" Oct 08 21:49:00 crc kubenswrapper[4669]: I1008 21:49:00.053932 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5e6cd0ba-8231-4bc6-bab5-83f4b8740c01/ceilometer-central-agent/0.log" Oct 08 21:49:00 crc kubenswrapper[4669]: I1008 21:49:00.076732 4669 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5e6cd0ba-8231-4bc6-bab5-83f4b8740c01/proxy-httpd/0.log" Oct 08 21:49:00 crc kubenswrapper[4669]: I1008 21:49:00.139375 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5e6cd0ba-8231-4bc6-bab5-83f4b8740c01/sg-core/0.log" Oct 08 21:49:00 crc kubenswrapper[4669]: I1008 21:49:00.278806 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_42082ba5-0485-486f-8b1c-17cf7c0fc405/cinder-api/0.log" Oct 08 21:49:00 crc kubenswrapper[4669]: I1008 21:49:00.289496 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_42082ba5-0485-486f-8b1c-17cf7c0fc405/cinder-api-log/0.log" Oct 08 21:49:00 crc kubenswrapper[4669]: I1008 21:49:00.510921 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_3873e946-b682-46b0-9b31-c34217bed686/probe/0.log" Oct 08 21:49:00 crc kubenswrapper[4669]: I1008 21:49:00.544564 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_3873e946-b682-46b0-9b31-c34217bed686/cinder-scheduler/0.log" Oct 08 21:49:00 crc kubenswrapper[4669]: I1008 21:49:00.585805 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-6gzjx_e84616ce-4d73-4f8f-85b3-cca04e509792/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 21:49:00 crc kubenswrapper[4669]: I1008 21:49:00.757932 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-6xbff_1d09eff3-6572-462c-91db-4a1f5f167eae/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 21:49:00 crc kubenswrapper[4669]: I1008 21:49:00.826172 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-tgph4_e78c98eb-ee71-4877-92f0-edfa9eaca8e5/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 21:49:00 crc kubenswrapper[4669]: I1008 21:49:00.968162 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-czgxg_2626e058-7115-4198-91ab-19e6f98dfc89/init/0.log" Oct 08 21:49:01 crc kubenswrapper[4669]: I1008 21:49:01.152225 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-czgxg_2626e058-7115-4198-91ab-19e6f98dfc89/dnsmasq-dns/0.log" Oct 08 21:49:01 crc kubenswrapper[4669]: I1008 21:49:01.170557 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-czgxg_2626e058-7115-4198-91ab-19e6f98dfc89/init/0.log" Oct 08 21:49:01 crc kubenswrapper[4669]: I1008 21:49:01.180667 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-2d5kw_8f08a947-ff60-4018-80f6-0098a257eddf/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 21:49:01 crc kubenswrapper[4669]: I1008 21:49:01.336814 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_8b41c8f6-5c23-40d7-9e16-d21a6faf4c0e/glance-log/0.log" Oct 08 21:49:01 crc kubenswrapper[4669]: I1008 21:49:01.388391 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_8b41c8f6-5c23-40d7-9e16-d21a6faf4c0e/glance-httpd/0.log" Oct 08 21:49:01 crc kubenswrapper[4669]: I1008 21:49:01.502994 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_7c63c33c-d91e-49b9-8b85-50960824149b/glance-log/0.log" Oct 08 21:49:01 crc kubenswrapper[4669]: I1008 21:49:01.537817 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_7c63c33c-d91e-49b9-8b85-50960824149b/glance-httpd/0.log" Oct 08 21:49:01 crc kubenswrapper[4669]: I1008 21:49:01.671661 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5fc57f5668-z5dzm_43e0f642-1a58-481b-8347-b4d29176ddc5/horizon/0.log" Oct 08 21:49:01 crc kubenswrapper[4669]: I1008 21:49:01.906951 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-72m9r_296bfc36-ce85-4db3-a692-acf7edf869b1/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 21:49:01 crc kubenswrapper[4669]: I1008 21:49:01.999937 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-hk4fz_30bc939f-3290-4c46-8d00-120c0bf33951/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 21:49:02 crc kubenswrapper[4669]: I1008 21:49:02.145102 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5fc57f5668-z5dzm_43e0f642-1a58-481b-8347-b4d29176ddc5/horizon-log/0.log" Oct 08 21:49:02 crc kubenswrapper[4669]: I1008 21:49:02.307592 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-6877c668b7-ws6lk_d35cc358-f26a-4e29-a61e-cf7e82c331a7/keystone-api/0.log" Oct 08 21:49:02 crc kubenswrapper[4669]: I1008 21:49:02.333977 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_8af63aa0-e5df-488c-b5e4-4677c9d0f2de/kube-state-metrics/0.log" Oct 08 21:49:02 crc kubenswrapper[4669]: I1008 21:49:02.471320 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-ph9v2_f0494a3d-36c9-4d26-8f15-c1780af52f46/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 21:49:02 crc kubenswrapper[4669]: I1008 21:49:02.706427 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-6f968c55b5-frgnz_1e273869-5ed1-48c3-af8f-d4d2df61c9e7/neutron-api/0.log" Oct 08 21:49:02 crc kubenswrapper[4669]: I1008 21:49:02.799701 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6f968c55b5-frgnz_1e273869-5ed1-48c3-af8f-d4d2df61c9e7/neutron-httpd/0.log" Oct 08 21:49:02 crc kubenswrapper[4669]: I1008 21:49:02.935136 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-954pg_2c0c5d80-cf44-45bb-847d-839dc3fd8887/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 21:49:03 crc kubenswrapper[4669]: I1008 21:49:03.339382 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_26677aac-fbac-4ec9-972c-e22c276549f2/nova-api-log/0.log" Oct 08 21:49:03 crc kubenswrapper[4669]: I1008 21:49:03.437682 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_eb3e439c-741b-4d40-a85e-1f6da93a485c/nova-cell0-conductor-conductor/0.log" Oct 08 21:49:03 crc kubenswrapper[4669]: I1008 21:49:03.754839 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_7c4f1ade-7c5a-45f6-8b72-33a192186209/nova-cell1-conductor-conductor/0.log" Oct 08 21:49:03 crc kubenswrapper[4669]: I1008 21:49:03.830969 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_915f256e-a280-4726-8987-df1df9f8e4b5/nova-cell1-novncproxy-novncproxy/0.log" Oct 08 21:49:03 crc kubenswrapper[4669]: I1008 21:49:03.858844 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_26677aac-fbac-4ec9-972c-e22c276549f2/nova-api-api/0.log" Oct 08 21:49:03 crc kubenswrapper[4669]: I1008 21:49:03.998629 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-tj7fq_9500e0b9-017f-4e4c-b72c-cbe0c98f7660/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 21:49:04 crc kubenswrapper[4669]: I1008 21:49:04.186356 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_0f57d146-c85b-4beb-a614-1ea878e175b4/nova-metadata-log/0.log" Oct 08 21:49:04 crc kubenswrapper[4669]: I1008 21:49:04.493094 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_53fa562b-eb62-487e-8b82-3da0799fae19/mysql-bootstrap/0.log" Oct 08 21:49:04 crc kubenswrapper[4669]: I1008 21:49:04.534361 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_89cdc120-36f5-4203-b064-4300a8249a64/nova-scheduler-scheduler/0.log" Oct 08 21:49:04 crc kubenswrapper[4669]: I1008 21:49:04.670454 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_53fa562b-eb62-487e-8b82-3da0799fae19/mysql-bootstrap/0.log" Oct 08 21:49:04 crc kubenswrapper[4669]: I1008 21:49:04.712691 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_53fa562b-eb62-487e-8b82-3da0799fae19/galera/0.log" Oct 08 21:49:04 crc kubenswrapper[4669]: I1008 21:49:04.872676 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_33dac706-1170-45a3-8151-a6ee9bce8005/mysql-bootstrap/0.log" Oct 08 21:49:05 crc kubenswrapper[4669]: I1008 21:49:05.073701 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_33dac706-1170-45a3-8151-a6ee9bce8005/mysql-bootstrap/0.log" Oct 08 21:49:05 crc kubenswrapper[4669]: I1008 21:49:05.079301 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_33dac706-1170-45a3-8151-a6ee9bce8005/galera/0.log" Oct 08 21:49:05 crc kubenswrapper[4669]: I1008 21:49:05.282348 4669 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_openstackclient_2aa3bf86-2604-4d46-bc73-13b5d049b01c/openstackclient/0.log" Oct 08 21:49:05 crc kubenswrapper[4669]: I1008 21:49:05.352901 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-2mvd2_2e96f6f2-b5f3-49e7-8d84-15d5535963a2/ovn-controller/0.log" Oct 08 21:49:05 crc kubenswrapper[4669]: I1008 21:49:05.423706 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_0f57d146-c85b-4beb-a614-1ea878e175b4/nova-metadata-metadata/0.log" Oct 08 21:49:05 crc kubenswrapper[4669]: I1008 21:49:05.478574 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-l4snx_2bc4b231-fe5a-4712-855f-3d56029e240b/openstack-network-exporter/0.log" Oct 08 21:49:05 crc kubenswrapper[4669]: I1008 21:49:05.628652 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-wnkk4_d1b7a290-4c6b-48ff-b23b-7fe2ba609dcc/ovsdb-server-init/0.log" Oct 08 21:49:05 crc kubenswrapper[4669]: I1008 21:49:05.821618 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-wnkk4_d1b7a290-4c6b-48ff-b23b-7fe2ba609dcc/ovsdb-server-init/0.log" Oct 08 21:49:05 crc kubenswrapper[4669]: I1008 21:49:05.831224 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-wnkk4_d1b7a290-4c6b-48ff-b23b-7fe2ba609dcc/ovsdb-server/0.log" Oct 08 21:49:05 crc kubenswrapper[4669]: I1008 21:49:05.876989 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-wnkk4_d1b7a290-4c6b-48ff-b23b-7fe2ba609dcc/ovs-vswitchd/0.log" Oct 08 21:49:06 crc kubenswrapper[4669]: I1008 21:49:06.067195 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-nlnn5_eae3dd15-c997-43e0-8362-8a9210634436/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 21:49:06 crc kubenswrapper[4669]: I1008 
21:49:06.247378 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_685f13e1-1e56-46fb-b0b4-d850050411d7/openstack-network-exporter/0.log" Oct 08 21:49:06 crc kubenswrapper[4669]: I1008 21:49:06.309695 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_685f13e1-1e56-46fb-b0b4-d850050411d7/ovn-northd/0.log" Oct 08 21:49:06 crc kubenswrapper[4669]: I1008 21:49:06.419589 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_70b1241c-d97a-4af6-9c95-dadad197012e/openstack-network-exporter/0.log" Oct 08 21:49:06 crc kubenswrapper[4669]: I1008 21:49:06.515327 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_70b1241c-d97a-4af6-9c95-dadad197012e/ovsdbserver-nb/0.log" Oct 08 21:49:06 crc kubenswrapper[4669]: I1008 21:49:06.646820 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_07bcdc9f-6970-49af-8620-63f8ed43845b/openstack-network-exporter/0.log" Oct 08 21:49:06 crc kubenswrapper[4669]: I1008 21:49:06.693097 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_07bcdc9f-6970-49af-8620-63f8ed43845b/ovsdbserver-sb/0.log" Oct 08 21:49:06 crc kubenswrapper[4669]: I1008 21:49:06.807341 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6f47799b5d-n27h7_9076c9d7-726e-4b80-80af-a78887da72d1/placement-api/0.log" Oct 08 21:49:06 crc kubenswrapper[4669]: I1008 21:49:06.958331 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6f47799b5d-n27h7_9076c9d7-726e-4b80-80af-a78887da72d1/placement-log/0.log" Oct 08 21:49:06 crc kubenswrapper[4669]: I1008 21:49:06.973929 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_8e35f189-cd14-4892-a4d6-25a23a2ae04c/setup-container/0.log" Oct 08 21:49:07 crc kubenswrapper[4669]: I1008 21:49:07.182742 4669 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_8e35f189-cd14-4892-a4d6-25a23a2ae04c/rabbitmq/0.log" Oct 08 21:49:07 crc kubenswrapper[4669]: I1008 21:49:07.206377 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_8e35f189-cd14-4892-a4d6-25a23a2ae04c/setup-container/0.log" Oct 08 21:49:07 crc kubenswrapper[4669]: I1008 21:49:07.258355 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_0bfeeb02-715e-4358-802c-ce7ed6721a30/setup-container/0.log" Oct 08 21:49:07 crc kubenswrapper[4669]: I1008 21:49:07.446757 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_0bfeeb02-715e-4358-802c-ce7ed6721a30/setup-container/0.log" Oct 08 21:49:07 crc kubenswrapper[4669]: I1008 21:49:07.464120 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_0bfeeb02-715e-4358-802c-ce7ed6721a30/rabbitmq/0.log" Oct 08 21:49:07 crc kubenswrapper[4669]: I1008 21:49:07.511774 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-sxvrk_ffbc24f2-90eb-4f42-b2aa-0290921dbb79/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 21:49:07 crc kubenswrapper[4669]: I1008 21:49:07.712969 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-b5rp4_405955bf-c08c-4afd-9720-41adf4bebd19/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 21:49:07 crc kubenswrapper[4669]: I1008 21:49:07.743381 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-n4czl_0bff2321-3d96-47bf-815e-7ab3cea9563a/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 21:49:07 crc kubenswrapper[4669]: I1008 21:49:07.918465 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-m6tgw_f598594a-891b-4339-99d1-e10f7c3844af/ssh-known-hosts-edpm-deployment/0.log" Oct 08 21:49:07 crc kubenswrapper[4669]: I1008 21:49:07.944978 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-cl595_2557fef7-913e-4188-93e7-4a60c4b4c918/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 21:49:08 crc kubenswrapper[4669]: I1008 21:49:08.185861 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5f9445f759-bx7xs_895cb1ba-c212-4908-82e9-f5042a50686f/proxy-server/0.log" Oct 08 21:49:08 crc kubenswrapper[4669]: I1008 21:49:08.300223 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5f9445f759-bx7xs_895cb1ba-c212-4908-82e9-f5042a50686f/proxy-httpd/0.log" Oct 08 21:49:08 crc kubenswrapper[4669]: I1008 21:49:08.424712 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-58zf2_f1096bd8-53d4-4403-9abc-dd7a2c91c1e6/swift-ring-rebalance/0.log" Oct 08 21:49:08 crc kubenswrapper[4669]: I1008 21:49:08.501042 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_efef408f-7f0a-4eb1-a9f8-288a9606bf84/account-auditor/0.log" Oct 08 21:49:08 crc kubenswrapper[4669]: I1008 21:49:08.557583 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_efef408f-7f0a-4eb1-a9f8-288a9606bf84/account-reaper/0.log" Oct 08 21:49:08 crc kubenswrapper[4669]: I1008 21:49:08.668964 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_efef408f-7f0a-4eb1-a9f8-288a9606bf84/account-server/0.log" Oct 08 21:49:08 crc kubenswrapper[4669]: I1008 21:49:08.687374 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_efef408f-7f0a-4eb1-a9f8-288a9606bf84/account-replicator/0.log" Oct 08 21:49:08 crc kubenswrapper[4669]: I1008 
21:49:08.710366 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_efef408f-7f0a-4eb1-a9f8-288a9606bf84/container-auditor/0.log" Oct 08 21:49:08 crc kubenswrapper[4669]: I1008 21:49:08.775909 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_efef408f-7f0a-4eb1-a9f8-288a9606bf84/container-replicator/0.log" Oct 08 21:49:08 crc kubenswrapper[4669]: I1008 21:49:08.872800 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_efef408f-7f0a-4eb1-a9f8-288a9606bf84/container-updater/0.log" Oct 08 21:49:08 crc kubenswrapper[4669]: I1008 21:49:08.878918 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_efef408f-7f0a-4eb1-a9f8-288a9606bf84/container-server/0.log" Oct 08 21:49:08 crc kubenswrapper[4669]: I1008 21:49:08.995219 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_efef408f-7f0a-4eb1-a9f8-288a9606bf84/object-expirer/0.log" Oct 08 21:49:09 crc kubenswrapper[4669]: I1008 21:49:09.031008 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_efef408f-7f0a-4eb1-a9f8-288a9606bf84/object-auditor/0.log" Oct 08 21:49:09 crc kubenswrapper[4669]: I1008 21:49:09.054218 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_efef408f-7f0a-4eb1-a9f8-288a9606bf84/object-replicator/0.log" Oct 08 21:49:09 crc kubenswrapper[4669]: I1008 21:49:09.121730 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_efef408f-7f0a-4eb1-a9f8-288a9606bf84/object-server/0.log" Oct 08 21:49:09 crc kubenswrapper[4669]: I1008 21:49:09.210345 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_efef408f-7f0a-4eb1-a9f8-288a9606bf84/object-updater/0.log" Oct 08 21:49:09 crc kubenswrapper[4669]: I1008 21:49:09.261820 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_efef408f-7f0a-4eb1-a9f8-288a9606bf84/rsync/0.log" Oct 08 21:49:09 crc kubenswrapper[4669]: I1008 21:49:09.274889 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_efef408f-7f0a-4eb1-a9f8-288a9606bf84/swift-recon-cron/0.log" Oct 08 21:49:09 crc kubenswrapper[4669]: I1008 21:49:09.463560 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-qjgxb_bb78e48d-ddf1-494e-883f-d9987d2f0f0a/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 21:49:09 crc kubenswrapper[4669]: I1008 21:49:09.506336 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_ad5f7082-536e-477e-a8a3-b5c4945b3b87/tempest-tests-tempest-tests-runner/0.log" Oct 08 21:49:09 crc kubenswrapper[4669]: I1008 21:49:09.666200 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_919e000b-619f-4e18-b6f6-4473d23718c9/test-operator-logs-container/0.log" Oct 08 21:49:09 crc kubenswrapper[4669]: I1008 21:49:09.731426 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-6xtnr_d7d06c6d-e606-429a-b7e6-e6e2609b3b4e/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 21:49:19 crc kubenswrapper[4669]: I1008 21:49:19.866373 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_cb42938c-1da6-4d93-b7be-ff78d294ebf1/memcached/0.log" Oct 08 21:49:33 crc kubenswrapper[4669]: I1008 21:49:33.363692 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7a759e5b31ccacf77cec3bae7360d28b9da09ce26b4a078cf08f610eefvtpmh_a6266c66-d403-46e9-ad99-54beedd6adf4/util/0.log" Oct 08 21:49:33 crc kubenswrapper[4669]: I1008 21:49:33.515793 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_7a759e5b31ccacf77cec3bae7360d28b9da09ce26b4a078cf08f610eefvtpmh_a6266c66-d403-46e9-ad99-54beedd6adf4/pull/0.log" Oct 08 21:49:33 crc kubenswrapper[4669]: I1008 21:49:33.550896 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7a759e5b31ccacf77cec3bae7360d28b9da09ce26b4a078cf08f610eefvtpmh_a6266c66-d403-46e9-ad99-54beedd6adf4/util/0.log" Oct 08 21:49:33 crc kubenswrapper[4669]: I1008 21:49:33.567349 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7a759e5b31ccacf77cec3bae7360d28b9da09ce26b4a078cf08f610eefvtpmh_a6266c66-d403-46e9-ad99-54beedd6adf4/pull/0.log" Oct 08 21:49:33 crc kubenswrapper[4669]: I1008 21:49:33.703407 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7a759e5b31ccacf77cec3bae7360d28b9da09ce26b4a078cf08f610eefvtpmh_a6266c66-d403-46e9-ad99-54beedd6adf4/pull/0.log" Oct 08 21:49:33 crc kubenswrapper[4669]: I1008 21:49:33.715287 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7a759e5b31ccacf77cec3bae7360d28b9da09ce26b4a078cf08f610eefvtpmh_a6266c66-d403-46e9-ad99-54beedd6adf4/util/0.log" Oct 08 21:49:33 crc kubenswrapper[4669]: I1008 21:49:33.717658 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7a759e5b31ccacf77cec3bae7360d28b9da09ce26b4a078cf08f610eefvtpmh_a6266c66-d403-46e9-ad99-54beedd6adf4/extract/0.log" Oct 08 21:49:33 crc kubenswrapper[4669]: I1008 21:49:33.847109 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-64f84fcdbb-rtmfb_5317b6da-a1f3-4a2c-85e4-3ef8fcb018d5/kube-rbac-proxy/0.log" Oct 08 21:49:33 crc kubenswrapper[4669]: I1008 21:49:33.943165 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-64f84fcdbb-rtmfb_5317b6da-a1f3-4a2c-85e4-3ef8fcb018d5/manager/0.log" Oct 08 21:49:33 crc 
kubenswrapper[4669]: I1008 21:49:33.976732 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-59cdc64769-6sdlr_b7c50f2e-fa5d-4b03-be3a-cfd5ecc63b45/kube-rbac-proxy/0.log" Oct 08 21:49:34 crc kubenswrapper[4669]: I1008 21:49:34.066397 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-59cdc64769-6sdlr_b7c50f2e-fa5d-4b03-be3a-cfd5ecc63b45/manager/0.log" Oct 08 21:49:34 crc kubenswrapper[4669]: I1008 21:49:34.127973 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-687df44cdb-m9v76_5f0a0ada-acd3-452b-bd50-b5d634b906c4/kube-rbac-proxy/0.log" Oct 08 21:49:34 crc kubenswrapper[4669]: I1008 21:49:34.171919 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-687df44cdb-m9v76_5f0a0ada-acd3-452b-bd50-b5d634b906c4/manager/0.log" Oct 08 21:49:34 crc kubenswrapper[4669]: I1008 21:49:34.317712 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7bb46cd7d-pjtc2_dc0d4c88-6c32-4498-8025-de3c8b59eaea/kube-rbac-proxy/0.log" Oct 08 21:49:34 crc kubenswrapper[4669]: I1008 21:49:34.354254 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7bb46cd7d-pjtc2_dc0d4c88-6c32-4498-8025-de3c8b59eaea/manager/0.log" Oct 08 21:49:34 crc kubenswrapper[4669]: I1008 21:49:34.462325 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-6d9967f8dd-w2mfj_7185d03c-648b-488d-b1d0-842f8b72e0ff/kube-rbac-proxy/0.log" Oct 08 21:49:34 crc kubenswrapper[4669]: I1008 21:49:34.515619 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_heat-operator-controller-manager-6d9967f8dd-w2mfj_7185d03c-648b-488d-b1d0-842f8b72e0ff/manager/0.log" Oct 08 21:49:34 crc kubenswrapper[4669]: I1008 21:49:34.558762 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d74794d9b-mpbbj_0a7f72b2-7ef9-492e-ada0-a7e64d8fcb7a/kube-rbac-proxy/0.log" Oct 08 21:49:34 crc kubenswrapper[4669]: I1008 21:49:34.654660 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d74794d9b-mpbbj_0a7f72b2-7ef9-492e-ada0-a7e64d8fcb7a/manager/0.log" Oct 08 21:49:34 crc kubenswrapper[4669]: I1008 21:49:34.695861 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-585fc5b659-c2w55_01d6ae8d-9f65-4a30-80fb-135e4eba5a10/kube-rbac-proxy/0.log" Oct 08 21:49:34 crc kubenswrapper[4669]: I1008 21:49:34.866438 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-585fc5b659-c2w55_01d6ae8d-9f65-4a30-80fb-135e4eba5a10/manager/0.log" Oct 08 21:49:34 crc kubenswrapper[4669]: I1008 21:49:34.907353 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-74cb5cbc49-hwmh2_f92c3530-f73f-45fe-84f5-bea451e1aaba/kube-rbac-proxy/0.log" Oct 08 21:49:35 crc kubenswrapper[4669]: I1008 21:49:35.000330 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-74cb5cbc49-hwmh2_f92c3530-f73f-45fe-84f5-bea451e1aaba/manager/0.log" Oct 08 21:49:35 crc kubenswrapper[4669]: I1008 21:49:35.174432 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-ddb98f99b-6mn5l_b43bc083-637e-4c93-a024-a47cecaade29/kube-rbac-proxy/0.log" Oct 08 21:49:35 crc kubenswrapper[4669]: I1008 21:49:35.307542 4669 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-ddb98f99b-6mn5l_b43bc083-637e-4c93-a024-a47cecaade29/manager/0.log" Oct 08 21:49:35 crc kubenswrapper[4669]: I1008 21:49:35.400281 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-59578bc799-frtrx_4ee291fa-b998-4bfc-a689-fb66e345bcaa/kube-rbac-proxy/0.log" Oct 08 21:49:35 crc kubenswrapper[4669]: I1008 21:49:35.475169 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-59578bc799-frtrx_4ee291fa-b998-4bfc-a689-fb66e345bcaa/manager/0.log" Oct 08 21:49:35 crc kubenswrapper[4669]: I1008 21:49:35.536039 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5777b4f897-5rs85_1af3f5fc-da55-4c9e-ac87-30363f6cf741/kube-rbac-proxy/0.log" Oct 08 21:49:35 crc kubenswrapper[4669]: I1008 21:49:35.637007 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5777b4f897-5rs85_1af3f5fc-da55-4c9e-ac87-30363f6cf741/manager/0.log" Oct 08 21:49:35 crc kubenswrapper[4669]: I1008 21:49:35.785569 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-797d478b46-vl7tc_a2b9cab5-b86d-4d9b-aa7c-d035d04d40a4/kube-rbac-proxy/0.log" Oct 08 21:49:35 crc kubenswrapper[4669]: I1008 21:49:35.879850 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-797d478b46-vl7tc_a2b9cab5-b86d-4d9b-aa7c-d035d04d40a4/manager/0.log" Oct 08 21:49:35 crc kubenswrapper[4669]: I1008 21:49:35.940424 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-57bb74c7bf-pb4g2_5fceefe8-bf0f-4f2d-9e14-2208f38b73d7/kube-rbac-proxy/0.log" Oct 08 21:49:36 crc 
kubenswrapper[4669]: I1008 21:49:36.085597 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-57bb74c7bf-pb4g2_5fceefe8-bf0f-4f2d-9e14-2208f38b73d7/manager/0.log" Oct 08 21:49:36 crc kubenswrapper[4669]: I1008 21:49:36.091237 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6d7c7ddf95-9vh5w_d4ee300d-b78b-4052-83a9-4ab8ca569886/kube-rbac-proxy/0.log" Oct 08 21:49:36 crc kubenswrapper[4669]: I1008 21:49:36.096369 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6d7c7ddf95-9vh5w_d4ee300d-b78b-4052-83a9-4ab8ca569886/manager/0.log" Oct 08 21:49:36 crc kubenswrapper[4669]: I1008 21:49:36.232941 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6cc7fb757ddwpl8_d44ffb7b-c761-48b2-be7c-a5af13e2a59b/kube-rbac-proxy/0.log" Oct 08 21:49:36 crc kubenswrapper[4669]: I1008 21:49:36.254298 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6cc7fb757ddwpl8_d44ffb7b-c761-48b2-be7c-a5af13e2a59b/manager/0.log" Oct 08 21:49:36 crc kubenswrapper[4669]: I1008 21:49:36.396238 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6dd9d44468-66k2c_eb2409f1-4af4-49a3-a453-29e8f447360e/kube-rbac-proxy/0.log" Oct 08 21:49:36 crc kubenswrapper[4669]: I1008 21:49:36.537693 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-5bb56b84bf-j6scr_6f26e0f8-20e5-4f7f-8e25-c17579e57b28/kube-rbac-proxy/0.log" Oct 08 21:49:36 crc kubenswrapper[4669]: I1008 21:49:36.713631 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-5bb56b84bf-j6scr_6f26e0f8-20e5-4f7f-8e25-c17579e57b28/operator/0.log" Oct 08 21:49:36 crc kubenswrapper[4669]: I1008 21:49:36.754892 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-hd2hf_40ec54d4-3186-4a11-a533-b7edc48914b3/registry-server/0.log" Oct 08 21:49:36 crc kubenswrapper[4669]: I1008 21:49:36.922006 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-6f96f8c84-gc5qn_92d7bcb6-b09e-4605-87a9-9cdaedb40c74/kube-rbac-proxy/0.log" Oct 08 21:49:36 crc kubenswrapper[4669]: I1008 21:49:36.977572 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-6f96f8c84-gc5qn_92d7bcb6-b09e-4605-87a9-9cdaedb40c74/manager/0.log" Oct 08 21:49:37 crc kubenswrapper[4669]: I1008 21:49:37.005765 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-664664cb68-bjfjf_d4193662-5c4e-40fa-ac9e-495509e75c4a/kube-rbac-proxy/0.log" Oct 08 21:49:37 crc kubenswrapper[4669]: I1008 21:49:37.179087 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-664664cb68-bjfjf_d4193662-5c4e-40fa-ac9e-495509e75c4a/manager/0.log" Oct 08 21:49:37 crc kubenswrapper[4669]: I1008 21:49:37.214519 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-h7k72_3dce5b40-fa36-4a03-bea2-a19e6267ecec/operator/0.log" Oct 08 21:49:37 crc kubenswrapper[4669]: I1008 21:49:37.380252 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f4d5dfdc6-m6ldz_46a0519d-251f-48e0-9d65-4ca08c627195/kube-rbac-proxy/0.log" Oct 08 21:49:37 crc kubenswrapper[4669]: I1008 21:49:37.438090 4669 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f4d5dfdc6-m6ldz_46a0519d-251f-48e0-9d65-4ca08c627195/manager/0.log" Oct 08 21:49:37 crc kubenswrapper[4669]: I1008 21:49:37.502049 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6dd9d44468-66k2c_eb2409f1-4af4-49a3-a453-29e8f447360e/manager/0.log" Oct 08 21:49:37 crc kubenswrapper[4669]: I1008 21:49:37.545254 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-775776c574-h2f5k_58991c0e-b29b-4851-b5b0-8327380e1320/kube-rbac-proxy/0.log" Oct 08 21:49:37 crc kubenswrapper[4669]: I1008 21:49:37.620135 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-775776c574-h2f5k_58991c0e-b29b-4851-b5b0-8327380e1320/manager/0.log" Oct 08 21:49:37 crc kubenswrapper[4669]: I1008 21:49:37.665457 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-74665f6cdc-tw98s_19754c8c-8fbf-4b8e-b673-462e22ec11d1/kube-rbac-proxy/0.log" Oct 08 21:49:37 crc kubenswrapper[4669]: I1008 21:49:37.685314 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-74665f6cdc-tw98s_19754c8c-8fbf-4b8e-b673-462e22ec11d1/manager/0.log" Oct 08 21:49:37 crc kubenswrapper[4669]: I1008 21:49:37.792928 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5dd4499c96-qfbh5_76773a34-db21-4354-a16e-e70ea0d6d63d/manager/0.log" Oct 08 21:49:37 crc kubenswrapper[4669]: I1008 21:49:37.794401 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5dd4499c96-qfbh5_76773a34-db21-4354-a16e-e70ea0d6d63d/kube-rbac-proxy/0.log" Oct 08 21:49:52 crc kubenswrapper[4669]: 
I1008 21:49:52.670899 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-9vmlr_d0da4600-abf4-4f3e-8299-a269b29ca44a/control-plane-machine-set-operator/0.log" Oct 08 21:49:52 crc kubenswrapper[4669]: I1008 21:49:52.827755 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-rtm6r_e45ff91e-f07d-489b-b0a0-8e815bdf41c3/kube-rbac-proxy/0.log" Oct 08 21:49:52 crc kubenswrapper[4669]: I1008 21:49:52.900688 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-rtm6r_e45ff91e-f07d-489b-b0a0-8e815bdf41c3/machine-api-operator/0.log" Oct 08 21:50:05 crc kubenswrapper[4669]: I1008 21:50:05.148961 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-btc2c_7a8b2c5d-38a2-4866-a1e0-8df9b4659c15/cert-manager-controller/0.log" Oct 08 21:50:05 crc kubenswrapper[4669]: I1008 21:50:05.225196 4669 scope.go:117] "RemoveContainer" containerID="459727e7cd3e7eac5fcf8bd1cb7d104c40abe70929c3a3336ea4dfb6c377b195" Oct 08 21:50:05 crc kubenswrapper[4669]: I1008 21:50:05.361059 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-7g9zf_c726284c-5ca2-4d63-b96e-56c1aa537986/cert-manager-cainjector/0.log" Oct 08 21:50:05 crc kubenswrapper[4669]: I1008 21:50:05.448727 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-9flwc_314fd865-0086-4bd8-8a90-0c992557a6af/cert-manager-webhook/0.log" Oct 08 21:50:17 crc kubenswrapper[4669]: I1008 21:50:17.102636 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-4ddnj_e4bf4b08-2e16-4d0a-8cc0-47cff74f5e53/nmstate-console-plugin/0.log" Oct 08 21:50:17 crc kubenswrapper[4669]: I1008 21:50:17.280611 4669 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-nmstate_nmstate-handler-dr78d_393d89be-66bc-40a8-99e2-a145ec3eebe8/nmstate-handler/0.log" Oct 08 21:50:17 crc kubenswrapper[4669]: I1008 21:50:17.289700 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-nqf8s_e975e2d7-d104-4d94-a624-e2bd680e2e23/kube-rbac-proxy/0.log" Oct 08 21:50:17 crc kubenswrapper[4669]: I1008 21:50:17.384520 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-nqf8s_e975e2d7-d104-4d94-a624-e2bd680e2e23/nmstate-metrics/0.log" Oct 08 21:50:17 crc kubenswrapper[4669]: I1008 21:50:17.479688 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-bmpqk_e3e2cb2d-8c68-46d0-a639-fd839c30a680/nmstate-operator/0.log" Oct 08 21:50:17 crc kubenswrapper[4669]: I1008 21:50:17.582423 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-57cwq_d67c051f-00e6-45c7-aa07-401695d5798f/nmstate-webhook/0.log" Oct 08 21:50:31 crc kubenswrapper[4669]: I1008 21:50:31.020040 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-jlbrt_80c87dc9-d3cb-455c-b0bc-c9ae5a0cead6/kube-rbac-proxy/0.log" Oct 08 21:50:31 crc kubenswrapper[4669]: I1008 21:50:31.134557 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-jlbrt_80c87dc9-d3cb-455c-b0bc-c9ae5a0cead6/controller/0.log" Oct 08 21:50:31 crc kubenswrapper[4669]: I1008 21:50:31.277583 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jq9jv_1de0da7b-1591-4af2-bac9-241428020fe9/cp-frr-files/0.log" Oct 08 21:50:31 crc kubenswrapper[4669]: I1008 21:50:31.389976 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jq9jv_1de0da7b-1591-4af2-bac9-241428020fe9/cp-reloader/0.log" Oct 08 21:50:31 crc kubenswrapper[4669]: 
I1008 21:50:31.394697 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jq9jv_1de0da7b-1591-4af2-bac9-241428020fe9/cp-frr-files/0.log" Oct 08 21:50:31 crc kubenswrapper[4669]: I1008 21:50:31.400008 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jq9jv_1de0da7b-1591-4af2-bac9-241428020fe9/cp-metrics/0.log" Oct 08 21:50:31 crc kubenswrapper[4669]: I1008 21:50:31.458091 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jq9jv_1de0da7b-1591-4af2-bac9-241428020fe9/cp-reloader/0.log" Oct 08 21:50:31 crc kubenswrapper[4669]: I1008 21:50:31.661282 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jq9jv_1de0da7b-1591-4af2-bac9-241428020fe9/cp-frr-files/0.log" Oct 08 21:50:31 crc kubenswrapper[4669]: I1008 21:50:31.663381 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jq9jv_1de0da7b-1591-4af2-bac9-241428020fe9/cp-metrics/0.log" Oct 08 21:50:31 crc kubenswrapper[4669]: I1008 21:50:31.685830 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jq9jv_1de0da7b-1591-4af2-bac9-241428020fe9/cp-metrics/0.log" Oct 08 21:50:31 crc kubenswrapper[4669]: I1008 21:50:31.712248 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jq9jv_1de0da7b-1591-4af2-bac9-241428020fe9/cp-reloader/0.log" Oct 08 21:50:31 crc kubenswrapper[4669]: I1008 21:50:31.848275 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jq9jv_1de0da7b-1591-4af2-bac9-241428020fe9/cp-frr-files/0.log" Oct 08 21:50:31 crc kubenswrapper[4669]: I1008 21:50:31.863871 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jq9jv_1de0da7b-1591-4af2-bac9-241428020fe9/cp-metrics/0.log" Oct 08 21:50:31 crc kubenswrapper[4669]: I1008 21:50:31.874250 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-jq9jv_1de0da7b-1591-4af2-bac9-241428020fe9/controller/0.log" Oct 08 21:50:31 crc kubenswrapper[4669]: I1008 21:50:31.898348 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jq9jv_1de0da7b-1591-4af2-bac9-241428020fe9/cp-reloader/0.log" Oct 08 21:50:32 crc kubenswrapper[4669]: I1008 21:50:32.027665 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jq9jv_1de0da7b-1591-4af2-bac9-241428020fe9/frr-metrics/0.log" Oct 08 21:50:32 crc kubenswrapper[4669]: I1008 21:50:32.078882 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jq9jv_1de0da7b-1591-4af2-bac9-241428020fe9/kube-rbac-proxy/0.log" Oct 08 21:50:32 crc kubenswrapper[4669]: I1008 21:50:32.105179 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jq9jv_1de0da7b-1591-4af2-bac9-241428020fe9/kube-rbac-proxy-frr/0.log" Oct 08 21:50:32 crc kubenswrapper[4669]: I1008 21:50:32.245473 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jq9jv_1de0da7b-1591-4af2-bac9-241428020fe9/reloader/0.log" Oct 08 21:50:32 crc kubenswrapper[4669]: I1008 21:50:32.304758 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-gwhr7_c74dd47d-12ee-4626-a02f-e8dc07f26791/frr-k8s-webhook-server/0.log" Oct 08 21:50:32 crc kubenswrapper[4669]: I1008 21:50:32.517866 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6f6598d79f-8tfml_0304e30c-72d1-4544-9f05-fb7acc1c3c61/manager/0.log" Oct 08 21:50:32 crc kubenswrapper[4669]: I1008 21:50:32.706071 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5b6f8f8d99-l255w_dbe12456-371e-4274-911b-264b27260e4e/webhook-server/0.log" Oct 08 21:50:32 crc kubenswrapper[4669]: I1008 21:50:32.731104 4669 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_speaker-x4jbl_91c83ac7-3c70-4fca-85b1-9ee4d9dd4568/kube-rbac-proxy/0.log" Oct 08 21:50:33 crc kubenswrapper[4669]: I1008 21:50:33.359708 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-x4jbl_91c83ac7-3c70-4fca-85b1-9ee4d9dd4568/speaker/0.log" Oct 08 21:50:33 crc kubenswrapper[4669]: I1008 21:50:33.457943 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jq9jv_1de0da7b-1591-4af2-bac9-241428020fe9/frr/0.log" Oct 08 21:50:45 crc kubenswrapper[4669]: I1008 21:50:45.369390 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2n94nt_a10fc4a6-e10c-481f-8547-cf5d9669d34d/util/0.log" Oct 08 21:50:45 crc kubenswrapper[4669]: I1008 21:50:45.598127 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2n94nt_a10fc4a6-e10c-481f-8547-cf5d9669d34d/pull/0.log" Oct 08 21:50:45 crc kubenswrapper[4669]: I1008 21:50:45.613327 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2n94nt_a10fc4a6-e10c-481f-8547-cf5d9669d34d/util/0.log" Oct 08 21:50:45 crc kubenswrapper[4669]: I1008 21:50:45.650189 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2n94nt_a10fc4a6-e10c-481f-8547-cf5d9669d34d/pull/0.log" Oct 08 21:50:45 crc kubenswrapper[4669]: I1008 21:50:45.849230 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2n94nt_a10fc4a6-e10c-481f-8547-cf5d9669d34d/extract/0.log" Oct 08 21:50:45 crc kubenswrapper[4669]: I1008 21:50:45.856189 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2n94nt_a10fc4a6-e10c-481f-8547-cf5d9669d34d/util/0.log" Oct 08 21:50:45 crc kubenswrapper[4669]: I1008 21:50:45.893388 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2n94nt_a10fc4a6-e10c-481f-8547-cf5d9669d34d/pull/0.log" Oct 08 21:50:45 crc kubenswrapper[4669]: I1008 21:50:45.992945 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4w5jl_ae6c5190-e7d0-4384-910d-620da4746aa0/extract-utilities/0.log" Oct 08 21:50:46 crc kubenswrapper[4669]: I1008 21:50:46.165352 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4w5jl_ae6c5190-e7d0-4384-910d-620da4746aa0/extract-utilities/0.log" Oct 08 21:50:46 crc kubenswrapper[4669]: I1008 21:50:46.169798 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4w5jl_ae6c5190-e7d0-4384-910d-620da4746aa0/extract-content/0.log" Oct 08 21:50:46 crc kubenswrapper[4669]: I1008 21:50:46.182132 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4w5jl_ae6c5190-e7d0-4384-910d-620da4746aa0/extract-content/0.log" Oct 08 21:50:46 crc kubenswrapper[4669]: I1008 21:50:46.330015 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4w5jl_ae6c5190-e7d0-4384-910d-620da4746aa0/extract-utilities/0.log" Oct 08 21:50:46 crc kubenswrapper[4669]: I1008 21:50:46.336951 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4w5jl_ae6c5190-e7d0-4384-910d-620da4746aa0/extract-content/0.log" Oct 08 21:50:46 crc kubenswrapper[4669]: I1008 21:50:46.515080 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-jxfn2_250ce6ef-e5b8-4912-8037-85a0c520ff7a/extract-utilities/0.log" Oct 08 21:50:46 crc kubenswrapper[4669]: I1008 21:50:46.608945 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4w5jl_ae6c5190-e7d0-4384-910d-620da4746aa0/registry-server/0.log" Oct 08 21:50:46 crc kubenswrapper[4669]: I1008 21:50:46.746617 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jxfn2_250ce6ef-e5b8-4912-8037-85a0c520ff7a/extract-content/0.log" Oct 08 21:50:46 crc kubenswrapper[4669]: I1008 21:50:46.760831 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jxfn2_250ce6ef-e5b8-4912-8037-85a0c520ff7a/extract-utilities/0.log" Oct 08 21:50:46 crc kubenswrapper[4669]: I1008 21:50:46.786804 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jxfn2_250ce6ef-e5b8-4912-8037-85a0c520ff7a/extract-content/0.log" Oct 08 21:50:46 crc kubenswrapper[4669]: I1008 21:50:46.930777 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jxfn2_250ce6ef-e5b8-4912-8037-85a0c520ff7a/extract-utilities/0.log" Oct 08 21:50:46 crc kubenswrapper[4669]: I1008 21:50:46.937066 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jxfn2_250ce6ef-e5b8-4912-8037-85a0c520ff7a/extract-content/0.log" Oct 08 21:50:47 crc kubenswrapper[4669]: I1008 21:50:47.134966 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2hbpl_29a7cdfe-bc26-4687-9359-e829d21b4137/util/0.log" Oct 08 21:50:47 crc kubenswrapper[4669]: I1008 21:50:47.311752 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2hbpl_29a7cdfe-bc26-4687-9359-e829d21b4137/pull/0.log" Oct 08 21:50:47 crc kubenswrapper[4669]: I1008 21:50:47.334641 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2hbpl_29a7cdfe-bc26-4687-9359-e829d21b4137/util/0.log" Oct 08 21:50:47 crc kubenswrapper[4669]: I1008 21:50:47.448520 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2hbpl_29a7cdfe-bc26-4687-9359-e829d21b4137/pull/0.log" Oct 08 21:50:47 crc kubenswrapper[4669]: I1008 21:50:47.635016 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2hbpl_29a7cdfe-bc26-4687-9359-e829d21b4137/util/0.log" Oct 08 21:50:47 crc kubenswrapper[4669]: I1008 21:50:47.652023 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2hbpl_29a7cdfe-bc26-4687-9359-e829d21b4137/pull/0.log" Oct 08 21:50:47 crc kubenswrapper[4669]: I1008 21:50:47.670044 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2hbpl_29a7cdfe-bc26-4687-9359-e829d21b4137/extract/0.log" Oct 08 21:50:47 crc kubenswrapper[4669]: I1008 21:50:47.682142 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jxfn2_250ce6ef-e5b8-4912-8037-85a0c520ff7a/registry-server/0.log" Oct 08 21:50:47 crc kubenswrapper[4669]: I1008 21:50:47.834107 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-kjxjv_c10f6dc4-f608-4960-91ab-cfdebb79b8ff/marketplace-operator/0.log" Oct 08 21:50:47 crc kubenswrapper[4669]: 
I1008 21:50:47.889754 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-558fm_065b510c-2a5d-4110-80b9-69865b532686/extract-utilities/0.log" Oct 08 21:50:48 crc kubenswrapper[4669]: I1008 21:50:48.029036 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-558fm_065b510c-2a5d-4110-80b9-69865b532686/extract-utilities/0.log" Oct 08 21:50:48 crc kubenswrapper[4669]: I1008 21:50:48.034293 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-558fm_065b510c-2a5d-4110-80b9-69865b532686/extract-content/0.log" Oct 08 21:50:48 crc kubenswrapper[4669]: I1008 21:50:48.047315 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-558fm_065b510c-2a5d-4110-80b9-69865b532686/extract-content/0.log" Oct 08 21:50:48 crc kubenswrapper[4669]: I1008 21:50:48.206852 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-558fm_065b510c-2a5d-4110-80b9-69865b532686/extract-utilities/0.log" Oct 08 21:50:48 crc kubenswrapper[4669]: I1008 21:50:48.243467 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-558fm_065b510c-2a5d-4110-80b9-69865b532686/extract-content/0.log" Oct 08 21:50:48 crc kubenswrapper[4669]: I1008 21:50:48.269543 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4rszc_bca52d11-f041-40bc-b352-1e820a410996/extract-utilities/0.log" Oct 08 21:50:48 crc kubenswrapper[4669]: I1008 21:50:48.329929 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-558fm_065b510c-2a5d-4110-80b9-69865b532686/registry-server/0.log" Oct 08 21:50:48 crc kubenswrapper[4669]: I1008 21:50:48.453122 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-4rszc_bca52d11-f041-40bc-b352-1e820a410996/extract-content/0.log" Oct 08 21:50:48 crc kubenswrapper[4669]: I1008 21:50:48.454274 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4rszc_bca52d11-f041-40bc-b352-1e820a410996/extract-utilities/0.log" Oct 08 21:50:48 crc kubenswrapper[4669]: I1008 21:50:48.486317 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4rszc_bca52d11-f041-40bc-b352-1e820a410996/extract-content/0.log" Oct 08 21:50:48 crc kubenswrapper[4669]: I1008 21:50:48.652808 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4rszc_bca52d11-f041-40bc-b352-1e820a410996/extract-content/0.log" Oct 08 21:50:48 crc kubenswrapper[4669]: I1008 21:50:48.656447 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4rszc_bca52d11-f041-40bc-b352-1e820a410996/extract-utilities/0.log" Oct 08 21:50:49 crc kubenswrapper[4669]: I1008 21:50:49.266852 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4rszc_bca52d11-f041-40bc-b352-1e820a410996/registry-server/0.log" Oct 08 21:51:13 crc kubenswrapper[4669]: I1008 21:51:13.185683 4669 patch_prober.go:28] interesting pod/machine-config-daemon-hw2kf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 21:51:13 crc kubenswrapper[4669]: I1008 21:51:13.186252 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Oct 08 21:51:43 crc kubenswrapper[4669]: I1008 21:51:43.185188 4669 patch_prober.go:28] interesting pod/machine-config-daemon-hw2kf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 21:51:43 crc kubenswrapper[4669]: I1008 21:51:43.185959 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 21:51:58 crc kubenswrapper[4669]: I1008 21:51:58.626166 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fp44c"] Oct 08 21:51:58 crc kubenswrapper[4669]: E1008 21:51:58.627294 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e6ad776-df59-4a8c-9ee7-54120fe14a82" containerName="extract-utilities" Oct 08 21:51:58 crc kubenswrapper[4669]: I1008 21:51:58.627310 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e6ad776-df59-4a8c-9ee7-54120fe14a82" containerName="extract-utilities" Oct 08 21:51:58 crc kubenswrapper[4669]: E1008 21:51:58.627335 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e6ad776-df59-4a8c-9ee7-54120fe14a82" containerName="extract-content" Oct 08 21:51:58 crc kubenswrapper[4669]: I1008 21:51:58.627344 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e6ad776-df59-4a8c-9ee7-54120fe14a82" containerName="extract-content" Oct 08 21:51:58 crc kubenswrapper[4669]: E1008 21:51:58.627367 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e25e76e6-9ed5-4324-811f-11d69c9a8c0e" containerName="container-00" Oct 08 21:51:58 crc 
kubenswrapper[4669]: I1008 21:51:58.627376 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="e25e76e6-9ed5-4324-811f-11d69c9a8c0e" containerName="container-00" Oct 08 21:51:58 crc kubenswrapper[4669]: E1008 21:51:58.627390 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e6ad776-df59-4a8c-9ee7-54120fe14a82" containerName="registry-server" Oct 08 21:51:58 crc kubenswrapper[4669]: I1008 21:51:58.627399 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e6ad776-df59-4a8c-9ee7-54120fe14a82" containerName="registry-server" Oct 08 21:51:58 crc kubenswrapper[4669]: I1008 21:51:58.627666 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="e25e76e6-9ed5-4324-811f-11d69c9a8c0e" containerName="container-00" Oct 08 21:51:58 crc kubenswrapper[4669]: I1008 21:51:58.627693 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e6ad776-df59-4a8c-9ee7-54120fe14a82" containerName="registry-server" Oct 08 21:51:58 crc kubenswrapper[4669]: I1008 21:51:58.629343 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fp44c" Oct 08 21:51:58 crc kubenswrapper[4669]: I1008 21:51:58.641641 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fp44c"] Oct 08 21:51:58 crc kubenswrapper[4669]: I1008 21:51:58.774921 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4be6310c-20e0-4048-a5fe-16cb852489e4-catalog-content\") pod \"redhat-marketplace-fp44c\" (UID: \"4be6310c-20e0-4048-a5fe-16cb852489e4\") " pod="openshift-marketplace/redhat-marketplace-fp44c" Oct 08 21:51:58 crc kubenswrapper[4669]: I1008 21:51:58.775043 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6nfn\" (UniqueName: \"kubernetes.io/projected/4be6310c-20e0-4048-a5fe-16cb852489e4-kube-api-access-q6nfn\") pod \"redhat-marketplace-fp44c\" (UID: \"4be6310c-20e0-4048-a5fe-16cb852489e4\") " pod="openshift-marketplace/redhat-marketplace-fp44c" Oct 08 21:51:58 crc kubenswrapper[4669]: I1008 21:51:58.775135 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4be6310c-20e0-4048-a5fe-16cb852489e4-utilities\") pod \"redhat-marketplace-fp44c\" (UID: \"4be6310c-20e0-4048-a5fe-16cb852489e4\") " pod="openshift-marketplace/redhat-marketplace-fp44c" Oct 08 21:51:58 crc kubenswrapper[4669]: I1008 21:51:58.876729 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4be6310c-20e0-4048-a5fe-16cb852489e4-catalog-content\") pod \"redhat-marketplace-fp44c\" (UID: \"4be6310c-20e0-4048-a5fe-16cb852489e4\") " pod="openshift-marketplace/redhat-marketplace-fp44c" Oct 08 21:51:58 crc kubenswrapper[4669]: I1008 21:51:58.876824 4669 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-q6nfn\" (UniqueName: \"kubernetes.io/projected/4be6310c-20e0-4048-a5fe-16cb852489e4-kube-api-access-q6nfn\") pod \"redhat-marketplace-fp44c\" (UID: \"4be6310c-20e0-4048-a5fe-16cb852489e4\") " pod="openshift-marketplace/redhat-marketplace-fp44c" Oct 08 21:51:58 crc kubenswrapper[4669]: I1008 21:51:58.876911 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4be6310c-20e0-4048-a5fe-16cb852489e4-utilities\") pod \"redhat-marketplace-fp44c\" (UID: \"4be6310c-20e0-4048-a5fe-16cb852489e4\") " pod="openshift-marketplace/redhat-marketplace-fp44c" Oct 08 21:51:58 crc kubenswrapper[4669]: I1008 21:51:58.877500 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4be6310c-20e0-4048-a5fe-16cb852489e4-utilities\") pod \"redhat-marketplace-fp44c\" (UID: \"4be6310c-20e0-4048-a5fe-16cb852489e4\") " pod="openshift-marketplace/redhat-marketplace-fp44c" Oct 08 21:51:58 crc kubenswrapper[4669]: I1008 21:51:58.877810 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4be6310c-20e0-4048-a5fe-16cb852489e4-catalog-content\") pod \"redhat-marketplace-fp44c\" (UID: \"4be6310c-20e0-4048-a5fe-16cb852489e4\") " pod="openshift-marketplace/redhat-marketplace-fp44c" Oct 08 21:51:58 crc kubenswrapper[4669]: I1008 21:51:58.896491 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6nfn\" (UniqueName: \"kubernetes.io/projected/4be6310c-20e0-4048-a5fe-16cb852489e4-kube-api-access-q6nfn\") pod \"redhat-marketplace-fp44c\" (UID: \"4be6310c-20e0-4048-a5fe-16cb852489e4\") " pod="openshift-marketplace/redhat-marketplace-fp44c" Oct 08 21:51:58 crc kubenswrapper[4669]: I1008 21:51:58.957566 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fp44c" Oct 08 21:51:59 crc kubenswrapper[4669]: I1008 21:51:59.437239 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fp44c"] Oct 08 21:52:00 crc kubenswrapper[4669]: I1008 21:52:00.162454 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fp44c" event={"ID":"4be6310c-20e0-4048-a5fe-16cb852489e4","Type":"ContainerStarted","Data":"642ed4bf8c45011a3a315640c4dea60f076c59bf0414f3ab27cd3120758562cb"} Oct 08 21:52:00 crc kubenswrapper[4669]: I1008 21:52:00.164137 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fp44c" event={"ID":"4be6310c-20e0-4048-a5fe-16cb852489e4","Type":"ContainerStarted","Data":"5c4e16dbe6def9d795761a58edc2c0cfe0a82ba140d77ca2a142bc5b23fb03ad"} Oct 08 21:52:01 crc kubenswrapper[4669]: I1008 21:52:01.182941 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fp44c" event={"ID":"4be6310c-20e0-4048-a5fe-16cb852489e4","Type":"ContainerDied","Data":"642ed4bf8c45011a3a315640c4dea60f076c59bf0414f3ab27cd3120758562cb"} Oct 08 21:52:01 crc kubenswrapper[4669]: I1008 21:52:01.182793 4669 generic.go:334] "Generic (PLEG): container finished" podID="4be6310c-20e0-4048-a5fe-16cb852489e4" containerID="642ed4bf8c45011a3a315640c4dea60f076c59bf0414f3ab27cd3120758562cb" exitCode=0 Oct 08 21:52:03 crc kubenswrapper[4669]: I1008 21:52:03.210960 4669 generic.go:334] "Generic (PLEG): container finished" podID="4be6310c-20e0-4048-a5fe-16cb852489e4" containerID="b9e87d44e744079f46659f885b0aaa0eb60b5a1559f6a0b3e3cdd0e7e51ece26" exitCode=0 Oct 08 21:52:03 crc kubenswrapper[4669]: I1008 21:52:03.211057 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fp44c" 
event={"ID":"4be6310c-20e0-4048-a5fe-16cb852489e4","Type":"ContainerDied","Data":"b9e87d44e744079f46659f885b0aaa0eb60b5a1559f6a0b3e3cdd0e7e51ece26"} Oct 08 21:52:04 crc kubenswrapper[4669]: I1008 21:52:04.223278 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fp44c" event={"ID":"4be6310c-20e0-4048-a5fe-16cb852489e4","Type":"ContainerStarted","Data":"ff2b643cfb7ae20e9b88b8d678bb5ffac9f8bc6c894c148689127f232d45a5da"} Oct 08 21:52:04 crc kubenswrapper[4669]: I1008 21:52:04.254781 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fp44c" podStartSLOduration=3.811576932 podStartE2EDuration="6.25475338s" podCreationTimestamp="2025-10-08 21:51:58 +0000 UTC" firstStartedPulling="2025-10-08 21:52:01.185721443 +0000 UTC m=+4040.878532126" lastFinishedPulling="2025-10-08 21:52:03.628897901 +0000 UTC m=+4043.321708574" observedRunningTime="2025-10-08 21:52:04.246746068 +0000 UTC m=+4043.939556751" watchObservedRunningTime="2025-10-08 21:52:04.25475338 +0000 UTC m=+4043.947564083" Oct 08 21:52:08 crc kubenswrapper[4669]: I1008 21:52:08.958149 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fp44c" Oct 08 21:52:08 crc kubenswrapper[4669]: I1008 21:52:08.960579 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fp44c" Oct 08 21:52:09 crc kubenswrapper[4669]: I1008 21:52:09.035081 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fp44c" Oct 08 21:52:09 crc kubenswrapper[4669]: I1008 21:52:09.345496 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fp44c" Oct 08 21:52:09 crc kubenswrapper[4669]: I1008 21:52:09.413649 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-fp44c"] Oct 08 21:52:11 crc kubenswrapper[4669]: I1008 21:52:11.298167 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fp44c" podUID="4be6310c-20e0-4048-a5fe-16cb852489e4" containerName="registry-server" containerID="cri-o://ff2b643cfb7ae20e9b88b8d678bb5ffac9f8bc6c894c148689127f232d45a5da" gracePeriod=2 Oct 08 21:52:11 crc kubenswrapper[4669]: I1008 21:52:11.784276 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fp44c" Oct 08 21:52:11 crc kubenswrapper[4669]: I1008 21:52:11.841830 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6nfn\" (UniqueName: \"kubernetes.io/projected/4be6310c-20e0-4048-a5fe-16cb852489e4-kube-api-access-q6nfn\") pod \"4be6310c-20e0-4048-a5fe-16cb852489e4\" (UID: \"4be6310c-20e0-4048-a5fe-16cb852489e4\") " Oct 08 21:52:11 crc kubenswrapper[4669]: I1008 21:52:11.841997 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4be6310c-20e0-4048-a5fe-16cb852489e4-utilities\") pod \"4be6310c-20e0-4048-a5fe-16cb852489e4\" (UID: \"4be6310c-20e0-4048-a5fe-16cb852489e4\") " Oct 08 21:52:11 crc kubenswrapper[4669]: I1008 21:52:11.842061 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4be6310c-20e0-4048-a5fe-16cb852489e4-catalog-content\") pod \"4be6310c-20e0-4048-a5fe-16cb852489e4\" (UID: \"4be6310c-20e0-4048-a5fe-16cb852489e4\") " Oct 08 21:52:11 crc kubenswrapper[4669]: I1008 21:52:11.842901 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4be6310c-20e0-4048-a5fe-16cb852489e4-utilities" (OuterVolumeSpecName: "utilities") pod "4be6310c-20e0-4048-a5fe-16cb852489e4" (UID: 
"4be6310c-20e0-4048-a5fe-16cb852489e4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:52:11 crc kubenswrapper[4669]: I1008 21:52:11.847644 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4be6310c-20e0-4048-a5fe-16cb852489e4-kube-api-access-q6nfn" (OuterVolumeSpecName: "kube-api-access-q6nfn") pod "4be6310c-20e0-4048-a5fe-16cb852489e4" (UID: "4be6310c-20e0-4048-a5fe-16cb852489e4"). InnerVolumeSpecName "kube-api-access-q6nfn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:52:11 crc kubenswrapper[4669]: I1008 21:52:11.877245 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4be6310c-20e0-4048-a5fe-16cb852489e4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4be6310c-20e0-4048-a5fe-16cb852489e4" (UID: "4be6310c-20e0-4048-a5fe-16cb852489e4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:52:11 crc kubenswrapper[4669]: I1008 21:52:11.944851 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4be6310c-20e0-4048-a5fe-16cb852489e4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 21:52:11 crc kubenswrapper[4669]: I1008 21:52:11.945171 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6nfn\" (UniqueName: \"kubernetes.io/projected/4be6310c-20e0-4048-a5fe-16cb852489e4-kube-api-access-q6nfn\") on node \"crc\" DevicePath \"\"" Oct 08 21:52:11 crc kubenswrapper[4669]: I1008 21:52:11.945265 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4be6310c-20e0-4048-a5fe-16cb852489e4-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 21:52:12 crc kubenswrapper[4669]: I1008 21:52:12.310498 4669 generic.go:334] "Generic (PLEG): container finished" 
podID="4be6310c-20e0-4048-a5fe-16cb852489e4" containerID="ff2b643cfb7ae20e9b88b8d678bb5ffac9f8bc6c894c148689127f232d45a5da" exitCode=0 Oct 08 21:52:12 crc kubenswrapper[4669]: I1008 21:52:12.310566 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fp44c" event={"ID":"4be6310c-20e0-4048-a5fe-16cb852489e4","Type":"ContainerDied","Data":"ff2b643cfb7ae20e9b88b8d678bb5ffac9f8bc6c894c148689127f232d45a5da"} Oct 08 21:52:12 crc kubenswrapper[4669]: I1008 21:52:12.310590 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fp44c" event={"ID":"4be6310c-20e0-4048-a5fe-16cb852489e4","Type":"ContainerDied","Data":"5c4e16dbe6def9d795761a58edc2c0cfe0a82ba140d77ca2a142bc5b23fb03ad"} Oct 08 21:52:12 crc kubenswrapper[4669]: I1008 21:52:12.310608 4669 scope.go:117] "RemoveContainer" containerID="ff2b643cfb7ae20e9b88b8d678bb5ffac9f8bc6c894c148689127f232d45a5da" Oct 08 21:52:12 crc kubenswrapper[4669]: I1008 21:52:12.310715 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fp44c" Oct 08 21:52:12 crc kubenswrapper[4669]: I1008 21:52:12.364887 4669 scope.go:117] "RemoveContainer" containerID="b9e87d44e744079f46659f885b0aaa0eb60b5a1559f6a0b3e3cdd0e7e51ece26" Oct 08 21:52:12 crc kubenswrapper[4669]: I1008 21:52:12.372082 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fp44c"] Oct 08 21:52:12 crc kubenswrapper[4669]: I1008 21:52:12.387491 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fp44c"] Oct 08 21:52:12 crc kubenswrapper[4669]: I1008 21:52:12.393301 4669 scope.go:117] "RemoveContainer" containerID="642ed4bf8c45011a3a315640c4dea60f076c59bf0414f3ab27cd3120758562cb" Oct 08 21:52:12 crc kubenswrapper[4669]: I1008 21:52:12.447810 4669 scope.go:117] "RemoveContainer" containerID="ff2b643cfb7ae20e9b88b8d678bb5ffac9f8bc6c894c148689127f232d45a5da" Oct 08 21:52:12 crc kubenswrapper[4669]: E1008 21:52:12.448638 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff2b643cfb7ae20e9b88b8d678bb5ffac9f8bc6c894c148689127f232d45a5da\": container with ID starting with ff2b643cfb7ae20e9b88b8d678bb5ffac9f8bc6c894c148689127f232d45a5da not found: ID does not exist" containerID="ff2b643cfb7ae20e9b88b8d678bb5ffac9f8bc6c894c148689127f232d45a5da" Oct 08 21:52:12 crc kubenswrapper[4669]: I1008 21:52:12.448686 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff2b643cfb7ae20e9b88b8d678bb5ffac9f8bc6c894c148689127f232d45a5da"} err="failed to get container status \"ff2b643cfb7ae20e9b88b8d678bb5ffac9f8bc6c894c148689127f232d45a5da\": rpc error: code = NotFound desc = could not find container \"ff2b643cfb7ae20e9b88b8d678bb5ffac9f8bc6c894c148689127f232d45a5da\": container with ID starting with ff2b643cfb7ae20e9b88b8d678bb5ffac9f8bc6c894c148689127f232d45a5da not found: 
ID does not exist" Oct 08 21:52:12 crc kubenswrapper[4669]: I1008 21:52:12.448720 4669 scope.go:117] "RemoveContainer" containerID="b9e87d44e744079f46659f885b0aaa0eb60b5a1559f6a0b3e3cdd0e7e51ece26" Oct 08 21:52:12 crc kubenswrapper[4669]: E1008 21:52:12.449986 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9e87d44e744079f46659f885b0aaa0eb60b5a1559f6a0b3e3cdd0e7e51ece26\": container with ID starting with b9e87d44e744079f46659f885b0aaa0eb60b5a1559f6a0b3e3cdd0e7e51ece26 not found: ID does not exist" containerID="b9e87d44e744079f46659f885b0aaa0eb60b5a1559f6a0b3e3cdd0e7e51ece26" Oct 08 21:52:12 crc kubenswrapper[4669]: I1008 21:52:12.450018 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9e87d44e744079f46659f885b0aaa0eb60b5a1559f6a0b3e3cdd0e7e51ece26"} err="failed to get container status \"b9e87d44e744079f46659f885b0aaa0eb60b5a1559f6a0b3e3cdd0e7e51ece26\": rpc error: code = NotFound desc = could not find container \"b9e87d44e744079f46659f885b0aaa0eb60b5a1559f6a0b3e3cdd0e7e51ece26\": container with ID starting with b9e87d44e744079f46659f885b0aaa0eb60b5a1559f6a0b3e3cdd0e7e51ece26 not found: ID does not exist" Oct 08 21:52:12 crc kubenswrapper[4669]: I1008 21:52:12.450037 4669 scope.go:117] "RemoveContainer" containerID="642ed4bf8c45011a3a315640c4dea60f076c59bf0414f3ab27cd3120758562cb" Oct 08 21:52:12 crc kubenswrapper[4669]: E1008 21:52:12.450977 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"642ed4bf8c45011a3a315640c4dea60f076c59bf0414f3ab27cd3120758562cb\": container with ID starting with 642ed4bf8c45011a3a315640c4dea60f076c59bf0414f3ab27cd3120758562cb not found: ID does not exist" containerID="642ed4bf8c45011a3a315640c4dea60f076c59bf0414f3ab27cd3120758562cb" Oct 08 21:52:12 crc kubenswrapper[4669]: I1008 21:52:12.451017 4669 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"642ed4bf8c45011a3a315640c4dea60f076c59bf0414f3ab27cd3120758562cb"} err="failed to get container status \"642ed4bf8c45011a3a315640c4dea60f076c59bf0414f3ab27cd3120758562cb\": rpc error: code = NotFound desc = could not find container \"642ed4bf8c45011a3a315640c4dea60f076c59bf0414f3ab27cd3120758562cb\": container with ID starting with 642ed4bf8c45011a3a315640c4dea60f076c59bf0414f3ab27cd3120758562cb not found: ID does not exist" Oct 08 21:52:13 crc kubenswrapper[4669]: I1008 21:52:13.185326 4669 patch_prober.go:28] interesting pod/machine-config-daemon-hw2kf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 21:52:13 crc kubenswrapper[4669]: I1008 21:52:13.185379 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 21:52:13 crc kubenswrapper[4669]: I1008 21:52:13.185419 4669 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" Oct 08 21:52:13 crc kubenswrapper[4669]: I1008 21:52:13.186078 4669 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fd62feb27381ee2613e0b172d6c6d469c4cfe6dca045c2e24055cdd635324b18"} pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 21:52:13 crc kubenswrapper[4669]: I1008 21:52:13.186164 4669 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" podUID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerName="machine-config-daemon" containerID="cri-o://fd62feb27381ee2613e0b172d6c6d469c4cfe6dca045c2e24055cdd635324b18" gracePeriod=600 Oct 08 21:52:13 crc kubenswrapper[4669]: I1008 21:52:13.345825 4669 generic.go:334] "Generic (PLEG): container finished" podID="39c9bcf2-9580-4534-8c7e-886bd4aff469" containerID="fd62feb27381ee2613e0b172d6c6d469c4cfe6dca045c2e24055cdd635324b18" exitCode=0 Oct 08 21:52:13 crc kubenswrapper[4669]: I1008 21:52:13.347809 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4be6310c-20e0-4048-a5fe-16cb852489e4" path="/var/lib/kubelet/pods/4be6310c-20e0-4048-a5fe-16cb852489e4/volumes" Oct 08 21:52:13 crc kubenswrapper[4669]: I1008 21:52:13.348752 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" event={"ID":"39c9bcf2-9580-4534-8c7e-886bd4aff469","Type":"ContainerDied","Data":"fd62feb27381ee2613e0b172d6c6d469c4cfe6dca045c2e24055cdd635324b18"} Oct 08 21:52:13 crc kubenswrapper[4669]: I1008 21:52:13.348800 4669 scope.go:117] "RemoveContainer" containerID="d99f2404f55700ee94ff2a27e207059d96fd41084a3b30b6aa74eb587d837286" Oct 08 21:52:14 crc kubenswrapper[4669]: I1008 21:52:14.363140 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw2kf" event={"ID":"39c9bcf2-9580-4534-8c7e-886bd4aff469","Type":"ContainerStarted","Data":"02c94e38be8cbd675fe1da8f8e7030305a797d3df55d91bca7a8da6b380bbe38"} Oct 08 21:52:23 crc kubenswrapper[4669]: I1008 21:52:23.474274 4669 generic.go:334] "Generic (PLEG): container finished" podID="ebdc8e7e-dc3c-4c69-8e1c-bd2fc8e287e9" containerID="8642cf7fbabbbea29d36ff683aa0d809d2795dab562c213e4e4c3ea8eac788a0" exitCode=0 Oct 08 21:52:23 crc kubenswrapper[4669]: I1008 21:52:23.474337 4669 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-must-gather-8t5fb/must-gather-z8mk6" event={"ID":"ebdc8e7e-dc3c-4c69-8e1c-bd2fc8e287e9","Type":"ContainerDied","Data":"8642cf7fbabbbea29d36ff683aa0d809d2795dab562c213e4e4c3ea8eac788a0"} Oct 08 21:52:23 crc kubenswrapper[4669]: I1008 21:52:23.475849 4669 scope.go:117] "RemoveContainer" containerID="8642cf7fbabbbea29d36ff683aa0d809d2795dab562c213e4e4c3ea8eac788a0" Oct 08 21:52:24 crc kubenswrapper[4669]: I1008 21:52:24.341658 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8t5fb_must-gather-z8mk6_ebdc8e7e-dc3c-4c69-8e1c-bd2fc8e287e9/gather/0.log" Oct 08 21:52:34 crc kubenswrapper[4669]: I1008 21:52:34.401932 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8t5fb/must-gather-z8mk6"] Oct 08 21:52:34 crc kubenswrapper[4669]: I1008 21:52:34.402846 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-8t5fb/must-gather-z8mk6" podUID="ebdc8e7e-dc3c-4c69-8e1c-bd2fc8e287e9" containerName="copy" containerID="cri-o://697b1f9a0d37f0a8869cac32774d206cbe221fa08c2c9483b44988fd9e4b5154" gracePeriod=2 Oct 08 21:52:34 crc kubenswrapper[4669]: I1008 21:52:34.409600 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8t5fb/must-gather-z8mk6"] Oct 08 21:52:35 crc kubenswrapper[4669]: I1008 21:52:34.619309 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8t5fb_must-gather-z8mk6_ebdc8e7e-dc3c-4c69-8e1c-bd2fc8e287e9/copy/0.log" Oct 08 21:52:35 crc kubenswrapper[4669]: I1008 21:52:34.620210 4669 generic.go:334] "Generic (PLEG): container finished" podID="ebdc8e7e-dc3c-4c69-8e1c-bd2fc8e287e9" containerID="697b1f9a0d37f0a8869cac32774d206cbe221fa08c2c9483b44988fd9e4b5154" exitCode=143 Oct 08 21:52:35 crc kubenswrapper[4669]: I1008 21:52:34.798707 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-8t5fb_must-gather-z8mk6_ebdc8e7e-dc3c-4c69-8e1c-bd2fc8e287e9/copy/0.log" Oct 08 21:52:35 crc kubenswrapper[4669]: I1008 21:52:34.799363 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8t5fb/must-gather-z8mk6" Oct 08 21:52:35 crc kubenswrapper[4669]: I1008 21:52:34.949184 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbwvd\" (UniqueName: \"kubernetes.io/projected/ebdc8e7e-dc3c-4c69-8e1c-bd2fc8e287e9-kube-api-access-pbwvd\") pod \"ebdc8e7e-dc3c-4c69-8e1c-bd2fc8e287e9\" (UID: \"ebdc8e7e-dc3c-4c69-8e1c-bd2fc8e287e9\") " Oct 08 21:52:35 crc kubenswrapper[4669]: I1008 21:52:34.949239 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ebdc8e7e-dc3c-4c69-8e1c-bd2fc8e287e9-must-gather-output\") pod \"ebdc8e7e-dc3c-4c69-8e1c-bd2fc8e287e9\" (UID: \"ebdc8e7e-dc3c-4c69-8e1c-bd2fc8e287e9\") " Oct 08 21:52:35 crc kubenswrapper[4669]: I1008 21:52:34.956365 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebdc8e7e-dc3c-4c69-8e1c-bd2fc8e287e9-kube-api-access-pbwvd" (OuterVolumeSpecName: "kube-api-access-pbwvd") pod "ebdc8e7e-dc3c-4c69-8e1c-bd2fc8e287e9" (UID: "ebdc8e7e-dc3c-4c69-8e1c-bd2fc8e287e9"). InnerVolumeSpecName "kube-api-access-pbwvd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:52:35 crc kubenswrapper[4669]: I1008 21:52:35.056121 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbwvd\" (UniqueName: \"kubernetes.io/projected/ebdc8e7e-dc3c-4c69-8e1c-bd2fc8e287e9-kube-api-access-pbwvd\") on node \"crc\" DevicePath \"\"" Oct 08 21:52:35 crc kubenswrapper[4669]: I1008 21:52:35.122678 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebdc8e7e-dc3c-4c69-8e1c-bd2fc8e287e9-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "ebdc8e7e-dc3c-4c69-8e1c-bd2fc8e287e9" (UID: "ebdc8e7e-dc3c-4c69-8e1c-bd2fc8e287e9"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:52:35 crc kubenswrapper[4669]: I1008 21:52:35.157268 4669 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ebdc8e7e-dc3c-4c69-8e1c-bd2fc8e287e9-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 08 21:52:35 crc kubenswrapper[4669]: I1008 21:52:35.374498 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebdc8e7e-dc3c-4c69-8e1c-bd2fc8e287e9" path="/var/lib/kubelet/pods/ebdc8e7e-dc3c-4c69-8e1c-bd2fc8e287e9/volumes" Oct 08 21:52:35 crc kubenswrapper[4669]: I1008 21:52:35.629256 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8t5fb_must-gather-z8mk6_ebdc8e7e-dc3c-4c69-8e1c-bd2fc8e287e9/copy/0.log" Oct 08 21:52:35 crc kubenswrapper[4669]: I1008 21:52:35.630559 4669 scope.go:117] "RemoveContainer" containerID="697b1f9a0d37f0a8869cac32774d206cbe221fa08c2c9483b44988fd9e4b5154" Oct 08 21:52:35 crc kubenswrapper[4669]: I1008 21:52:35.630679 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8t5fb/must-gather-z8mk6" Oct 08 21:52:35 crc kubenswrapper[4669]: I1008 21:52:35.653687 4669 scope.go:117] "RemoveContainer" containerID="8642cf7fbabbbea29d36ff683aa0d809d2795dab562c213e4e4c3ea8eac788a0" Oct 08 21:52:53 crc kubenswrapper[4669]: I1008 21:52:53.772808 4669 trace.go:236] Trace[2134256438]: "Calculate volume metrics of registry-storage for pod openshift-image-registry/image-registry-66df7c8f76-t9x9z" (08-Oct-2025 21:52:52.767) (total time: 1005ms): Oct 08 21:52:53 crc kubenswrapper[4669]: Trace[2134256438]: [1.00555774s] [1.00555774s] END